TianlaiChen committed
Commit 1aa9ce8 · 1 Parent(s): 0b807b0

Update README.md

Files changed (1): README.md +3 -6
README.md CHANGED
@@ -1,9 +1,8 @@
 ---
 license: mit
 ---
-![Logo](logo.png)
 **PepMLM: Target Sequence-Conditioned Generation of Peptide Binders via Masked Language Modeling**
-
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/63df6223f351dc0745681f77/Ov_GhpwQHCFDQd7qK5oyq.png)
 In this work, we introduce **PepMLM**, a purely target sequence-conditioned *de novo* generator of linear peptide binders.
 By employing a novel masking strategy that uniquely positions cognate peptide sequences at the terminus of target protein sequences,
 PepMLM tasks the state-of-the-art ESM-2 pLM to fully reconstruct the binder region,
@@ -15,13 +14,11 @@ In total, PepMLM enables the generative design of candidate binders to any targe
 - Colab Notebook: [Link](https://colab.research.google.com/drive/1u0i-LBog_lvQ5YRKs7QLKh_RtI-tV8qM?usp=sharing)
 - Preprint: [Link](https://arxiv.org/abs/2310.03842)
 
-**Graphical Summary**:
-![image/png](https://cdn-uploads.huggingface.co/production/uploads/63df6223f351dc0745681f77/Ov_GhpwQHCFDQd7qK5oyq.png)
-
 ```
 # Load model directly
 from transformers import AutoTokenizer, AutoModelForMaskedLM
 
 tokenizer = AutoTokenizer.from_pretrained("TianlaiChen/PepMLM-650M")
 model = AutoModelForMaskedLM.from_pretrained("TianlaiChen/PepMLM-650M")
-```
+```
+![Logo](logo.png)
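
The README's snippet stops after loading the model. As a minimal sketch of the masking strategy described above (append a fully masked binder region at the terminus of the target sequence and ask the model to reconstruct it), the following shows one way to fill those masks. The function name `generate_binder`, the fixed binder length, the example target sequence, and the single-pass greedy decoding are illustrative assumptions, not the authors' pipeline; see the linked Colab notebook for the actual procedure.

```
# Sketch of PepMLM-style binder generation, assuming the masking strategy
# described in the README: the binder region is a run of mask tokens appended
# at the terminus of the target sequence, which the model then reconstructs.
# `generate_binder`, the default length, and the greedy one-pass decoding
# are illustrative choices, not the official generation code.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("TianlaiChen/PepMLM-650M")
model = AutoModelForMaskedLM.from_pretrained("TianlaiChen/PepMLM-650M")
model.eval()

def generate_binder(target_seq: str, binder_length: int = 12) -> str:
    # Append one mask token per binder residue to the end of the target.
    masked = target_seq + tokenizer.mask_token * binder_length
    inputs = tokenizer(masked, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # Locate the masked positions and take the top-scoring token at each.
    mask_idx = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    best_ids = logits[0, mask_idx].argmax(dim=-1)
    return tokenizer.decode(best_ids).replace(" ", "")

# Example usage with an arbitrary illustrative target protein sequence.
print(generate_binder("MTEYKLVVVGAGGVGKSALTIQLIQNHFVDEYDPTIEDSYRKQVVIDGE"))
```

Greedy one-pass filling is the simplest possible decoding; sampling multiple candidate binders and ranking them is a natural refinement, and the Colab notebook linked above demonstrates the authors' actual generation and ranking workflow.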