Update README.md
README.md CHANGED
@@ -7,4 +7,21 @@ language:
 tags:
 - biology
 - medical
----
+---
+
+# rttl-ai/BIOptimus v.0.4
+
+## Model Details
+**Model Description:** BIOptimus v.0.4 is a biomedical language model pretrained on [PubMed](https://pubmed.ncbi.nlm.nih.gov/) abstracts.
+It is introduced in the paper "BIOptimus: Pre-training an Optimal Biomedical Language Model with Curriculum Learning for Named Entity Recognition" (BioNLP workshop @ ACL 2023). More information is available in [this repository](https://github.com/rttl-ai/BIOptimus).
+
+This model achieves state-of-the-art performance on several biomedical NER datasets from [BLURB](https://microsoft.github.io/BLURB/).
+
+- **Developed by:** rttl-ai
+- **Model Type:** Text Classification
+- **Language(s):** English
+- **License:** Apache-2.0
+- **Resources for more information:**
+  - The model was pre-trained with task-adaptive pre-training ([TAPT](https://arxiv.org/pdf/2004.10964.pdf)) using an increased masking rate, no corruption strategy, and whole-word masking (WWM), following [this paper](https://aclanthology.org/2023.eacl-main.217.pdf)
+  - fine-tuned on SST with subtrees
+  - fine-tuned on SST-2
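
Since the card describes a masked language model pretrained on PubMed abstracts, the checkpoint should load through the standard `transformers` API. A minimal usage sketch, assuming the checkpoint is published under the hub id `rttl-ai/BIOptimus-0.4` (the exact id is an assumption based on the card title; check the model page):

```python
# Minimal usage sketch for a BERT-style masked LM such as BIOptimus.
# The hub id below is assumed from the card title, not a confirmed
# identifier.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "rttl-ai/BIOptimus-0.4"  # assumed hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill-mask on a biomedical sentence; the mask token is tokenizer-specific.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for pred in fill(f"Aspirin inhibits platelet {tokenizer.mask_token} in vivo."):
    print(f"{pred['score']:.3f}  {pred['token_str']}")
```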
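The TAPT bullet above compresses several pre-training choices (raised masking rate, no corruption, whole-word masking). The WWM part can be approximated with stock `transformers` utilities; this is a rough sketch, not the authors' pre-training code. The 0.3 rate is an illustrative stand-in for "increased", and note that the stock collator still applies BERT's 80/10/10 corruption, which the paper's "no corruption strategy" would replace with always substituting the mask token:

```python
# Illustrative whole-word-masking (WWM) setup with a raised masking rate.
# NOT the authors' pre-training code: mlm_probability=0.3 is an assumed
# stand-in for "increased masking rate" (BERT's default is 0.15), and
# DataCollatorForWholeWordMask keeps BERT's 80/10/10 corruption, whereas
# the paper's "no corruption strategy" always inserts the mask token.
from transformers import AutoTokenizer, DataCollatorForWholeWordMask

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
collator = DataCollatorForWholeWordMask(
    tokenizer=tokenizer,
    mlm_probability=0.3,  # assumed value, raised above the 0.15 default
)

# Whole-word masking hides every sub-token of a selected word at once.
examples = [tokenizer("Hemoglobin transports oxygen in erythrocytes.")]
batch = collator(examples)
print(tokenizer.decode(batch["input_ids"][0]))
```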