---
license: apache-2.0
datasets:
- pubmed
language:
- en
tags:
- biology
- medical
---

# rttl-ai/BIOptimus v.0.4

## Model Details

**Model Description:** BIOptimus v.0.4 is a BERT-like language model pre-trained on [PubMed](https://pubmed.ncbi.nlm.nih.gov/) abstracts.

It is a biomedical language model pre-trained with contextualized weight distillation and curriculum learning.

It achieves state-of-the-art performance on several biomedical NER datasets from the [BLURB benchmark](https://microsoft.github.io/BLURB/).

- **Developed by:** rttl-ai
- **Model Type:** Language model
- **Language(s):** English
- **License:** Apache-2.0
- **Resources for more information:**
  - It is introduced in the paper "BIOptimus: Pre-training an Optimal Biomedical Language Model with Curriculum Learning for Named Entity Recognition" (BioNLP workshop @ ACL 2023).
  - More information is available in [this repository](https://github.com/rttl-ai/BIOptimus).
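
## How to Use

The snippet below is a minimal usage sketch added for illustration, not part of the original card. It assumes the checkpoint is published on the Hugging Face Hub under an ID matching this card's title (written here as `rttl-ai/BIOptimus-0.4`; substitute the actual repository ID if it differs) and that it exposes a standard BERT masked-language-modeling head with `[MASK]` as the mask token.

```python
from transformers import pipeline

# Assumed Hub ID based on this card's title; replace if the actual ID differs.
MODEL_ID = "rttl-ai/BIOptimus-0.4"

# BIOptimus is BERT-like, so the pre-trained checkpoint can be queried as a
# masked language model via the fill-mask pipeline.
fill_mask = pipeline("fill-mask", model=MODEL_ID)

# Predict a masked biomedical term in a PubMed-style sentence.
# (Assumes the tokenizer's mask token is [MASK], as in standard BERT.)
for pred in fill_mask("Aspirin irreversibly inhibits platelet [MASK]."):
    print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```

Since the model's headline results are on NER, the usual downstream pattern is to fine-tune it with a token-classification head, for example:

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer

MODEL_ID = "rttl-ai/BIOptimus-0.4"  # assumed Hub ID, see note above

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
# num_labels depends on the NER tag set, e.g. 3 for a simple B/I/O scheme.
model = AutoModelForTokenClassification.from_pretrained(MODEL_ID, num_labels=3)
```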
|