AzerBERT
- Type: BERT-based transformer language model
- Description: AzerBERT is a pre-trained language model tailored for the Iranian Azerbaijani language. It can be fine-tuned for various NLP tasks, including text classification, named entity recognition, and masked-token prediction.
How to use
Load the model through the high-level `fill-mask` pipeline:

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="language-ml-lab/AzerBert")
```

Or load the tokenizer and model directly:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("language-ml-lab/AzerBert")
model = AutoModelForMaskedLM.from_pretrained("language-ml-lab/AzerBert")
```
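A minimal sketch of running the fill-mask pipeline end to end. The input sentence here is a placeholder assumption, not from the model card; real input should be Iranian Azerbaijani text containing the tokenizer's mask token:

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="language-ml-lab/AzerBert")

# Read the mask token from the tokenizer rather than hardcoding "[MASK]".
mask = pipe.tokenizer.mask_token

# Placeholder sentence for illustration; substitute real Iranian Azerbaijani text.
results = pipe(f"{mask} dil")

for r in results:
    # Each candidate carries 'token_str' (the predicted token)
    # and 'score' (its probability for the masked position).
    print(r["token_str"], round(r["score"], 4))
```

By default the pipeline returns the top 5 candidates; pass `top_k` to change that.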