# bert-arsentd-lev

Arabic BERT model fine-tuned on the ArSentD-LEV dataset.

## Data

The model was fine-tuned on ~4,000 tweets covering multiple Levantine dialects. The dataset provides five sentiment classes; we used 3 of the 5 in this experiment.

## Results

| Class    | Precision | Recall | F1-score | Support |
|----------|-----------|--------|----------|---------|
| 0        | 0.8211    | 0.8080 | 0.8145   | 125     |
| 1        | 0.7174    | 0.7857 | 0.7500   | 84      |
| 2        | 0.6867    | 0.6404 | 0.6628   | 89      |
| Accuracy |           |        | 0.7517   | 298     |
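From the per-class numbers above, aggregate F1 scores can be derived directly (a quick sketch; these aggregates are computed here, not reported by the card):

```python
# Per-class F1 scores and supports taken from the results table above.
f1 = [0.8145, 0.7500, 0.6628]
support = [125, 84, 89]

# Macro F1: unweighted mean over the three classes.
macro_f1 = sum(f1) / len(f1)

# Weighted F1: each class weighted by its support (125 + 84 + 89 = 298).
weighted_f1 = sum(f * s for f, s in zip(f1, support)) / sum(support)
```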

## How to use

Install PyTorch or TensorFlow together with the Hugging Face `transformers` library, then load the model directly:
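For a PyTorch setup, the dependencies can be installed with pip (package names only; pin versions as needed for your environment):

```shell
pip install torch transformers
```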

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "mofawzy/bert-arsentd-lev"
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
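Once loaded, inference reduces to tokenizing a sentence, taking the model's logits, and picking the most probable of the three classes. The decision step can be sketched on its own, without downloading the model (the logit values below are hypothetical, and the card does not specify which sentiment each class index maps to):

```python
import math

def softmax(logits):
    # Convert raw logits to class probabilities (numerically stable form).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for one tweet over the 3 sentiment classes,
# as returned by model(**tokenizer(text, return_tensors="pt")).logits.
logits = [2.1, -0.3, 0.4]
probs = softmax(logits)
pred = probs.index(max(probs))  # predicted class index (0, 1, or 2)
```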