---
datasets:
- ZihanWangKi/conllpp
language:
- en
metrics:
- accuracy
- f1
- recall
library_name: transformers
pipeline_tag: token-classification
---

This model uses distilbert/distilbert-base-cased as its starting point, a reduced and optimized version of the original BERT model (google-bert/bert-base-cased). It was fine-tuned on the conllpp dataset, which was extended with date annotations by adding the B-DATE and I-DATE tags to the label set.
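
A minimal inference sketch using the transformers token-classification pipeline. The model identifier below is a placeholder: substitute this repository's id or a local checkpoint directory.

```python
from transformers import pipeline

# Placeholder model id; replace with this repo's id or a local checkpoint path.
ner = pipeline(
    "token-classification",
    model="path/to/this-checkpoint",
    aggregation_strategy="simple",  # merge B-/I- pieces into whole entity spans
)

# The added DATE label should appear alongside the standard CoNLL entity types.
print(ner("Angela Merkel visited Paris on 12 March 2021."))
```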