---
license: apache-2.0
base_model: bert-base-multilingual-cased
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: bert-base-multilingual-cased-IDMGSP-danish
  results: []
---

# bert-base-multilingual-cased-IDMGSP-danish

This model is a fine-tuned version of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.9588
- Accuracy: 0.8394
- F1: 0.8522

## Model description

More information needed

## Intended uses & limitations

More information needed
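
Although the intended uses are not documented, the accuracy/F1 metrics indicate a sequence-classification head. Below is a minimal loading sketch, assuming the checkpoint is published as `ernlavr/bert-base-multilingual-cased-IDMGSP-danish` — the repository id is inferred from the card title, not confirmed by the card:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Repository id inferred from the card title; adjust if the checkpoint lives elsewhere.
model_id = "ernlavr/bert-base-multilingual-cased-IDMGSP-danish"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Classify a Danish sentence (illustrative input, not from the training data).
inputs = tokenizer("Dette er en eksempeltekst på dansk.", return_tensors="pt")
predicted_class = model(**inputs).logits.argmax(dim=-1).item()
print(predicted_class)
```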

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):

- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
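
For reference, the list above maps onto Hugging Face `TrainingArguments` roughly as follows. This is a reconstruction, not the author's script: `output_dir` is a placeholder, and the per-epoch evaluation strategy is inferred from the results table below.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-multilingual-cased-IDMGSP-danish",  # placeholder, not stated in the card
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,               # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,            # epsilon=1e-08
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
    evaluation_strategy="epoch",  # inferred: the results table logs metrics once per epoch
)
```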

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.4777        | 1.0   | 480  | 0.3597          | 0.8594   | 0.8544 |
| 0.3642        | 2.0   | 960  | 0.5526          | 0.8148   | 0.8338 |
| 0.3087        | 3.0   | 1440 | 0.3296          | 0.8678   | 0.8711 |
| 0.1919        | 4.0   | 1920 | 0.4540          | 0.8288   | 0.8453 |
| 0.1592        | 5.0   | 2400 | 0.3791          | 0.8701   | 0.8697 |
| 0.1324        | 6.0   | 2880 | 0.5328          | 0.8294   | 0.8443 |
| 0.1271        | 7.0   | 3360 | 0.7168          | 0.8441   | 0.8536 |
| 0.0227        | 8.0   | 3840 | 0.8978          | 0.8254   | 0.8424 |
| 0.0019        | 9.0   | 4320 | 0.8289          | 0.8508   | 0.8595 |
| 0.0046        | 10.0  | 4800 | 0.9588          | 0.8394   | 0.8522 |
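
The accuracy and F1 columns were presumably produced by a `compute_metrics` callback passed to the `Trainer`. Below is a plausible sketch using the `evaluate` library; the actual callback is not shown in the card, and the F1 `average` setting is an assumption:

```python
import numpy as np
import evaluate

accuracy_metric = evaluate.load("accuracy")
f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    # Trainer passes (logits, labels); take the argmax to get class predictions.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_metric.compute(predictions=predictions, references=labels)["accuracy"],
        # average="weighted" is a guess; binary F1 would drop this argument.
        "f1": f1_metric.compute(predictions=predictions, references=labels, average="weighted")["f1"],
    }
```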

### Framework versions

- Transformers 4.35.0
- Pytorch 2.0.1
- Datasets 2.14.6
- Tokenizers 0.14.1