SChem5Labels-google-t5-v1_1-large-inter_model-frequency-model_annots_str

This model is a fine-tuned version of google/t5-v1_1-large on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9844
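As a minimal sketch, the checkpoint can be loaded for inference with the `transformers` Auto classes. This assumes the model is published on the Hub under the repo id shown in this card; the input string is purely illustrative.

```python
# Sketch: load this fine-tuned T5 checkpoint for inference.
# Assumes `transformers` and `torch` are installed and the repo id below
# (taken from this card) is reachable on the Hugging Face Hub.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_ID = "owanr/SChem5Labels-google-t5-v1_1-large-inter_model-frequency-model_annots_str"


def generate_label(text: str, max_new_tokens: int = 32) -> str:
    """Run a single seq2seq generation pass with the fine-tuned model."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# Example (downloads the ~3 GB t5-v1_1-large weights on first call):
# print(generate_label("example input text"))
```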

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 200
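The optimizer and schedule above can be reconstructed in plain PyTorch. This is a sketch, not the training script: the 25-steps-per-epoch figure is inferred from the results table below (step 25 at epoch 1.0, step 50 at epoch 2.0, and so on), and the "linear" schedule is assumed to decay the learning rate to zero over the full run with no warmup.

```python
import torch

# Hyperparameters from the card; STEPS_PER_EPOCH is inferred from the
# results table (25 optimizer steps per epoch).
LEARNING_RATE = 1e-4
NUM_EPOCHS = 200
STEPS_PER_EPOCH = 25
TOTAL_STEPS = NUM_EPOCHS * STEPS_PER_EPOCH  # 5000

# Stand-in parameter; in the real run this would be the T5 model's parameters.
params = [torch.nn.Parameter(torch.zeros(1))]

# Adam with betas=(0.9, 0.999) and epsilon=1e-08, as listed above.
optimizer = torch.optim.Adam(params, lr=LEARNING_RATE, betas=(0.9, 0.999), eps=1e-8)

# lr_scheduler_type "linear" (assumed zero warmup): scale the lr linearly
# from its initial value down to 0 across all training steps.
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lambda step: max(0.0, 1.0 - step / TOTAL_STEPS)
)
```

In a training loop, `optimizer.step()` and `scheduler.step()` would be called once per batch, so the learning rate reaches zero exactly at step 5000.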

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 20.3817       | 1.0   | 25   | 23.8120         |
| 19.4374       | 2.0   | 50   | 21.8918         |
| 18.745        | 3.0   | 75   | 19.3959         |
| 16.896        | 4.0   | 100  | 15.4970         |
| 15.3045       | 5.0   | 125  | 11.5374         |
| 12.9955       | 6.0   | 150  | 9.6467          |
| 11.2112       | 7.0   | 175  | 8.9925          |
| 9.4851        | 8.0   | 200  | 8.7994          |
| 8.7487        | 9.0   | 225  | 8.5320          |
| 8.1197        | 10.0  | 250  | 8.3570          |
| 7.9164        | 11.0  | 275  | 8.2662          |
| 7.7789        | 12.0  | 300  | 8.1800          |
| 7.6671        | 13.0  | 325  | 8.0987          |
| 7.5107        | 14.0  | 350  | 7.9659          |
| 7.457         | 15.0  | 375  | 7.6850          |
| 7.1712        | 16.0  | 400  | 7.3914          |
| 6.9462        | 17.0  | 425  | 7.2019          |
| 6.8109        | 18.0  | 450  | 7.0657          |
| 6.7403        | 19.0  | 475  | 6.9778          |
| 6.5766        | 20.0  | 500  | 6.9288          |
| 6.505         | 21.0  | 525  | 6.8702          |
| 6.5148        | 22.0  | 550  | 6.8175          |
| 6.541         | 23.0  | 575  | 6.7619          |
| 4.3898        | 24.0  | 600  | 1.0917          |
| 1.0874        | 25.0  | 625  | 0.7681          |
| 0.8058        | 26.0  | 650  | 0.7295          |
| 0.7847        | 27.0  | 675  | 0.7244          |
| 0.7779        | 28.0  | 700  | 0.7195          |
| 0.7741        | 29.0  | 725  | 0.7205          |
| 0.7606        | 30.0  | 750  | 0.7222          |
| 0.7613        | 31.0  | 775  | 0.7189          |
| 0.7676        | 32.0  | 800  | 0.7119          |
| 0.7547        | 33.0  | 825  | 0.7138          |
| 0.7433        | 34.0  | 850  | 0.7148          |
| 0.7729        | 35.0  | 875  | 0.7202          |

Framework versions

  • Transformers 4.34.0
  • Pytorch 2.1.0+cu121
  • Datasets 2.6.1
  • Tokenizers 0.14.1

Model tree for owanr/SChem5Labels-google-t5-v1_1-large-inter_model-frequency-model_annots_str

Finetuned from google/t5-v1_1-large