SChem5Labels-google-t5-v1_1-large-inter_model-dataset-frequency-model_annots_str

This model is a fine-tuned version of google/t5-v1_1-large on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8911
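
Since usage is not documented on this card, the snippet below is a minimal loading sketch using the Transformers auto classes. The repository id comes from this card's title; the input string is a placeholder, as the expected input format and task are not described here.

```python
# Minimal loading sketch; the task and input format are undocumented,
# so the example input below is purely illustrative.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "owanr/SChem5Labels-google-t5-v1_1-large-inter_model-dataset-frequency-model_annots_str"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("example input text", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```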

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 200
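
The hyperparameters above translate roughly into a Transformers Seq2SeqTrainingArguments configuration, sketched below. This is an assumption-laden reconstruction, not the published training script: output_dir is a placeholder, the per-device batch split is assumed (the card only reports total train/eval batch sizes of 128), and epoch-level evaluation is inferred from the log below. Note that the log stops at epoch 37 despite num_epochs being 200, so training may have ended early; that detail is not documented.

```python
# Sketch of the reported hyperparameters as Seq2SeqTrainingArguments.
# output_dir, the evaluation strategy, and the per-device batch split are
# assumptions; the original training script is not published with this card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./results",           # placeholder path
    learning_rate=1e-4,
    per_device_train_batch_size=128,  # card reports train_batch_size: 128
    per_device_eval_batch_size=128,   # card reports eval_batch_size: 128
    seed=42,
    adam_beta1=0.9,                   # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    evaluation_strategy="epoch",      # inferred: the log reports loss once per epoch
)
```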

Training results

Training Loss  Epoch  Step  Validation Loss
19.7803        1.0    25    23.9207
18.7939        2.0    50    19.6774
17.9347        3.0    75    14.1935
14.9562        4.0    100   11.5018
12.9185        5.0    125   9.7946
11.0104        6.0    150   9.0519
10.1946        7.0    175   8.9336
8.6775         8.0    200   8.6692
8.2895         9.0    225   8.4222
8.067          10.0   250   8.2689
7.7922         11.0   275   8.1862
7.702          12.0   300   8.0916
7.5801         13.0   325   7.9765
7.4381         14.0   350   7.7605
7.4223         15.0   375   7.4846
7.0379         16.0   400   7.3010
6.9279         17.0   425   7.1606
6.7912         18.0   450   7.0390
6.7047         19.0   475   6.9686
6.5986         20.0   500   6.9092
6.5577         21.0   525   6.8427
6.4589         22.0   550   6.7866
6.3207         23.0   575   6.7021
1.1695         24.0   600   0.7696
0.7697         25.0   625   0.6522
0.6978         26.0   650   0.6488
0.6832         27.0   675   0.6470
0.7033         28.0   700   0.6322
0.692          29.0   725   0.6369
0.6703         30.0   750   0.6372
0.6781         31.0   775   0.6364
0.677          32.0   800   0.6252
0.6632         33.0   825   0.6301
0.6684         34.0   850   0.6254
0.6823         35.0   875   0.6312
0.6665         36.0   900   0.6328
0.6583         37.0   925   0.6256

Framework versions

  • Transformers 4.34.0
  • Pytorch 2.1.0+cu121
  • Datasets 2.14.5
  • Tokenizers 0.14.1
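
Before loading the model, it can help to confirm that the local environment matches the versions above; a small check:

```python
# Print installed versions to compare against those reported above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # card reports 4.34.0
print("PyTorch:", torch.__version__)              # card reports 2.1.0+cu121
print("Datasets:", datasets.__version__)          # card reports 2.14.5
print("Tokenizers:", tokenizers.__version__)      # card reports 0.14.1
```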