SChem5Labels-google-t5-v1_1-large-intra_model

This model is a fine-tuned version of google/t5-v1_1-large on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 8.9589
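Since the card documents no usage details, the snippet below is only a minimal sketch of loading the checkpoint with the Transformers API. The repo id owanr/SChem5Labels-google-t5-v1_1-large-intra_model-sorted-model_annots_str is taken from the model's hub listing, and the input string is a placeholder, since the expected prompt format is not documented.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Repo id from the hub listing; adjust if the checkpoint lives elsewhere.
repo_id = "owanr/SChem5Labels-google-t5-v1_1-large-intra_model-sorted-model_annots_str"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = T5ForConditionalGeneration.from_pretrained(repo_id)

# Placeholder input: the expected prompt format is not documented on the card.
inputs = tokenizer("example input text", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```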

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments setup follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 200
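These values map directly onto transformers.TrainingArguments. The sketch below reconstructs that configuration under stated assumptions: output_dir and the two datasets are placeholders, and evaluation_strategy="epoch" is inferred from the per-epoch validation losses reported below rather than documented on this card.

```python
from transformers import Trainer, TrainingArguments

# Sketch of the listed hyperparameters as TrainingArguments (Transformers 4.34).
args = TrainingArguments(
    output_dir="out",                 # placeholder: not documented
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    evaluation_strategy="epoch",      # inferred from the per-epoch eval log below
)

trainer = Trainer(
    model=model,                      # e.g. the T5ForConditionalGeneration above
    args=args,
    train_dataset=train_dataset,      # placeholder: training data is unspecified
    eval_dataset=eval_dataset,        # placeholder: evaluation data is unspecified
)
trainer.train()
```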

Training results

Validation loss decreases from 29.7549 at epoch 1 to 8.9589 at epoch 200. ("No log" in the training-loss column means no training loss had been logged by that evaluation step.)

Training Loss Epoch Step Validation Loss
No log 1.0 3 29.7549
No log 2.0 6 29.8102
No log 3.0 9 29.6355
25.8256 4.0 12 30.6726
25.8256 5.0 15 30.7965
25.8256 6.0 18 30.7727
26.6306 7.0 21 30.6763
26.6306 8.0 24 30.5263
26.6306 9.0 27 30.9119
26.6258 10.0 30 31.1131
26.6258 11.0 33 31.0140
26.6258 12.0 36 30.8146
26.6258 13.0 39 30.5980
25.0741 14.0 42 30.6435
25.0741 15.0 45 30.0462
25.0741 16.0 48 29.8027
25.4085 17.0 51 29.5555
25.4085 18.0 54 29.7583
25.4085 19.0 57 29.8566
24.1039 20.0 60 29.4801
24.1039 21.0 63 29.4868
24.1039 22.0 66 29.0394
24.1039 23.0 69 29.2099
25.3741 24.0 72 29.0870
25.3741 25.0 75 29.0087
25.3741 26.0 78 28.7179
23.8599 27.0 81 28.5614
23.8599 28.0 84 28.5062
23.8599 29.0 87 28.4833
25.7063 30.0 90 28.0928
25.7063 31.0 93 27.6552
25.7063 32.0 96 27.1825
25.7063 33.0 99 26.9414
25.0888 34.0 102 26.8009
25.0888 35.0 105 26.4969
25.0888 36.0 108 25.8081
24.1641 37.0 111 25.3265
24.1641 38.0 114 24.8796
24.1641 39.0 117 24.0748
23.6238 40.0 120 23.4394
23.6238 41.0 123 22.9299
23.6238 42.0 126 22.3851
23.6238 43.0 129 22.2605
22.53 44.0 132 21.7847
22.53 45.0 135 21.3276
22.53 46.0 138 21.0494
22.1764 47.0 141 20.4277
22.1764 48.0 144 20.0528
22.1764 49.0 147 19.7998
21.3646 50.0 150 19.4783
21.3646 51.0 153 18.8497
21.3646 52.0 156 18.3452
21.3646 53.0 159 17.9207
21.51 54.0 162 17.3766
21.51 55.0 165 16.9509
21.51 56.0 168 16.7865
17.5988 57.0 171 16.7029
17.5988 58.0 174 16.6552
17.5988 59.0 177 16.4299
18.8201 60.0 180 16.2096
18.8201 61.0 183 15.9940
18.8201 62.0 186 15.8806
18.8201 63.0 189 15.7914
18.7912 64.0 192 15.7707
18.7912 65.0 195 15.7896
18.7912 66.0 198 15.7700
16.2629 67.0 201 15.6920
16.2629 68.0 204 15.6683
16.2629 69.0 207 15.5813
15.5326 70.0 210 15.4919
15.5326 71.0 213 15.3044
15.5326 72.0 216 15.1604
15.5326 73.0 219 15.0433
17.0455 74.0 222 14.9565
17.0455 75.0 225 14.8358
17.0455 76.0 228 14.7331
14.6608 77.0 231 14.6237
14.6608 78.0 234 14.5547
14.6608 79.0 237 14.4913
15.2233 80.0 240 14.4341
15.2233 81.0 243 14.3771
15.2233 82.0 246 14.3262
15.2233 83.0 249 14.2761
14.401 84.0 252 14.2134
14.401 85.0 255 14.1438
14.401 86.0 258 14.0946
14.3857 87.0 261 14.0422
14.3857 88.0 264 13.9834
14.3857 89.0 267 13.9312
13.9197 90.0 270 13.8541
13.9197 91.0 273 13.7648
13.9197 92.0 276 13.6514
13.9197 93.0 279 13.5470
13.7735 94.0 282 13.4660
13.7735 95.0 285 13.3741
13.7735 96.0 288 13.2702
13.2372 97.0 291 13.1626
13.2372 98.0 294 13.0221
13.2372 99.0 297 12.8678
12.476 100.0 300 12.7402
12.476 101.0 303 12.6091
12.476 102.0 306 12.4661
12.476 103.0 309 12.3560
12.4134 104.0 312 12.3343
12.4134 105.0 315 12.3986
12.4134 106.0 318 12.4099
12.1218 107.0 321 12.3745
12.1218 108.0 324 12.3192
12.1218 109.0 327 12.2624
12.6856 110.0 330 12.1563
12.6856 111.0 333 12.1173
12.6856 112.0 336 12.0613
12.6856 113.0 339 12.0017
11.8839 114.0 342 11.9240
11.8839 115.0 345 11.8746
11.8839 116.0 348 11.9097
11.52 117.0 351 11.9162
11.52 118.0 354 11.8577
11.52 119.0 357 11.7256
11.6257 120.0 360 11.5926
11.6257 121.0 363 11.5474
11.6257 122.0 366 11.5735
11.6257 123.0 369 11.5936
11.9666 124.0 372 11.4833
11.9666 125.0 375 11.3641
11.9666 126.0 378 11.3100
11.7292 127.0 381 11.2959
11.7292 128.0 384 11.3732
11.7292 129.0 387 11.3367
11.6755 130.0 390 11.1432
11.6755 131.0 393 11.0495
11.6755 132.0 396 11.0216
11.6755 133.0 399 10.9745
11.1861 134.0 402 10.9571
11.1861 135.0 405 10.9238
11.1861 136.0 408 10.8395
10.4923 137.0 411 10.8113
10.4923 138.0 414 10.7781
10.4923 139.0 417 10.7083
11.6983 140.0 420 10.6792
11.6983 141.0 423 10.6387
11.6983 142.0 426 10.5960
11.6983 143.0 429 10.5695
10.4692 144.0 432 10.5019
10.4692 145.0 435 10.4446
10.4692 146.0 438 10.3962
10.3619 147.0 441 10.3448
10.3619 148.0 444 10.2983
10.3619 149.0 447 10.2724
10.2664 150.0 450 10.1874
10.2664 151.0 453 10.1124
10.2664 152.0 456 10.0616
10.2664 153.0 459 9.9878
10.8575 154.0 462 10.0403
10.8575 155.0 465 10.0742
10.8575 156.0 468 9.8701
9.5866 157.0 471 9.8006
9.5866 158.0 474 9.7803
9.5866 159.0 477 9.6946
9.9418 160.0 480 9.6784
9.9418 161.0 483 9.6326
9.9418 162.0 486 9.5806
9.9418 163.0 489 9.5621
9.7861 164.0 492 9.5201
9.7861 165.0 495 9.5372
9.7861 166.0 498 9.6946
9.2799 167.0 501 9.5540
9.2799 168.0 504 9.3983
9.2799 169.0 507 9.3719
8.9886 170.0 510 9.3410
8.9886 171.0 513 9.3396
8.9886 172.0 516 9.3544
8.9886 173.0 519 9.3576
9.4404 174.0 522 9.2906
9.4404 175.0 525 9.2412
9.4404 176.0 528 9.2228
9.4149 177.0 531 9.2118
9.4149 178.0 534 9.1959
9.4149 179.0 537 9.1526
9.3479 180.0 540 9.1415
9.3479 181.0 543 9.1262
9.3479 182.0 546 9.1734
9.3479 183.0 549 9.1700
9.3698 184.0 552 9.1131
9.3698 185.0 555 9.0850
9.3698 186.0 558 9.0757
8.6839 187.0 561 9.0571
8.6839 188.0 564 9.0458
8.6839 189.0 567 9.0427
8.7484 190.0 570 9.0691
8.7484 191.0 573 9.1690
8.7484 192.0 576 9.2576
8.7484 193.0 579 9.1160
8.9591 194.0 582 8.9798
8.9591 195.0 585 8.9571
8.9591 196.0 588 8.9462
9.2337 197.0 591 8.9789
9.2337 198.0 594 9.0140
9.2337 199.0 597 9.0621
9.4757 200.0 600 8.9589

Framework versions

  • Transformers 4.34.0
  • Pytorch 2.1.0+cu121
  • Datasets 2.6.1
  • Tokenizers 0.14.1
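To reproduce results, it may help to confirm a local environment matches these versions first; the following is a minimal sanity-check sketch (it only asserts against the versions listed above):

```python
# Verify that the local environment matches the versions listed above.
import datasets
import tokenizers
import torch
import transformers

assert transformers.__version__ == "4.34.0"
assert torch.__version__ == "2.1.0+cu121"
assert datasets.__version__ == "2.6.1"
assert tokenizers.__version__ == "0.14.1"
```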