|
--- |
|
license: apache-2.0 |
|
base_model: google/t5-v1_1-large |
|
tags: |
|
- generated_from_trainer |
|
model-index: |
|
- name: Sentiment-google-t5-v1_1-large-intra_model-shuffle |
|
results: [] |
|
--- |
|
|
|
|
|
|
# Sentiment-google-t5-v1_1-large-intra_model-shuffle |
|
|
|
This model is a fine-tuned version of [google/t5-v1_1-large](https://huggingface.co/google/t5-v1_1-large) on an unspecified dataset (the Trainer did not record the dataset name).
|
It achieves the following results on the evaluation set: |
|
- Loss: nan (the final evaluation logged a NaN loss; the per-epoch validation losses in the table below are finite)
|
|
|
## Model description |
|
|
|
More information needed. Judging from the model name, this checkpoint appears to target a sentiment task trained on a shuffled ("intra_model-shuffle") variant of the data, but this has not been confirmed.
|
|
|
## Intended uses & limitations |
|
|
|
More information needed |
|
|
|
## Training and evaluation data |
|
|
|
More information needed |
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training: |
|
- learning_rate: 0.0001 |
|
- train_batch_size: 128 |
|
- eval_batch_size: 128 |
|
- seed: 42 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- num_epochs: 200 (the training log below stops at epoch 116)
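The linear scheduler above decays the learning rate from its initial value toward zero over the full configured run. A minimal sketch of that schedule, assuming no warmup and 44 optimizer steps per epoch (epoch 1.0 ends at step 44 in the results table), so 200 × 44 = 8800 total scheduled steps:

```python
def linear_lr(step, base_lr=1e-4, total_steps=200 * 44):
    """Learning rate under linear decay with no warmup.

    Assumptions (not recorded in this card): zero warmup steps and
    total_steps = num_epochs * steps_per_epoch.
    """
    return base_lr * max(0.0, 1 - step / total_steps)

print(linear_lr(0))     # 0.0001 — the configured initial learning rate
print(linear_lr(4400))  # 5e-05 — halfway through the configured schedule
print(linear_lr(8800))  # 0.0 — fully decayed
```

Because training stopped at epoch 116 (step 5104), the learning rate never reached zero under this schedule.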
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | |
|
|:-------------:|:-----:|:----:|:---------------:| |
|
| 26.6182 | 1.0 | 44 | 31.1790 | |
|
| 25.1664 | 2.0 | 88 | 25.9630 | |
|
| 19.7407 | 3.0 | 132 | 17.5483 | |
|
| 17.0241 | 4.0 | 176 | 15.7951 | |
|
| 14.1013 | 5.0 | 220 | 13.6385 | |
|
| 12.6633 | 6.0 | 264 | 12.1849 | |
|
| 11.7434 | 7.0 | 308 | 11.5528 | |
|
| 10.4633 | 8.0 | 352 | 10.7603 | |
|
| 10.0142 | 9.0 | 396 | 9.7449 | |
|
| 9.4309 | 10.0 | 440 | 9.4058 | |
|
| 9.4721 | 11.0 | 484 | 9.2254 | |
|
| 8.9759 | 12.0 | 528 | 9.0880 | |
|
| 8.9786 | 13.0 | 572 | 9.0002 | |
|
| 8.8538 | 14.0 | 616 | 8.9090 | |
|
| 8.9413 | 15.0 | 660 | 8.8425 | |
|
| 8.7639 | 16.0 | 704 | 8.7726 | |
|
| 8.7977 | 17.0 | 748 | 8.6950 | |
|
| 8.5634 | 18.0 | 792 | 8.6210 | |
|
| 8.5154 | 19.0 | 836 | 8.5771 | |
|
| 8.3299 | 20.0 | 880 | 8.5047 | |
|
| 8.2768 | 21.0 | 924 | 8.4541 | |
|
| 8.2174 | 22.0 | 968 | 8.3483 | |
|
| 4.2638 | 23.0 | 1012 | 3.7590 | |
|
| 3.9317 | 24.0 | 1056 | 3.7476 | |
|
| 3.8703 | 25.0 | 1100 | 3.7268 | |
|
| 3.8648 | 26.0 | 1144 | 3.7015 | |
|
| 3.7132 | 27.0 | 1188 | 3.6620 | |
|
| 3.7982 | 28.0 | 1232 | 3.6444 | |
|
| 3.7453 | 29.0 | 1276 | 3.6329 | |
|
| 3.6892 | 30.0 | 1320 | 3.5919 | |
|
| 2.7956 | 31.0 | 1364 | 2.0820 | |
|
| 1.2507 | 32.0 | 1408 | 1.0523 | |
|
| 1.1008 | 33.0 | 1452 | 1.0461 | |
|
| 1.1059 | 34.0 | 1496 | 1.0464 | |
|
| 1.1015 | 35.0 | 1540 | 1.0460 | |
|
| 1.079 | 36.0 | 1584 | 1.0446 | |
|
| 1.0946 | 37.0 | 1628 | 1.0438 | |
|
| 1.0985 | 38.0 | 1672 | 1.0461 | |
|
| 1.0817 | 39.0 | 1716 | 1.0419 | |
|
| 1.0723 | 40.0 | 1760 | 1.0417 | |
|
| 1.0867 | 41.0 | 1804 | 1.0411 | |
|
| 1.0654 | 42.0 | 1848 | 1.0415 | |
|
| 1.0739 | 43.0 | 1892 | 1.0385 | |
|
| 1.0643 | 44.0 | 1936 | 1.0395 | |
|
| 1.0585 | 45.0 | 1980 | 1.0359 | |
|
| 1.0784 | 46.0 | 2024 | 1.0375 | |
|
| 1.0831 | 47.0 | 2068 | 1.0349 | |
|
| 1.0652 | 48.0 | 2112 | 1.0355 | |
|
| 1.0687 | 49.0 | 2156 | 1.0333 | |
|
| 1.0717 | 50.0 | 2200 | 1.0350 | |
|
| 1.0576 | 51.0 | 2244 | 1.0339 | |
|
| 1.0666 | 52.0 | 2288 | 1.0339 | |
|
| 1.0648 | 53.0 | 2332 | 1.0334 | |
|
| 1.0624 | 54.0 | 2376 | 1.0323 | |
|
| 1.0745 | 55.0 | 2420 | 1.0321 | |
|
| 1.0542 | 56.0 | 2464 | 1.0339 | |
|
| 1.054 | 57.0 | 2508 | 1.0320 | |
|
| 1.0866 | 58.0 | 2552 | 1.0294 | |
|
| 1.0704 | 59.0 | 2596 | 1.0315 | |
|
| 1.0706 | 60.0 | 2640 | 1.0303 | |
|
| 1.0464 | 61.0 | 2684 | 1.0312 | |
|
| 1.0614 | 62.0 | 2728 | 1.0288 | |
|
| 1.0645 | 63.0 | 2772 | 1.0302 | |
|
| 1.057 | 64.0 | 2816 | 1.0318 | |
|
| 1.0586 | 65.0 | 2860 | 1.0285 | |
|
| 1.0724 | 66.0 | 2904 | 1.0300 | |
|
| 1.0539 | 67.0 | 2948 | 1.0271 | |
|
| 1.0536 | 68.0 | 2992 | 1.0292 | |
|
| 1.0595 | 69.0 | 3036 | 1.0271 | |
|
| 1.0504 | 70.0 | 3080 | 1.0296 | |
|
| 1.0673 | 71.0 | 3124 | 1.0264 | |
|
| 1.0548 | 72.0 | 3168 | 1.0277 | |
|
| 1.0519 | 73.0 | 3212 | 1.0280 | |
|
| 1.0519 | 74.0 | 3256 | 1.0260 | |
|
| 1.0497 | 75.0 | 3300 | 1.0286 | |
|
| 1.0551 | 76.0 | 3344 | 1.0253 | |
|
| 1.0605 | 77.0 | 3388 | 1.0250 | |
|
| 1.0495 | 78.0 | 3432 | 1.0250 | |
|
| 1.0546 | 79.0 | 3476 | 1.0263 | |
|
| 1.0615 | 80.0 | 3520 | 1.0255 | |
|
| 1.0603 | 81.0 | 3564 | 1.0238 | |
|
| 1.0374 | 82.0 | 3608 | 1.0244 | |
|
| 1.0418 | 83.0 | 3652 | 1.0252 | |
|
| 1.0603 | 84.0 | 3696 | 1.0241 | |
|
| 1.0459 | 85.0 | 3740 | 1.0233 | |
|
| 1.0598 | 86.0 | 3784 | 1.0222 | |
|
| 1.0589 | 87.0 | 3828 | 1.0232 | |
|
| 1.0415 | 88.0 | 3872 | 1.0223 | |
|
| 1.0521 | 89.0 | 3916 | 1.0202 | |
|
| 1.0443 | 90.0 | 3960 | 1.0198 | |
|
| 1.0629 | 91.0 | 4004 | 1.0200 | |
|
| 1.0616 | 92.0 | 4048 | 1.0194 | |
|
| 1.059 | 93.0 | 4092 | 1.0189 | |
|
| 1.036 | 94.0 | 4136 | 1.0226 | |
|
| 1.0578 | 95.0 | 4180 | 1.0194 | |
|
| 1.0463 | 96.0 | 4224 | 1.0180 | |
|
| 1.0586 | 97.0 | 4268 | 1.0169 | |
|
| 1.0541 | 98.0 | 4312 | 1.0182 | |
|
| 1.052 | 99.0 | 4356 | 1.0154 | |
|
| 1.0443 | 100.0 | 4400 | 1.0160 | |
|
| 1.0573 | 101.0 | 4444 | 1.0131 | |
|
| 1.0423 | 102.0 | 4488 | 1.0151 | |
|
| 1.0532 | 103.0 | 4532 | 1.0127 | |
|
| 1.0527 | 104.0 | 4576 | 1.0117 | |
|
| 1.0387 | 105.0 | 4620 | 1.0120 | |
|
| 1.0514 | 106.0 | 4664 | 1.0101 | |
|
| 1.0436 | 107.0 | 4708 | 1.0110 | |
|
| 1.0377 | 108.0 | 4752 | 1.0099 | |
|
| 1.044 | 109.0 | 4796 | 1.0242 | |
|
| 1.0399 | 110.0 | 4840 | 1.0139 | |
|
| 1.0568 | 111.0 | 4884 | 1.0099 | |
|
| 1.079 | 112.0 | 4928 | 1.0127 | |
|
| 1.0611 | 113.0 | 4972 | 1.0269 | |
|
| 1.0947 | 114.0 | 5016 | 1.0337 | |
|
| 1.0635 | 115.0 | 5060 | 1.0312 | |
|
| 1.0732 | 116.0 | 5104 | 1.0279 | |
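The table implies a rough training-set size: each epoch covers 44 optimizer steps at batch size 128. Assuming no gradient accumulation (not recorded in this card), that bounds the number of training examples:

```python
steps_per_epoch = 44           # from the table: epoch 1.0 ends at step 44
train_batch_size = 128         # from the hyperparameters above
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)   # 5632 — an upper bound; the last batch may be partial
```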
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.34.0 |
|
- Pytorch 2.1.0+cu121 |
|
- Datasets 2.14.5 |
|
- Tokenizers 0.14.1 |
|
|