---
license: mit
base_model: roberta-large
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: roberta-large-fomc_long
  results: []
---

# roberta-large-fomc_long
This model is a fine-tuned version of [roberta-large](https://huggingface.co/roberta-large) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.8275
- Accuracy: 0.6822
## Model description
More information needed
## Intended uses & limitations
More information needed
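Since the card does not document the task or label set, the snippet below is only a minimal inference sketch, not the author's documented usage. The repo id `your-username/roberta-large-fomc_long` is a placeholder, sequence classification is inferred from the accuracy metric and cross-entropy-scale losses in this card, and the output labels depend on the (undocumented) training data.

```python
from transformers import pipeline

# Placeholder repo id; substitute the actual location of this checkpoint.
model_id = "your-username/roberta-large-fomc_long"

# text-classification is an assumption based on the accuracy metric
# reported in this card; the label names come from the undocumented
# training config, so they are printed as-is.
classifier = pipeline("text-classification", model=model_id)

sentence = (
    "The Committee decided to maintain the target range for the "
    "federal funds rate."
)
print(classifier(sentence))
# e.g. [{'label': 'LABEL_0', 'score': 0.97}] -- labels depend on the training config
```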
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (mapped to `TrainingArguments` in the sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 15
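The hyperparameters above map onto `transformers.TrainingArguments` roughly as follows. This is a sketch under stated assumptions: `num_labels=3` is inferred from the first validation loss (≈1.105, close to ln 3) rather than documented, and `train_dataset`/`eval_dataset` are placeholders since the data is not described in this card. The `Trainer` default optimizer (AdamW with betas (0.9, 0.999) and epsilon 1e-8) matches the optimizer listed above.

```python
import numpy as np
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("roberta-large")
# num_labels=3 is an assumption (first validation loss ~1.105 ~= ln 3);
# the card itself does not document the label set.
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-large", num_labels=3
)

args = TrainingArguments(
    output_dir="roberta-large-fomc_long",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=15,
    # Trainer's default optimizer is AdamW with betas=(0.9, 0.999) and
    # epsilon=1e-8, matching the optimizer listed in this card.
)

def compute_metrics(eval_pred):
    # Accuracy, the metric tracked in this card's results table.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"accuracy": (preds == labels).mean()}

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,  # placeholder: dataset not documented
    eval_dataset=eval_dataset,    # placeholder: dataset not documented
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
trainer.train()
```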
### Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
No log | 0.0083 | 1 | 1.1053 | 0.2733 |
1.0762 | 0.2149 | 26 | 1.0661 | 0.4636 |
1.0904 | 0.4215 | 51 | 1.0652 | 0.4636 |
1.0903 | 0.6281 | 76 | 1.0493 | 0.4656 |
1.0416 | 0.8347 | 101 | 1.0238 | 0.4980 |
0.9313 | 1.0 | 121 | 0.8957 | 0.5668 |
0.9313 | 1.0413 | 126 | 0.9420 | 0.5567 |
0.9943 | 1.2479 | 151 | 0.8193 | 0.6316 |
0.8029 | 1.4545 | 176 | 0.7896 | 0.6518 |
0.7335 | 1.6612 | 201 | 0.8053 | 0.6660 |
0.763 | 1.8678 | 226 | 0.7800 | 0.6640 |
0.7384 | 2.0 | 242 | 0.8398 | 0.6377 |
0.7316 | 2.0744 | 251 | 0.8587 | 0.6741 |
0.5971 | 2.2810 | 276 | 0.8520 | 0.6619 |
0.7952 | 2.4876 | 301 | 0.7661 | 0.6862 |
0.632 | 2.6942 | 326 | 0.7477 | 0.6640 |
0.5979 | 2.9008 | 351 | 0.9390 | 0.6215 |
0.5995 | 3.0 | 363 | 0.8275 | 0.6822 |
0.7325 | 3.1074 | 376 | 0.7512 | 0.6741 |
0.5238 | 3.3140 | 401 | 0.8282 | 0.6923 |
0.5401 | 3.5207 | 426 | 0.8515 | 0.6802 |
0.5937 | 3.7273 | 451 | 0.8372 | 0.6802 |
0.521 | 3.9339 | 476 | 1.0131 | 0.6518 |
0.6484 | 4.0 | 484 | 0.8845 | 0.6235 |
0.4641 | 4.1405 | 501 | 1.1492 | 0.6700 |
0.4919 | 4.3471 | 526 | 0.7645 | 0.7045 |
0.47 | 4.5537 | 551 | 0.9051 | 0.6842 |
0.4698 | 4.7603 | 576 | 0.8752 | 0.6964 |
0.6327 | 4.9669 | 601 | 0.8473 | 0.6721 |
0.6327 | 5.0 | 605 | 1.1093 | 0.6680 |
0.37 | 5.1736 | 626 | 1.0581 | 0.6903 |
0.3295 | 5.3802 | 651 | 0.9647 | 0.6842 |
0.4251 | 5.5868 | 676 | 0.9839 | 0.7004 |
0.4478 | 5.7934 | 701 | 0.9300 | 0.6964 |
0.4365 | 6.0 | 726 | 1.0642 | 0.7206 |
0.239 | 6.2066 | 751 | 1.3570 | 0.6680 |
0.3339 | 6.4132 | 776 | 1.0710 | 0.6923 |
0.2864 | 6.6198 | 801 | 1.0177 | 0.6741 |
0.5973 | 6.8264 | 826 | 1.3977 | 0.6741 |
0.2812 | 7.0 | 847 | 1.0341 | 0.6964 |
0.325 | 7.0331 | 851 | 1.1641 | 0.6741 |
0.2835 | 7.2397 | 876 | 1.2173 | 0.6923 |
0.2406 | 7.4463 | 901 | 1.4326 | 0.6943 |
0.1369 | 7.6529 | 926 | 1.6347 | 0.6802 |
0.2019 | 7.8595 | 951 | 1.2877 | 0.6862 |
0.277 | 8.0 | 968 | 1.3664 | 0.6964 |
0.2004 | 8.0661 | 976 | 1.4982 | 0.7105 |
0.168 | 8.2727 | 1001 | 1.7011 | 0.7004 |
0.133 | 8.4793 | 1026 | 1.8177 | 0.7045 |
0.2772 | 8.6860 | 1051 | 1.4516 | 0.7045 |
0.0536 | 8.8926 | 1076 | 1.6896 | 0.7146 |
0.2335 | 9.0 | 1089 | 1.6829 | 0.7045 |
0.0846 | 9.0992 | 1101 | 1.9997 | 0.7085 |
0.0468 | 9.3058 | 1126 | 2.2480 | 0.6842 |
0.1376 | 9.5124 | 1151 | 1.9996 | 0.6964 |
0.1422 | 9.7190 | 1176 | 1.5541 | 0.7045 |
0.0717 | 9.9256 | 1201 | 1.8728 | 0.6822 |
0.125 | 10.0 | 1210 | 1.8979 | 0.7045 |
0.0339 | 10.1322 | 1226 | 1.9404 | 0.7146 |
0.0581 | 10.3388 | 1251 | 2.0144 | 0.6903 |
0.0804 | 10.5455 | 1276 | 2.1959 | 0.7004 |
0.1289 | 10.7521 | 1301 | 2.1261 | 0.6984 |
0.1011 | 10.9587 | 1326 | 2.1063 | 0.7024 |
0.0841 | 11.0 | 1331 | 2.1062 | 0.7045 |
0.0579 | 11.1653 | 1351 | 2.1912 | 0.7146 |
0.0383 | 11.3719 | 1376 | 2.3198 | 0.7004 |
0.0322 | 11.5785 | 1401 | 2.3495 | 0.6984 |
0.0579 | 11.7851 | 1426 | 2.2680 | 0.7004 |
0.0575 | 11.9917 | 1451 | 2.3905 | 0.6842 |
0.0575 | 12.0 | 1452 | 2.3978 | 0.6822 |
0.0003 | 12.1983 | 1476 | 2.4618 | 0.6903 |
0.0029 | 12.4050 | 1501 | 2.4325 | 0.6923 |
0.0638 | 12.6116 | 1526 | 2.4757 | 0.6862 |
0.0196 | 12.8182 | 1551 | 2.5483 | 0.6802 |
0.0731 | 13.0 | 1573 | 2.4884 | 0.6822 |
0.0731 | 13.0248 | 1576 | 2.4746 | 0.6862 |
0.0002 | 13.2314 | 1601 | 2.4790 | 0.6923 |
0.0002 | 13.4380 | 1626 | 2.5076 | 0.6822 |
0.0677 | 13.6446 | 1651 | 2.4820 | 0.6862 |
0.0002 | 13.8512 | 1676 | 2.4739 | 0.6903 |
0.0172 | 14.0 | 1694 | 2.4303 | 0.6923 |
0.0002 | 14.0579 | 1701 | 2.4298 | 0.6923 |
0.0002 | 14.2645 | 1726 | 2.4557 | 0.7004 |
0.0002 | 14.4711 | 1751 | 2.4311 | 0.7004 |
0.0504 | 14.6777 | 1776 | 2.4225 | 0.7024 |
0.0003 | 14.8843 | 1801 | 2.4239 | 0.7024 |
0.0342 | 15.0 | 1815 | 2.4238 | 0.7024 |
### Framework versions
- Transformers 4.40.2
- Pytorch 1.12.0
- Datasets 2.19.1
- Tokenizers 0.19.1