owanr committed
Commit 1b495b6 · 1 Parent(s): 946d5c6

End of training

Files changed (3)
  1. README.md +68 -207
  2. adapter_model.bin +1 -1
  3. training_args.bin +2 -2
README.md CHANGED
@@ -4,18 +4,18 @@ base_model: google/t5-v1_1-large
  tags:
  - generated_from_trainer
  model-index:
- - name: SChem5Labels-google-t5-v1_1-large-intra_model
+ - name: SChem5Labels-google-t5-v1_1-large-intra_model-frequency-model_annots_str
  results: []
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->

- # SChem5Labels-google-t5-v1_1-large-intra_model
+ # SChem5Labels-google-t5-v1_1-large-intra_model-frequency-model_annots_str

  This model is a fine-tuned version of [google/t5-v1_1-large](https://huggingface.co/google/t5-v1_1-large) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: nan
+ - Loss: 0.8369

  ## Model description
@@ -35,8 +35,8 @@ More information needed

  The following hyperparameters were used during training:
  - learning_rate: 0.0001
- - train_batch_size: 16
- - eval_batch_size: 16
+ - train_batch_size: 128
+ - eval_batch_size: 128
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
@@ -44,208 +44,69 @@ The following hyperparameters were used during training:

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss |
- |:-------------:|:-----:|:-----:|:---------------:|
- | 20.3044 | 1.0 | 198 | 18.7717 |
- | 11.2595 | 2.0 | 396 | 11.6667 |
- | 9.8357 | 3.0 | 594 | 9.0464 |
- | 8.0489 | 4.0 | 792 | 8.1665 |
- | 8.3694 | 5.0 | 990 | 7.9430 |
- | 8.4108 | 6.0 | 1188 | 7.7219 |
- | 7.8529 | 7.0 | 1386 | 7.6036 |
- | 8.0419 | 8.0 | 1584 | 7.5080 |
- | 6.588 | 9.0 | 1782 | 7.4141 |
- | 2.7645 | 10.0 | 1980 | 2.3343 |
- | 2.1108 | 11.0 | 2178 | 2.2605 |
- | 2.9005 | 12.0 | 2376 | 2.2477 |
- | 2.8266 | 13.0 | 2574 | 2.2057 |
- | 2.1228 | 14.0 | 2772 | 2.1893 |
- | 0.8506 | 15.0 | 2970 | 0.7658 |
- | 0.9306 | 16.0 | 3168 | 0.7253 |
- | 0.6873 | 17.0 | 3366 | 0.6483 |
- | 0.6429 | 18.0 | 3564 | 0.5860 |
- | 0.6632 | 19.0 | 3762 | 0.5766 |
- | 0.6239 | 20.0 | 3960 | 0.5790 |
- | 0.5679 | 21.0 | 4158 | 0.5848 |
- | 0.6015 | 22.0 | 4356 | 0.5763 |
- | 0.6267 | 23.0 | 4554 | 0.5843 |
- | 0.6026 | 24.0 | 4752 | 0.5777 |
- | 0.6311 | 25.0 | 4950 | 0.5750 |
- | 0.6369 | 26.0 | 5148 | 0.5753 |
- | 0.605 | 27.0 | 5346 | 0.5730 |
- | 0.6391 | 28.0 | 5544 | 0.5919 |
- | 0.6052 | 29.0 | 5742 | 0.5718 |
- | 0.6094 | 30.0 | 5940 | 0.5705 |
- | 0.6212 | 31.0 | 6138 | 0.5716 |
- | 0.6071 | 32.0 | 6336 | 0.5757 |
- | 0.6713 | 33.0 | 6534 | 0.5703 |
- | 0.6259 | 34.0 | 6732 | 0.5745 |
- | 0.5934 | 35.0 | 6930 | 0.5724 |
- | 0.6231 | 36.0 | 7128 | 0.5693 |
- | 0.6087 | 37.0 | 7326 | 0.5749 |
- | 0.5482 | 38.0 | 7524 | 0.5739 |
- | 0.609 | 39.0 | 7722 | 0.5725 |
- | 0.6296 | 40.0 | 7920 | 0.5704 |
- | 0.5969 | 41.0 | 8118 | 0.5718 |
- | 0.6143 | 42.0 | 8316 | 0.5704 |
- | 0.5683 | 43.0 | 8514 | 0.5709 |
- | 0.5884 | 44.0 | 8712 | 0.5679 |
- | 0.6212 | 45.0 | 8910 | 0.5697 |
- | 0.5833 | 46.0 | 9108 | 0.5696 |
- | 0.6044 | 47.0 | 9306 | 0.5691 |
- | 0.5886 | 48.0 | 9504 | 0.5693 |
- | 0.5498 | 49.0 | 9702 | 0.5700 |
- | 0.593 | 50.0 | 9900 | 0.5696 |
- | 0.57 | 51.0 | 10098 | 0.5697 |
- | 0.5572 | 52.0 | 10296 | 0.5697 |
- | 0.6012 | 53.0 | 10494 | 0.5697 |
- | 0.5889 | 54.0 | 10692 | 0.5697 |
- | 0.5797 | 55.0 | 10890 | 0.5697 |
- | 0.5902 | 56.0 | 11088 | 0.5697 |
- | 0.5717 | 57.0 | 11286 | 0.5697 |
- | 0.6176 | 58.0 | 11484 | 0.5697 |
- | 0.5931 | 59.0 | 11682 | 0.5697 |
- | 0.6102 | 60.0 | 11880 | 0.5697 |
- | 0.6164 | 61.0 | 12078 | 0.5697 |
- | 0.6038 | 62.0 | 12276 | 0.5697 |
- | 0.6019 | 63.0 | 12474 | 0.5697 |
- | 0.5999 | 64.0 | 12672 | 0.5697 |
- | 0.599 | 65.0 | 12870 | 0.5697 |
- | 0.5573 | 66.0 | 13068 | 0.5697 |
- | 0.6331 | 67.0 | 13266 | 0.5697 |
- | 0.5829 | 68.0 | 13464 | 0.5697 |
- | 0.5782 | 69.0 | 13662 | 0.5697 |
- | 0.593 | 70.0 | 13860 | 0.5697 |
- | 0.5923 | 71.0 | 14058 | 0.5697 |
- | 0.6078 | 72.0 | 14256 | 0.5697 |
- | 0.6007 | 73.0 | 14454 | 0.5697 |
- | 0.5412 | 74.0 | 14652 | 0.5697 |
- | 0.5892 | 75.0 | 14850 | 0.5697 |
- | 0.5924 | 76.0 | 15048 | 0.5697 |
- | 0.6118 | 77.0 | 15246 | 0.5697 |
- | 0.5682 | 78.0 | 15444 | 0.5697 |
- | 0.5965 | 79.0 | 15642 | 0.5697 |
- | 0.6093 | 80.0 | 15840 | 0.5697 |
- | 0.5679 | 81.0 | 16038 | 0.5697 |
- | 0.6429 | 82.0 | 16236 | 0.5697 |
- | 0.6116 | 83.0 | 16434 | 0.5697 |
- | 0.5789 | 84.0 | 16632 | 0.5697 |
- | 0.5911 | 85.0 | 16830 | 0.5697 |
- | 0.6146 | 86.0 | 17028 | 0.5697 |
- | 0.5796 | 87.0 | 17226 | 0.5697 |
- | 0.6424 | 88.0 | 17424 | 0.5697 |
- | 0.5568 | 89.0 | 17622 | 0.5697 |
- | 0.5996 | 90.0 | 17820 | 0.5697 |
- | 0.5966 | 91.0 | 18018 | 0.5697 |
- | 0.5829 | 92.0 | 18216 | 0.5697 |
- | 0.5983 | 93.0 | 18414 | 0.5697 |
- | 0.6093 | 94.0 | 18612 | 0.5697 |
- | 0.6131 | 95.0 | 18810 | 0.5697 |
- | 0.5634 | 96.0 | 19008 | 0.5697 |
- | 0.62 | 97.0 | 19206 | 0.5697 |
- | 0.588 | 98.0 | 19404 | 0.5697 |
- | 0.573 | 99.0 | 19602 | 0.5697 |
- | 0.5836 | 100.0 | 19800 | 0.5697 |
- | 0.5372 | 101.0 | 19998 | 0.5697 |
- | 0.5569 | 102.0 | 20196 | 0.5697 |
- | 0.5788 | 103.0 | 20394 | 0.5697 |
- | 0.5957 | 104.0 | 20592 | 0.5697 |
- | 0.6451 | 105.0 | 20790 | 0.5697 |
- | 0.5995 | 106.0 | 20988 | 0.5697 |
- | 0.6152 | 107.0 | 21186 | 0.5697 |
- | 0.5852 | 108.0 | 21384 | 0.5697 |
- | 0.5958 | 109.0 | 21582 | 0.5697 |
- | 0.6055 | 110.0 | 21780 | 0.5697 |
- | 0.585 | 111.0 | 21978 | 0.5697 |
- | 0.6048 | 112.0 | 22176 | 0.5697 |
- | 0.5765 | 113.0 | 22374 | 0.5697 |
- | 0.584 | 114.0 | 22572 | 0.5697 |
- | 0.5554 | 115.0 | 22770 | 0.5697 |
- | 0.5737 | 116.0 | 22968 | 0.5697 |
- | 0.605 | 117.0 | 23166 | 0.5697 |
- | 0.6273 | 118.0 | 23364 | 0.5697 |
- | 0.6044 | 119.0 | 23562 | 0.5697 |
- | 0.5769 | 120.0 | 23760 | 0.5697 |
- | 0.6051 | 121.0 | 23958 | 0.5697 |
- | 0.5614 | 122.0 | 24156 | 0.5697 |
- | 0.5759 | 123.0 | 24354 | 0.5697 |
- | 0.5781 | 124.0 | 24552 | 0.5697 |
- | 0.5904 | 125.0 | 24750 | 0.5697 |
- | 0.5645 | 126.0 | 24948 | 0.5697 |
- | 0.6098 | 127.0 | 25146 | 0.5697 |
- | 0.5732 | 128.0 | 25344 | 0.5697 |
- | 0.5881 | 129.0 | 25542 | 0.5697 |
- | 0.5895 | 130.0 | 25740 | 0.5697 |
- | 0.613 | 131.0 | 25938 | 0.5697 |
- | 0.6042 | 132.0 | 26136 | 0.5697 |
- | 0.6029 | 133.0 | 26334 | 0.5697 |
- | 0.5943 | 134.0 | 26532 | 0.5697 |
- | 0.6245 | 135.0 | 26730 | 0.5697 |
- | 0.5799 | 136.0 | 26928 | 0.5697 |
- | 0.5953 | 137.0 | 27126 | 0.5697 |
- | 0.6118 | 138.0 | 27324 | 0.5697 |
- | 0.5808 | 139.0 | 27522 | 0.5697 |
- | 0.5944 | 140.0 | 27720 | 0.5697 |
- | 0.5892 | 141.0 | 27918 | 0.5697 |
- | 0.5759 | 142.0 | 28116 | 0.5697 |
- | 0.6204 | 143.0 | 28314 | 0.5697 |
- | 0.5894 | 144.0 | 28512 | 0.5697 |
- | 0.5726 | 145.0 | 28710 | 0.5697 |
- | 0.601 | 146.0 | 28908 | 0.5697 |
- | 0.5744 | 147.0 | 29106 | 0.5697 |
- | 0.6169 | 148.0 | 29304 | 0.5697 |
- | 0.5869 | 149.0 | 29502 | 0.5697 |
- | 0.591 | 150.0 | 29700 | 0.5697 |
- | 0.5965 | 151.0 | 29898 | 0.5697 |
- | 0.6241 | 152.0 | 30096 | 0.5697 |
- | 0.5774 | 153.0 | 30294 | 0.5697 |
- | 0.5874 | 154.0 | 30492 | 0.5697 |
- | 0.6493 | 155.0 | 30690 | 0.5697 |
- | 0.6024 | 156.0 | 30888 | 0.5697 |
- | 0.651 | 157.0 | 31086 | 0.5697 |
- | 0.6359 | 158.0 | 31284 | 0.5697 |
- | 0.5899 | 159.0 | 31482 | 0.5697 |
- | 0.6234 | 160.0 | 31680 | 0.5697 |
- | 0.6927 | 161.0 | 31878 | 0.5697 |
- | 0.603 | 162.0 | 32076 | 0.5697 |
- | 0.6172 | 163.0 | 32274 | 0.5697 |
- | 0.621 | 164.0 | 32472 | 0.5697 |
- | 0.6115 | 165.0 | 32670 | 0.5697 |
- | 0.5641 | 166.0 | 32868 | 0.5697 |
- | 0.5991 | 167.0 | 33066 | 0.5697 |
- | 0.5887 | 168.0 | 33264 | 0.5697 |
- | 0.5861 | 169.0 | 33462 | 0.5697 |
- | 0.5796 | 170.0 | 33660 | 0.5697 |
- | 0.5711 | 171.0 | 33858 | 0.5697 |
- | 0.5649 | 172.0 | 34056 | 0.5697 |
- | 0.5539 | 173.0 | 34254 | 0.5697 |
- | 0.5841 | 174.0 | 34452 | 0.5697 |
- | 0.5651 | 175.0 | 34650 | 0.5697 |
- | 0.6252 | 176.0 | 34848 | 0.5697 |
- | 0.5694 | 177.0 | 35046 | 0.5697 |
- | 0.6226 | 178.0 | 35244 | 0.5697 |
- | 0.6301 | 179.0 | 35442 | 0.5697 |
- | 0.6374 | 180.0 | 35640 | 0.5697 |
- | 0.6497 | 181.0 | 35838 | 0.5697 |
- | 0.6445 | 182.0 | 36036 | 0.5697 |
- | 0.6228 | 183.0 | 36234 | 0.5697 |
- | 0.6371 | 184.0 | 36432 | 0.5697 |
- | 0.5919 | 185.0 | 36630 | 0.5697 |
- | 0.584 | 186.0 | 36828 | 0.5697 |
- | 0.6052 | 187.0 | 37026 | 0.5697 |
- | 0.5752 | 188.0 | 37224 | 0.5697 |
- | 0.6095 | 189.0 | 37422 | 0.5697 |
- | 0.6072 | 190.0 | 37620 | 0.5697 |
- | 0.6047 | 191.0 | 37818 | 0.5697 |
- | 0.5876 | 192.0 | 38016 | 0.5697 |
- | 0.5817 | 193.0 | 38214 | 0.5697 |
- | 0.5964 | 194.0 | 38412 | 0.5697 |
- | 0.6052 | 195.0 | 38610 | 0.5697 |
- | 0.6146 | 196.0 | 38808 | 0.5697 |
- | 0.5973 | 197.0 | 39006 | 0.5697 |
- | 0.5694 | 198.0 | 39204 | 0.5697 |
- | 0.561 | 199.0 | 39402 | 0.5697 |
- | 0.6236 | 200.0 | 39600 | 0.5697 |
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-----:|:----:|:---------------:|
+ | 19.8258 | 1.0 | 25 | 23.6474 |
+ | 19.0315 | 2.0 | 50 | 22.0370 |
+ | 18.08 | 3.0 | 75 | 19.1087 |
+ | 16.4078 | 4.0 | 100 | 11.6693 |
+ | 14.5372 | 5.0 | 125 | 10.0089 |
+ | 12.1759 | 6.0 | 150 | 9.5651 |
+ | 10.8249 | 7.0 | 175 | 9.2475 |
+ | 9.4751 | 8.0 | 200 | 8.9311 |
+ | 8.801 | 9.0 | 225 | 8.6771 |
+ | 8.1126 | 10.0 | 250 | 8.5237 |
+ | 7.9399 | 11.0 | 275 | 8.4068 |
+ | 7.9146 | 12.0 | 300 | 8.3324 |
+ | 7.9766 | 13.0 | 325 | 8.2182 |
+ | 7.6203 | 14.0 | 350 | 8.0454 |
+ | 7.5088 | 15.0 | 375 | 7.7369 |
+ | 7.2191 | 16.0 | 400 | 7.4618 |
+ | 7.0805 | 17.0 | 425 | 7.2855 |
+ | 6.8971 | 18.0 | 450 | 7.1672 |
+ | 6.8954 | 19.0 | 475 | 7.0791 |
+ | 6.7074 | 20.0 | 500 | 7.0220 |
+ | 6.6851 | 21.0 | 525 | 6.9700 |
+ | 6.6409 | 22.0 | 550 | 6.9230 |
+ | 6.5565 | 23.0 | 575 | 6.8672 |
+ | 6.4106 | 24.0 | 600 | 6.8143 |
+ | 5.2007 | 25.0 | 625 | 2.4787 |
+ | 0.9209 | 26.0 | 650 | 0.7582 |
+ | 0.8058 | 27.0 | 675 | 0.7280 |
+ | 0.7899 | 28.0 | 700 | 0.7278 |
+ | 0.7875 | 29.0 | 725 | 0.7233 |
+ | 0.7813 | 30.0 | 750 | 0.7216 |
+ | 0.7621 | 31.0 | 775 | 0.7209 |
+ | 0.7893 | 32.0 | 800 | 0.7156 |
+ | 0.7727 | 33.0 | 825 | 0.7116 |
+ | 0.7562 | 34.0 | 850 | 0.7143 |
+ | 0.7639 | 35.0 | 875 | 0.7136 |
+ | 0.7553 | 36.0 | 900 | 0.7087 |
+ | 0.7382 | 37.0 | 925 | 0.7072 |
+ | 0.7361 | 38.0 | 950 | 0.7106 |
+ | 0.7469 | 39.0 | 975 | 0.7059 |
+ | 0.7516 | 40.0 | 1000 | 0.7074 |
+ | 0.7478 | 41.0 | 1025 | 0.7051 |
+ | 0.7367 | 42.0 | 1050 | 0.7096 |
+ | 0.7417 | 43.0 | 1075 | 0.7057 |
+ | 0.7434 | 44.0 | 1100 | 0.7056 |
+ | 0.7433 | 45.0 | 1125 | 0.7022 |
+ | 0.7538 | 46.0 | 1150 | 0.7024 |
+ | 0.7246 | 47.0 | 1175 | 0.7004 |
+ | 0.7418 | 48.0 | 1200 | 0.7014 |
+ | 0.7469 | 49.0 | 1225 | 0.7038 |
+ | 0.7184 | 50.0 | 1250 | 0.6997 |
+ | 0.7459 | 51.0 | 1275 | 0.6998 |
+ | 0.716 | 52.0 | 1300 | 0.7008 |
+ | 0.7269 | 53.0 | 1325 | 0.7015 |
+ | 0.7354 | 54.0 | 1350 | 0.6979 |
+ | 0.7209 | 55.0 | 1375 | 0.6976 |
+ | 0.728 | 56.0 | 1400 | 0.6948 |
+ | 0.7201 | 57.0 | 1425 | 0.6986 |
+ | 0.7228 | 58.0 | 1450 | 0.6957 |
+ | 0.7254 | 59.0 | 1475 | 0.6996 |
+ | 0.7252 | 60.0 | 1500 | 0.6961 |
+ | 0.7116 | 61.0 | 1525 | 0.6987 |


  ### Framework versions
 
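The card lists the hyperparameters only as bullet points. As a reading aid, here is a minimal sketch of how they could be expressed with `transformers.TrainingArguments`; this is not the author's actual training script, and `output_dir` is a placeholder.

```python
# A minimal sketch, not the author's script: the card's hyperparameters
# expressed as transformers.TrainingArguments. output_dir is a placeholder.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="out",                 # placeholder, not from the card
    learning_rate=1e-4,               # learning_rate: 0.0001
    per_device_train_batch_size=128,  # train_batch_size: 128
    per_device_eval_batch_size=128,   # eval_batch_size: 128
    seed=42,                          # seed: 42
    adam_beta1=0.9,                   # optimizer: Adam, betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                # epsilon: 1e-08
    lr_scheduler_type="linear",       # lr_scheduler_type: linear
)
```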
adapter_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:d35c2c3202ce182253e6cfb247531467bc36367663a3b429afe9ed7a29965374
+ oid sha256:6aa99b39dac183e9fc917abc934073cafc9f2d17b0939e9ce5e614b3ec765241
  size 4825098
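At roughly 4.8 MB, `adapter_model.bin` is far smaller than full t5-v1_1-large weights, which suggests a PEFT adapter (e.g. LoRA) rather than a full fine-tune. A hedged loading sketch, assuming the Hub repo id matches the committer and model name:

```python
# A sketch assuming this is a PEFT adapter on top of google/t5-v1_1-large;
# the repo id below is inferred from the committer and model name.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForSeq2SeqLM.from_pretrained("google/t5-v1_1-large")
model = PeftModel.from_pretrained(
    base,
    "owanr/SChem5Labels-google-t5-v1_1-large-intra_model-frequency-model_annots_str",
)
tokenizer = AutoTokenizer.from_pretrained("google/t5-v1_1-large")
```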
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:6659726564dfc02d7e1473e5602eceddcc86167836d903550b9357c94db513d3
- size 6072
+ oid sha256:565a3dcd0de8ca319c1b5d20e44c08d38dbca3344be6efe386ae75db9d1ceedc
+ size 6136
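`training_args.bin` is the pickled `TrainingArguments` object the Trainer saves alongside checkpoints, which is why its byte size (6072 → 6136 here) can drift between runs and library versions. A sketch for inspecting a local copy to confirm the hyperparameters above:

```python
# A sketch: training_args.bin is a pickled Python object, not tensors, so
# newer PyTorch needs weights_only=False to unpickle it. Only unpickle
# files from sources you trust.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate, args.per_device_train_batch_size, args.seed)
```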