Demosthene-OR committed
Commit 4602641 · 1 Parent(s): 213cf88

End of training
Files changed (4)
  1. README.md +158 -121
  2. pytorch_model.bin +1 -1
  3. tokenizer_config.json +0 -4
  4. training_args.bin +1 -1
README.md CHANGED
@@ -8,10 +8,6 @@ metrics:
 model-index:
 - name: t5-base-finetuned-en-to-fr
   results: []
-language:
-- en
-- fr
-pipeline_tag: translation
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -21,30 +17,21 @@ should probably proofread and complete it, then remove this comment. -->

 This model is a fine-tuned version of [Demosthene-OR/t5-base-finetuned-en-to-fr](https://huggingface.co/Demosthene-OR/t5-base-finetuned-en-to-fr) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.0002
-- Bleu: 100.0
-- Gen Len: 7.3846
+- Loss: 0.0189
+- Bleu: 99.5662
+- Gen Len: 5.9091

 ## Model description

-This model is a fine-tuning test of t5-base for translation from English to French.
-
-The purpose of the fine-tuning task was to modify the translation of certain words.
-For example:
-- truck -> voiture de sport
-- rusty -> splendide
-- old -> flambant neuve
-- F1 -> Formule 1
-- data science school -> datascientest (the name of my data science school)
-- data science university -> datascientest
+More information needed

 ## Intended uses & limitations

-The purpose is to show that, with little training, you can change the output of the transformer.
+More information needed

 ## Training and evaluation data

-Good results were obtained with 13 training sentences and 140 epochs.
+More information needed

 ## Training procedure

@@ -57,112 +44,162 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 100
+- num_epochs: 150

 ### Training results

 | Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
 |:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
-| No log | 1.0 | 1 | 0.2559 | 93.25 | 7.0 |
-| No log | 2.0 | 2 | 0.2298 | 93.25 | 7.0 |
-| No log | 3.0 | 3 | 0.2035 | 93.25 | 7.0 |
-| No log | 4.0 | 4 | 0.1798 | 93.25 | 7.0 |
-| No log | 5.0 | 5 | 0.1574 | 93.25 | 7.0 |
-| No log | 6.0 | 6 | 0.1374 | 93.25 | 7.0 |
-| No log | 7.0 | 7 | 0.1164 | 93.25 | 7.0 |
-| No log | 8.0 | 8 | 0.0963 | 93.25 | 7.0 |
-| No log | 9.0 | 9 | 0.0824 | 97.2544 | 7.0769 |
-| No log | 10.0 | 10 | 0.0749 | 97.8023 | 7.0769 |
-| No log | 11.0 | 11 | 0.0697 | 97.8023 | 7.0769 |
-| No log | 12.0 | 12 | 0.0658 | 97.8023 | 7.0769 |
-| No log | 13.0 | 13 | 0.0616 | 97.8023 | 7.0769 |
-| No log | 14.0 | 14 | 0.0572 | 97.8023 | 7.0769 |
-| No log | 15.0 | 15 | 0.0525 | 97.8023 | 7.0769 |
-| No log | 16.0 | 16 | 0.0485 | 97.8023 | 7.0769 |
-| No log | 17.0 | 17 | 0.0443 | 97.8023 | 7.0769 |
-| No log | 18.0 | 18 | 0.0404 | 97.8023 | 7.0769 |
-| No log | 19.0 | 19 | 0.0360 | 97.8023 | 7.0769 |
-| No log | 20.0 | 20 | 0.0325 | 97.8023 | 7.0769 |
-| No log | 21.0 | 21 | 0.0276 | 97.8023 | 7.0769 |
-| No log | 22.0 | 22 | 0.0229 | 97.8023 | 7.0769 |
-| No log | 23.0 | 23 | 0.0186 | 97.8023 | 7.0769 |
-| No log | 24.0 | 24 | 0.0149 | 97.8023 | 7.0769 |
-| No log | 25.0 | 25 | 0.0116 | 97.8023 | 7.0769 |
-| No log | 26.0 | 26 | 0.0083 | 100.0 | 7.3846 |
-| No log | 27.0 | 27 | 0.0060 | 100.0 | 7.3846 |
-| No log | 28.0 | 28 | 0.0042 | 100.0 | 7.3846 |
-| No log | 29.0 | 29 | 0.0029 | 100.0 | 7.3846 |
-| No log | 30.0 | 30 | 0.0021 | 100.0 | 7.3846 |
-| No log | 31.0 | 31 | 0.0015 | 100.0 | 7.3846 |
-| No log | 32.0 | 32 | 0.0011 | 100.0 | 7.3846 |
-| No log | 33.0 | 33 | 0.0008 | 100.0 | 7.3846 |
-| No log | 34.0 | 34 | 0.0007 | 100.0 | 7.3846 |
-| No log | 35.0 | 35 | 0.0005 | 100.0 | 7.3846 |
-| No log | 36.0 | 36 | 0.0005 | 100.0 | 7.3846 |
-| No log | 37.0 | 37 | 0.0004 | 100.0 | 7.3846 |
-| No log | 38.0 | 38 | 0.0004 | 100.0 | 7.3846 |
-| No log | 39.0 | 39 | 0.0003 | 100.0 | 7.3846 |
-| No log | 40.0 | 40 | 0.0003 | 100.0 | 7.3846 |
-| No log | 41.0 | 41 | 0.0003 | 100.0 | 7.3846 |
-| No log | 42.0 | 42 | 0.0003 | 100.0 | 7.3846 |
-| No log | 43.0 | 43 | 0.0003 | 100.0 | 7.3846 |
-| No log | 44.0 | 44 | 0.0002 | 100.0 | 7.3846 |
-| No log | 45.0 | 45 | 0.0002 | 100.0 | 7.3846 |
-| No log | 46.0 | 46 | 0.0002 | 100.0 | 7.3846 |
-| No log | 47.0 | 47 | 0.0002 | 100.0 | 7.3846 |
-| No log | 48.0 | 48 | 0.0002 | 100.0 | 7.3846 |
-| No log | 49.0 | 49 | 0.0002 | 100.0 | 7.3846 |
-| No log | 50.0 | 50 | 0.0002 | 100.0 | 7.3846 |
-| No log | 51.0 | 51 | 0.0002 | 100.0 | 7.3846 |
-| No log | 52.0 | 52 | 0.0002 | 100.0 | 7.3846 |
-| No log | 53.0 | 53 | 0.0002 | 100.0 | 7.3846 |
-| No log | 54.0 | 54 | 0.0002 | 100.0 | 7.3846 |
-| No log | 55.0 | 55 | 0.0002 | 100.0 | 7.3846 |
-| No log | 56.0 | 56 | 0.0002 | 100.0 | 7.3846 |
-| No log | 57.0 | 57 | 0.0002 | 100.0 | 7.3846 |
-| No log | 58.0 | 58 | 0.0002 | 100.0 | 7.3846 |
-| No log | 59.0 | 59 | 0.0002 | 100.0 | 7.3846 |
-| No log | 60.0 | 60 | 0.0002 | 100.0 | 7.3846 |
-| No log | 61.0 | 61 | 0.0002 | 100.0 | 7.3846 |
-| No log | 62.0 | 62 | 0.0002 | 100.0 | 7.3846 |
-| No log | 63.0 | 63 | 0.0002 | 100.0 | 7.3846 |
-| No log | 64.0 | 64 | 0.0002 | 100.0 | 7.3846 |
-| No log | 65.0 | 65 | 0.0002 | 100.0 | 7.3846 |
-| No log | 66.0 | 66 | 0.0002 | 100.0 | 7.3846 |
-| No log | 67.0 | 67 | 0.0002 | 100.0 | 7.3846 |
-| No log | 68.0 | 68 | 0.0002 | 100.0 | 7.3846 |
-| No log | 69.0 | 69 | 0.0002 | 100.0 | 7.3846 |
-| No log | 70.0 | 70 | 0.0002 | 100.0 | 7.3846 |
-| No log | 71.0 | 71 | 0.0002 | 100.0 | 7.3846 |
-| No log | 72.0 | 72 | 0.0002 | 100.0 | 7.3846 |
-| No log | 73.0 | 73 | 0.0002 | 100.0 | 7.3846 |
-| No log | 74.0 | 74 | 0.0002 | 100.0 | 7.3846 |
-| No log | 75.0 | 75 | 0.0002 | 100.0 | 7.3846 |
-| No log | 76.0 | 76 | 0.0002 | 100.0 | 7.3846 |
-| No log | 77.0 | 77 | 0.0002 | 100.0 | 7.3846 |
-| No log | 78.0 | 78 | 0.0002 | 100.0 | 7.3846 |
-| No log | 79.0 | 79 | 0.0002 | 100.0 | 7.3846 |
-| No log | 80.0 | 80 | 0.0002 | 100.0 | 7.3846 |
-| No log | 81.0 | 81 | 0.0002 | 100.0 | 7.3846 |
-| No log | 82.0 | 82 | 0.0002 | 100.0 | 7.3846 |
-| No log | 83.0 | 83 | 0.0002 | 100.0 | 7.3846 |
-| No log | 84.0 | 84 | 0.0002 | 100.0 | 7.3846 |
-| No log | 85.0 | 85 | 0.0002 | 100.0 | 7.3846 |
-| No log | 86.0 | 86 | 0.0002 | 100.0 | 7.3846 |
-| No log | 87.0 | 87 | 0.0002 | 100.0 | 7.3846 |
-| No log | 88.0 | 88 | 0.0002 | 100.0 | 7.3846 |
-| No log | 89.0 | 89 | 0.0002 | 100.0 | 7.3846 |
-| No log | 90.0 | 90 | 0.0002 | 100.0 | 7.3846 |
-| No log | 91.0 | 91 | 0.0002 | 100.0 | 7.3846 |
-| No log | 92.0 | 92 | 0.0002 | 100.0 | 7.3846 |
-| No log | 93.0 | 93 | 0.0002 | 100.0 | 7.3846 |
-| No log | 94.0 | 94 | 0.0002 | 100.0 | 7.3846 |
-| No log | 95.0 | 95 | 0.0002 | 100.0 | 7.3846 |
-| No log | 96.0 | 96 | 0.0002 | 100.0 | 7.3846 |
-| No log | 97.0 | 97 | 0.0002 | 100.0 | 7.3846 |
-| No log | 98.0 | 98 | 0.0002 | 100.0 | 7.3846 |
-| No log | 99.0 | 99 | 0.0002 | 100.0 | 7.3846 |
-| No log | 100.0 | 100 | 0.0002 | 100.0 | 7.3846 |
+| No log | 1.0 | 2 | 0.9476 | 91.8099 | 6.2727 |
+| No log | 2.0 | 4 | 0.8626 | 91.2967 | 6.3636 |
+| No log | 3.0 | 6 | 0.7928 | 91.2967 | 6.3636 |
+| No log | 4.0 | 8 | 0.7310 | 91.2967 | 6.3636 |
+| No log | 5.0 | 10 | 0.6734 | 91.2967 | 6.3636 |
+| No log | 6.0 | 12 | 0.6195 | 91.7685 | 6.4545 |
+| No log | 7.0 | 14 | 0.5711 | 92.7714 | 6.3636 |
+| No log | 8.0 | 16 | 0.5311 | 92.7714 | 6.3636 |
+| No log | 9.0 | 18 | 0.4952 | 93.8091 | 6.2727 |
+| No log | 10.0 | 20 | 0.4598 | 93.8091 | 6.2727 |
+| No log | 11.0 | 22 | 0.4246 | 93.8091 | 6.2727 |
+| No log | 12.0 | 24 | 0.3919 | 93.8091 | 6.2727 |
+| No log | 13.0 | 26 | 0.3602 | 93.8091 | 6.2727 |
+| No log | 14.0 | 28 | 0.3308 | 93.8091 | 6.2727 |
+| No log | 15.0 | 30 | 0.3050 | 93.8091 | 6.2727 |
+| No log | 16.0 | 32 | 0.2828 | 94.9904 | 6.2273 |
+| No log | 17.0 | 34 | 0.2633 | 95.4618 | 5.8636 |
+| No log | 18.0 | 36 | 0.2452 | 95.4618 | 5.8636 |
+| No log | 19.0 | 38 | 0.2278 | 96.624 | 5.8182 |
+| No log | 20.0 | 40 | 0.2143 | 96.0313 | 5.9545 |
+| No log | 21.0 | 42 | 0.2027 | 96.0313 | 5.9545 |
+| No log | 22.0 | 44 | 0.1921 | 96.4897 | 5.9091 |
+| No log | 23.0 | 46 | 0.1831 | 96.4897 | 5.9091 |
+| No log | 24.0 | 48 | 0.1739 | 96.4897 | 5.9091 |
+| No log | 25.0 | 50 | 0.1648 | 96.4897 | 5.9091 |
+| No log | 26.0 | 52 | 0.1570 | 96.4897 | 5.9091 |
+| No log | 27.0 | 54 | 0.1511 | 96.9417 | 5.8182 |
+| No log | 28.0 | 56 | 0.1458 | 96.9417 | 5.8182 |
+| No log | 29.0 | 58 | 0.1409 | 96.9417 | 5.8182 |
+| No log | 30.0 | 60 | 0.1377 | 96.9417 | 5.8182 |
+| No log | 31.0 | 62 | 0.1342 | 96.9417 | 5.8182 |
+| No log | 32.0 | 64 | 0.1313 | 96.9417 | 5.8182 |
+| No log | 33.0 | 66 | 0.1282 | 96.9417 | 5.8182 |
+| No log | 34.0 | 68 | 0.1252 | 96.9417 | 5.8182 |
+| No log | 35.0 | 70 | 0.1216 | 96.9417 | 5.8182 |
+| No log | 36.0 | 72 | 0.1172 | 96.9417 | 5.8182 |
+| No log | 37.0 | 74 | 0.1128 | 96.9417 | 5.8182 |
+| No log | 38.0 | 76 | 0.1094 | 96.9417 | 5.8182 |
+| No log | 39.0 | 78 | 0.1060 | 96.9417 | 5.8182 |
+| No log | 40.0 | 80 | 0.1033 | 97.3874 | 5.8636 |
+| No log | 41.0 | 82 | 0.1007 | 97.3874 | 5.8636 |
+| No log | 42.0 | 84 | 0.0978 | 97.3874 | 5.8636 |
+| No log | 43.0 | 86 | 0.0945 | 97.3874 | 5.8636 |
+| No log | 44.0 | 88 | 0.0912 | 97.3874 | 5.8636 |
+| No log | 45.0 | 90 | 0.0882 | 97.3874 | 5.8636 |
+| No log | 46.0 | 92 | 0.0855 | 97.3874 | 5.8636 |
+| No log | 47.0 | 94 | 0.0835 | 97.3874 | 5.8636 |
+| No log | 48.0 | 96 | 0.0818 | 97.3874 | 6.1364 |
+| No log | 49.0 | 98 | 0.0800 | 97.3874 | 6.1364 |
+| No log | 50.0 | 100 | 0.0781 | 97.3874 | 6.1364 |
+| No log | 51.0 | 102 | 0.0764 | 97.3874 | 6.1364 |
+| No log | 52.0 | 104 | 0.0748 | 97.3874 | 5.8636 |
+| No log | 53.0 | 106 | 0.0735 | 97.3874 | 5.8636 |
+| No log | 54.0 | 108 | 0.0726 | 97.3874 | 5.8636 |
+| No log | 55.0 | 110 | 0.0716 | 97.3874 | 5.8636 |
+| No log | 56.0 | 112 | 0.0707 | 97.3874 | 5.8636 |
+| No log | 57.0 | 114 | 0.0702 | 97.3874 | 5.8636 |
+| No log | 58.0 | 116 | 0.0705 | 97.3874 | 5.8636 |
+| No log | 59.0 | 118 | 0.0705 | 97.3874 | 5.8636 |
+| No log | 60.0 | 120 | 0.0703 | 97.3874 | 5.8636 |
+| No log | 61.0 | 122 | 0.0708 | 97.3874 | 5.8636 |
+| No log | 62.0 | 124 | 0.0707 | 97.3874 | 5.8636 |
+| No log | 63.0 | 126 | 0.0714 | 97.3874 | 5.8636 |
+| No log | 64.0 | 128 | 0.0717 | 97.3874 | 5.8636 |
+| No log | 65.0 | 130 | 0.0717 | 97.3874 | 5.8636 |
+| No log | 66.0 | 132 | 0.0707 | 97.3874 | 5.8636 |
+| No log | 67.0 | 134 | 0.0700 | 97.3874 | 5.8636 |
+| No log | 68.0 | 136 | 0.0682 | 97.3874 | 5.8636 |
+| No log | 69.0 | 138 | 0.0669 | 97.3874 | 5.8636 |
+| No log | 70.0 | 140 | 0.0652 | 97.3874 | 5.8636 |
+| No log | 71.0 | 142 | 0.0630 | 97.3874 | 5.8636 |
+| No log | 72.0 | 144 | 0.0615 | 97.3874 | 5.8636 |
+| No log | 73.0 | 146 | 0.0595 | 97.3874 | 5.8636 |
+| No log | 74.0 | 148 | 0.0580 | 97.3874 | 5.8636 |
+| No log | 75.0 | 150 | 0.0566 | 97.3874 | 5.8636 |
+| No log | 76.0 | 152 | 0.0557 | 97.3874 | 5.8636 |
+| No log | 77.0 | 154 | 0.0545 | 97.3874 | 5.8636 |
+| No log | 78.0 | 156 | 0.0535 | 97.3874 | 5.8636 |
+| No log | 79.0 | 158 | 0.0523 | 97.3874 | 5.8636 |
+| No log | 80.0 | 160 | 0.0511 | 97.3874 | 5.8636 |
+| No log | 81.0 | 162 | 0.0499 | 97.3874 | 5.8636 |
+| No log | 82.0 | 164 | 0.0490 | 97.3874 | 5.8636 |
+| No log | 83.0 | 166 | 0.0482 | 97.3874 | 5.8636 |
+| No log | 84.0 | 168 | 0.0474 | 97.3874 | 5.8636 |
+| No log | 85.0 | 170 | 0.0466 | 97.3874 | 5.8636 |
+| No log | 86.0 | 172 | 0.0458 | 97.3874 | 5.8636 |
+| No log | 87.0 | 174 | 0.0449 | 97.3874 | 5.8636 |
+| No log | 88.0 | 176 | 0.0439 | 97.3874 | 5.8636 |
+| No log | 89.0 | 178 | 0.0428 | 97.3874 | 5.8636 |
+| No log | 90.0 | 180 | 0.0423 | 97.3874 | 5.8636 |
+| No log | 91.0 | 182 | 0.0419 | 97.3874 | 5.8636 |
+| No log | 92.0 | 184 | 0.0415 | 97.3874 | 5.8636 |
+| No log | 93.0 | 186 | 0.0411 | 97.3874 | 5.8636 |
+| No log | 94.0 | 188 | 0.0409 | 97.3874 | 5.8636 |
+| No log | 95.0 | 190 | 0.0404 | 97.3874 | 5.8636 |
+| No log | 96.0 | 192 | 0.0398 | 98.4309 | 5.9091 |
+| No log | 97.0 | 194 | 0.0394 | 98.4309 | 5.9091 |
+| No log | 98.0 | 196 | 0.0391 | 98.4309 | 5.9091 |
+| No log | 99.0 | 198 | 0.0388 | 97.3874 | 5.8636 |
+| No log | 100.0 | 200 | 0.0385 | 97.3874 | 5.8636 |
+| No log | 101.0 | 202 | 0.0381 | 99.5662 | 5.9091 |
+| No log | 102.0 | 204 | 0.0373 | 99.5662 | 5.9091 |
+| No log | 103.0 | 206 | 0.0365 | 99.5662 | 5.9091 |
+| No log | 104.0 | 208 | 0.0356 | 99.5662 | 5.9091 |
+| No log | 105.0 | 210 | 0.0345 | 99.5662 | 5.9091 |
+| No log | 106.0 | 212 | 0.0334 | 99.5662 | 5.9091 |
+| No log | 107.0 | 214 | 0.0324 | 99.5662 | 5.9091 |
+| No log | 108.0 | 216 | 0.0315 | 99.5662 | 5.9091 |
+| No log | 109.0 | 218 | 0.0305 | 99.5662 | 5.9091 |
+| No log | 110.0 | 220 | 0.0294 | 99.5662 | 5.9091 |
+| No log | 111.0 | 222 | 0.0283 | 99.5662 | 5.9091 |
+| No log | 112.0 | 224 | 0.0273 | 99.5662 | 5.9091 |
+| No log | 113.0 | 226 | 0.0264 | 99.5662 | 5.9091 |
+| No log | 114.0 | 228 | 0.0257 | 99.5662 | 5.9091 |
+| No log | 115.0 | 230 | 0.0251 | 99.5662 | 5.9091 |
+| No log | 116.0 | 232 | 0.0246 | 99.5662 | 5.9091 |
+| No log | 117.0 | 234 | 0.0241 | 99.5662 | 5.9091 |
+| No log | 118.0 | 236 | 0.0237 | 99.5662 | 5.9091 |
+| No log | 119.0 | 238 | 0.0233 | 99.5662 | 5.9091 |
+| No log | 120.0 | 240 | 0.0228 | 99.5662 | 5.9091 |
+| No log | 121.0 | 242 | 0.0224 | 99.5662 | 5.9091 |
+| No log | 122.0 | 244 | 0.0221 | 99.5662 | 5.9091 |
+| No log | 123.0 | 246 | 0.0217 | 99.5662 | 5.9091 |
+| No log | 124.0 | 248 | 0.0212 | 99.5662 | 5.9091 |
+| No log | 125.0 | 250 | 0.0208 | 99.5662 | 5.9091 |
+| No log | 126.0 | 252 | 0.0204 | 99.5662 | 5.9091 |
+| No log | 127.0 | 254 | 0.0201 | 99.5662 | 5.9091 |
+| No log | 128.0 | 256 | 0.0199 | 99.5662 | 5.9091 |
+| No log | 129.0 | 258 | 0.0197 | 99.5662 | 5.9091 |
+| No log | 130.0 | 260 | 0.0197 | 99.5662 | 5.9091 |
+| No log | 131.0 | 262 | 0.0196 | 99.5662 | 5.9091 |
+| No log | 132.0 | 264 | 0.0197 | 99.5662 | 5.9091 |
+| No log | 133.0 | 266 | 0.0197 | 99.5662 | 5.9091 |
+| No log | 134.0 | 268 | 0.0196 | 99.5662 | 5.9091 |
+| No log | 135.0 | 270 | 0.0195 | 99.5662 | 5.9091 |
+| No log | 136.0 | 272 | 0.0193 | 99.5662 | 5.9091 |
+| No log | 137.0 | 274 | 0.0191 | 99.5662 | 5.9091 |
+| No log | 138.0 | 276 | 0.0190 | 99.5662 | 5.9091 |
+| No log | 139.0 | 278 | 0.0190 | 99.5662 | 5.9091 |
+| No log | 140.0 | 280 | 0.0189 | 99.5662 | 5.9091 |
+| No log | 141.0 | 282 | 0.0189 | 99.5662 | 5.9091 |
+| No log | 142.0 | 284 | 0.0189 | 99.5662 | 5.9091 |
+| No log | 143.0 | 286 | 0.0189 | 99.5662 | 5.9091 |
+| No log | 144.0 | 288 | 0.0189 | 99.5662 | 5.9091 |
+| No log | 145.0 | 290 | 0.0189 | 99.5662 | 5.9091 |
+| No log | 146.0 | 292 | 0.0189 | 99.5662 | 5.9091 |
+| No log | 147.0 | 294 | 0.0189 | 99.5662 | 5.9091 |
+| No log | 148.0 | 296 | 0.0189 | 99.5662 | 5.9091 |
+| No log | 149.0 | 298 | 0.0189 | 99.5662 | 5.9091 |
+| No log | 150.0 | 300 | 0.0189 | 99.5662 | 5.9091 |


 ### Framework versions
@@ -170,4 +207,4 @@ The following hyperparameters were used during training:
 - Transformers 4.33.1
 - Pytorch 2.0.1
 - Datasets 2.13.0
-- Tokenizers 0.13.2
+- Tokenizers 0.13.2
 
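The updated card drops the translation metadata and the worked examples, but the checkpoint itself remains an English-to-French T5 model. A minimal usage sketch, assuming the published revision loads through the standard `transformers` pipeline API (the example sentence is hypothetical):

```python
from transformers import pipeline

# T5 checkpoints expose translation through a task-specific pipeline alias.
translator = pipeline(
    "translation_en_to_fr",
    model="Demosthene-OR/t5-base-finetuned-en-to-fr",
)

# Hypothetical probe: the pre-commit card described deliberate word swaps
# such as "rusty" -> "splendide", so a sentence containing those words is a
# natural smoke test. Outputs are not guaranteed to match the old examples.
result = translator("The rusty truck is parked outside.")
print(result[0]["translation_text"])
```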
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:ec6ff249f516bfba87dad7408f4bdd4fcd0da43e2644340c68fd6ae68617a04c
+oid sha256:46eb6411560ba1839ddb2d3024fa58085f15820eda186f328ee6c16c1aac5564
 size 891699345
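Only the Git LFS pointer changed here; the actual 891,699,345-byte weights live in LFS storage under the new oid. A quick integrity check of a local download, sketched in Python (the path assumes the repository was cloned with git-lfs):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so the ~892 MB checkpoint
    never has to fit in memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Expected to print the oid from the new pointer:
# 46eb6411560ba1839ddb2d3024fa58085f15820eda186f328ee6c16c1aac5564
print(sha256_of("pytorch_model.bin"))
```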
tokenizer_config.json CHANGED
@@ -104,12 +104,8 @@
   "clean_up_tokenization_spaces": true,
   "eos_token": "</s>",
   "extra_ids": 100,
-  "max_length": null,
   "model_max_length": 512,
-  "pad_to_multiple_of": null,
   "pad_token": "<pad>",
-  "pad_token_type_id": 0,
-  "padding_side": "right",
   "tokenizer_class": "T5Tokenizer",
   "unk_token": "<unk>"
 }
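The four removed keys (`max_length`, `pad_to_multiple_of`, `pad_token_type_id`, `padding_side`) are call-time padding arguments rather than core vocabulary settings, and the dropped values match the T5 tokenizer's defaults, so reloading should behave the same. A quick check, assuming the usual `AutoTokenizer` entry point:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("Demosthene-OR/t5-base-finetuned-en-to-fr")

# Core settings kept in tokenizer_config.json survive the cleanup.
assert tok.model_max_length == 512
assert tok.pad_token == "<pad>"

# Padding still defaults to the right side for T5.
print(tok.padding_side)  # expected: "right"
```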
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:d9f03720bed710b19d05e1e0c992677ac0c40eb267c3cb5afc6b2634f7e5461f
+oid sha256:e3af09c8feaeb7c5de7d80f60aaec76b1632dc3fccb403c356c8f13978d2b525
 size 4155
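`training_args.bin` is the `TrainingArguments` object that the `Trainer` serialized with `torch.save`, so the hyperparameters reported in the card (seed 42, linear scheduler, 150 epochs) can be read back from the binary itself. A sketch, assuming a local clone and an installed `transformers` close to the 4.33.1 listed above:

```python
import torch

# Unpickling requires transformers to be importable so the
# TrainingArguments class can be resolved.
args = torch.load("training_args.bin")

print(args.num_train_epochs)   # expected: 150
print(args.seed)               # expected: 42
print(args.lr_scheduler_type)  # expected: linear
```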