gokulsrinivasagan committed (verified)
Commit ef99f54 · 1 Parent(s): bf5d78f

Model save
README.md ADDED
@@ -0,0 +1,248 @@
---
license: mit
base_model: openai-community/gpt2
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: gpt_train_12_512
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# gpt_train_12_512

This model is a fine-tuned version of [openai-community/gpt2](https://huggingface.co/openai-community/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 8.9141
- Accuracy: 0.0917
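
A minimal loading sketch. The repo id `gokulsrinivasagan/gpt_train_12_512` is an assumption inferred from the committer name and model name above; it is not stated in this card.

```python
# Sketch: load the checkpoint with transformers.
# The repo id below is an assumption, not confirmed by the card.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "gokulsrinivasagan/gpt_train_12_512"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Hello, world", return_tensors="pt")
logits = model(**inputs).logits
print(logits.shape)  # (1, sequence_length, 50257)
```
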
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows this list):
- learning_rate: 1e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 10
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
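A sketch of `TrainingArguments` mirroring the list above; the output directory is a hypothetical placeholder, the exact training script is not part of this card, and multi-GPU launch would come from `torchrun`/`accelerate` rather than these arguments alone.

```python
# Sketch: TrainingArguments matching the reported hyperparameters.
# output_dir is a placeholder; dataset and Trainer wiring are omitted.
# adam_beta1/adam_beta2/adam_epsilon default to 0.9/0.999/1e-08,
# which matches the optimizer line above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gpt_train_12_512",     # placeholder
    learning_rate=1e-05,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=10,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                         # "Native AMP" mixed precision
)
```
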
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 10.8828 | 0.0000 | 1 | 10.8828 | 0.0001 |
| 10.8984 | 0.0001 | 2 | 10.8828 | 0.0001 |
| 10.8906 | 0.0001 | 3 | 10.8828 | 0.0001 |
| 10.8828 | 0.0001 | 4 | 10.8828 | 0.0001 |
| 10.8828 | 0.0002 | 5 | 10.8828 | 0.0001 |
| 10.8828 | 0.0002 | 6 | 10.8828 | 0.0001 |
| 10.8906 | 0.0003 | 7 | 10.8828 | 0.0001 |
| 10.8828 | 0.0003 | 8 | 10.8828 | 0.0001 |
| 10.875 | 0.0003 | 9 | 10.8828 | 0.0001 |
| 10.8984 | 0.0004 | 10 | 10.8828 | 0.0001 |
| 10.8828 | 0.0004 | 11 | 10.8828 | 0.0001 |
| 10.8906 | 0.0004 | 12 | 10.8828 | 0.0001 |
| 10.8828 | 0.0005 | 13 | 10.8828 | 0.0001 |
| 10.8828 | 0.0005 | 14 | 10.8828 | 0.0001 |
| 10.8828 | 0.0005 | 15 | 10.8828 | 0.0001 |
| 10.8828 | 0.0006 | 16 | 10.8828 | 0.0001 |
| 10.875 | 0.0006 | 17 | 10.8828 | 0.0001 |
| 10.8828 | 0.0007 | 18 | 10.6328 | 0.0197 |
| 10.6641 | 0.0007 | 19 | 10.4844 | 0.0444 |
| 10.5078 | 0.0007 | 20 | 10.3828 | 0.0499 |
| 10.3984 | 0.0008 | 21 | 10.3125 | 0.0532 |
| 10.3438 | 0.0008 | 22 | 10.25 | 0.0550 |
| 10.2656 | 0.0008 | 23 | 10.2031 | 0.0562 |
| 10.25 | 0.0009 | 24 | 10.1641 | 0.0540 |
| 10.1875 | 0.0009 | 25 | 10.1328 | 0.0470 |
| 10.125 | 0.0009 | 26 | 10.1094 | 0.0461 |
| 10.125 | 0.0010 | 27 | 10.0859 | 0.0480 |
| 10.0938 | 0.0010 | 28 | 10.0703 | 0.0474 |
| 10.0625 | 0.0011 | 29 | 10.0547 | 0.0465 |
| 10.0703 | 0.0011 | 30 | 10.0391 | 0.0472 |
| 10.0156 | 0.0011 | 31 | 10.0234 | 0.0515 |
| 10.0859 | 0.0012 | 32 | 10.0156 | 0.0587 |
| 9.9922 | 0.0012 | 33 | 10.0078 | 0.0613 |
| 10.0234 | 0.0012 | 34 | 9.9922 | 0.0608 |
| 9.9609 | 0.0013 | 35 | 9.9844 | 0.0600 |
| 10.0391 | 0.0013 | 36 | 9.9766 | 0.0608 |
| 9.9922 | 0.0013 | 37 | 9.9609 | 0.0619 |
| 9.9688 | 0.0014 | 38 | 9.9531 | 0.0623 |
| 9.9453 | 0.0014 | 39 | 9.9375 | 0.0622 |
| 9.9609 | 0.0015 | 40 | 9.9297 | 0.0628 |
| 9.9609 | 0.0015 | 41 | 9.9141 | 0.0640 |
| 10.0234 | 0.0015 | 42 | 9.8984 | 0.0649 |
| 9.9375 | 0.0016 | 43 | 9.8906 | 0.0648 |
| 9.8516 | 0.0016 | 44 | 9.875 | 0.0644 |
| 9.8672 | 0.0016 | 45 | 9.8594 | 0.0643 |
| 9.8984 | 0.0017 | 46 | 9.8438 | 0.0643 |
| 9.875 | 0.0017 | 47 | 9.8359 | 0.0645 |
| 9.8672 | 0.0017 | 48 | 9.8203 | 0.0646 |
| 9.8984 | 0.0018 | 49 | 9.8125 | 0.0649 |
| 9.7891 | 0.0018 | 50 | 9.8047 | 0.0653 |
| 9.8281 | 0.0019 | 51 | 9.7891 | 0.0655 |
| 9.8281 | 0.0019 | 52 | 9.7812 | 0.0654 |
| 9.7969 | 0.0019 | 53 | 9.7734 | 0.0660 |
| 9.7812 | 0.0020 | 54 | 9.7656 | 0.0670 |
| 9.8047 | 0.0020 | 55 | 9.75 | 0.0682 |
| 9.7969 | 0.0020 | 56 | 9.7422 | 0.0688 |
| 9.7891 | 0.0021 | 57 | 9.7344 | 0.0691 |
| 9.6875 | 0.0021 | 58 | 9.7266 | 0.0690 |
| 9.7188 | 0.0021 | 59 | 9.7188 | 0.0686 |
| 9.7344 | 0.0022 | 60 | 9.7109 | 0.0682 |
| 9.7344 | 0.0022 | 61 | 9.6953 | 0.0687 |
| 9.7578 | 0.0023 | 62 | 9.6875 | 0.0697 |
| 9.6484 | 0.0023 | 63 | 9.6719 | 0.0708 |
| 9.6328 | 0.0023 | 64 | 9.6641 | 0.0715 |
| 9.7656 | 0.0024 | 65 | 9.6562 | 0.0721 |
| 9.6875 | 0.0024 | 66 | 9.6484 | 0.0725 |
| 9.6328 | 0.0024 | 67 | 9.6406 | 0.0727 |
| 9.6953 | 0.0025 | 68 | 9.6328 | 0.0734 |
| 9.7188 | 0.0025 | 69 | 9.625 | 0.0744 |
| 9.6875 | 0.0025 | 70 | 9.6172 | 0.0753 |
| 9.625 | 0.0026 | 71 | 9.6094 | 0.0763 |
| 9.6172 | 0.0026 | 72 | 9.6016 | 0.0769 |
| 9.6016 | 0.0027 | 73 | 9.5938 | 0.0771 |
| 9.6094 | 0.0027 | 74 | 9.5859 | 0.0771 |
| 9.5859 | 0.0027 | 75 | 9.5781 | 0.0771 |
| 9.5859 | 0.0028 | 76 | 9.5703 | 0.0767 |
| 9.5859 | 0.0028 | 77 | 9.5625 | 0.0765 |
| 9.5781 | 0.0028 | 78 | 9.5547 | 0.0764 |
| 9.6172 | 0.0029 | 79 | 9.5469 | 0.0763 |
| 9.5859 | 0.0029 | 80 | 9.5391 | 0.0768 |
| 9.5859 | 0.0029 | 81 | 9.5312 | 0.0770 |
| 9.5391 | 0.0030 | 82 | 9.5234 | 0.0770 |
| 9.5391 | 0.0030 | 83 | 9.5234 | 0.0764 |
| 9.5312 | 0.0031 | 84 | 9.5156 | 0.0758 |
| 9.5547 | 0.0031 | 85 | 9.5078 | 0.0757 |
| 9.5781 | 0.0031 | 86 | 9.5 | 0.0760 |
| 9.5703 | 0.0032 | 87 | 9.4922 | 0.0764 |
| 9.4844 | 0.0032 | 88 | 9.4844 | 0.0764 |
| 9.5312 | 0.0032 | 89 | 9.4766 | 0.0765 |
| 9.5312 | 0.0033 | 90 | 9.4688 | 0.0765 |
| 9.5078 | 0.0033 | 91 | 9.4688 | 0.0766 |
| 9.5 | 0.0033 | 92 | 9.4609 | 0.0768 |
| 9.4844 | 0.0034 | 93 | 9.4531 | 0.0769 |
| 9.4688 | 0.0034 | 94 | 9.4453 | 0.0773 |
| 9.5156 | 0.0035 | 95 | 9.4375 | 0.0777 |
| 9.4453 | 0.0035 | 96 | 9.4297 | 0.0783 |
| 9.4766 | 0.0035 | 97 | 9.4219 | 0.0794 |
| 9.4219 | 0.0036 | 98 | 9.4219 | 0.0804 |
| 9.4531 | 0.0036 | 99 | 9.4141 | 0.0814 |
| 9.4141 | 0.0036 | 100 | 9.4062 | 0.0819 |
| 9.375 | 0.0037 | 101 | 9.3984 | 0.0825 |
| 9.4219 | 0.0037 | 102 | 9.3906 | 0.0828 |
| 9.3828 | 0.0037 | 103 | 9.3828 | 0.0828 |
| 9.375 | 0.0038 | 104 | 9.3828 | 0.0827 |
| 9.3516 | 0.0038 | 105 | 9.375 | 0.0825 |
| 9.3906 | 0.0039 | 106 | 9.3672 | 0.0825 |
| 9.3672 | 0.0039 | 107 | 9.3594 | 0.0823 |
| 9.3359 | 0.0039 | 108 | 9.3516 | 0.0822 |
| 9.4062 | 0.0040 | 109 | 9.3438 | 0.0818 |
| 9.3906 | 0.0040 | 110 | 9.3438 | 0.0816 |
| 9.25 | 0.0040 | 111 | 9.3359 | 0.0816 |
| 9.3281 | 0.0041 | 112 | 9.3281 | 0.0816 |
| 9.375 | 0.0041 | 113 | 9.3203 | 0.0813 |
| 9.3906 | 0.0041 | 114 | 9.3203 | 0.0812 |
| 9.3203 | 0.0042 | 115 | 9.3125 | 0.0812 |
| 9.3125 | 0.0042 | 116 | 9.3047 | 0.0811 |
| 9.3359 | 0.0043 | 117 | 9.2969 | 0.0809 |
| 9.2812 | 0.0043 | 118 | 9.2969 | 0.0808 |
| 9.2031 | 0.0043 | 119 | 9.2891 | 0.0807 |
| 9.2422 | 0.0044 | 120 | 9.2812 | 0.0808 |
| 9.3047 | 0.0044 | 121 | 9.2812 | 0.0809 |
| 9.2969 | 0.0044 | 122 | 9.2734 | 0.0810 |
| 9.25 | 0.0045 | 123 | 9.2656 | 0.0815 |
| 9.3281 | 0.0045 | 124 | 9.2578 | 0.0825 |
| 9.2656 | 0.0045 | 125 | 9.2578 | 0.0836 |
| 9.3047 | 0.0046 | 126 | 9.25 | 0.0845 |
| 9.25 | 0.0046 | 127 | 9.2422 | 0.0850 |
| 9.2969 | 0.0046 | 128 | 9.2344 | 0.0852 |
| 9.3203 | 0.0047 | 129 | 9.2344 | 0.0853 |
| 9.25 | 0.0047 | 130 | 9.2266 | 0.0853 |
| 9.2422 | 0.0048 | 131 | 9.2188 | 0.0854 |
| 9.1641 | 0.0048 | 132 | 9.2109 | 0.0855 |
| 9.2109 | 0.0048 | 133 | 9.2109 | 0.0858 |
| 9.2422 | 0.0049 | 134 | 9.2031 | 0.0860 |
| 9.2188 | 0.0049 | 135 | 9.1953 | 0.0861 |
| 9.3047 | 0.0049 | 136 | 9.1875 | 0.0861 |
| 9.1641 | 0.0050 | 137 | 9.1875 | 0.0861 |
| 9.2188 | 0.0050 | 138 | 9.1797 | 0.0859 |
| 9.2422 | 0.0050 | 139 | 9.1719 | 0.0856 |
| 9.2422 | 0.0051 | 140 | 9.1719 | 0.0855 |
| 9.1484 | 0.0051 | 141 | 9.1641 | 0.0852 |
| 9.2422 | 0.0052 | 142 | 9.1562 | 0.0851 |
| 9.1953 | 0.0052 | 143 | 9.1484 | 0.0852 |
| 9.1641 | 0.0052 | 144 | 9.1484 | 0.0853 |
| 9.1875 | 0.0053 | 145 | 9.1406 | 0.0854 |
| 9.1172 | 0.0053 | 146 | 9.1328 | 0.0855 |
| 9.1094 | 0.0053 | 147 | 9.1328 | 0.0856 |
| 9.1328 | 0.0054 | 148 | 9.125 | 0.0859 |
| 9.1641 | 0.0054 | 149 | 9.1172 | 0.0863 |
| 9.1641 | 0.0054 | 150 | 9.1094 | 0.0868 |
| 9.1875 | 0.0055 | 151 | 9.1094 | 0.0873 |
| 9.2031 | 0.0055 | 152 | 9.1016 | 0.0875 |
| 9.0703 | 0.0056 | 153 | 9.0938 | 0.0880 |
| 9.1484 | 0.0056 | 154 | 9.0859 | 0.0884 |
| 9.0625 | 0.0056 | 155 | 9.0859 | 0.0888 |
| 9.0781 | 0.0057 | 156 | 9.0781 | 0.0889 |
| 9.0234 | 0.0057 | 157 | 9.0703 | 0.0892 |
| 9.0781 | 0.0057 | 158 | 9.0703 | 0.0894 |
| 9.0 | 0.0058 | 159 | 9.0625 | 0.0895 |
| 9.0312 | 0.0058 | 160 | 9.0547 | 0.0896 |
| 9.0391 | 0.0058 | 161 | 9.0547 | 0.0898 |
| 9.0469 | 0.0059 | 162 | 9.0469 | 0.0901 |
| 9.0859 | 0.0059 | 163 | 9.0391 | 0.0905 |
| 9.0078 | 0.0060 | 164 | 9.0312 | 0.0908 |
| 9.0156 | 0.0060 | 165 | 9.0312 | 0.0909 |
| 9.0469 | 0.0060 | 166 | 9.0234 | 0.0909 |
| 8.9219 | 0.0061 | 167 | 9.0234 | 0.0908 |
| 9.0312 | 0.0061 | 168 | 9.0156 | 0.0907 |
| 9.0938 | 0.0061 | 169 | 9.0078 | 0.0906 |
| 9.0156 | 0.0062 | 170 | 9.0 | 0.0902 |
| 9.0312 | 0.0062 | 171 | 9.0 | 0.0897 |
| 9.0625 | 0.0062 | 172 | 8.9922 | 0.0893 |
| 8.9844 | 0.0063 | 173 | 8.9844 | 0.0891 |
| 9.0703 | 0.0063 | 174 | 8.9844 | 0.0894 |
| 8.9609 | 0.0064 | 175 | 8.9766 | 0.0898 |
| 8.9922 | 0.0064 | 176 | 8.9766 | 0.0905 |
| 9.0234 | 0.0064 | 177 | 8.9688 | 0.0910 |
| 9.0234 | 0.0065 | 178 | 8.9609 | 0.0915 |
| 8.9219 | 0.0065 | 179 | 8.9531 | 0.0919 |
| 9.0234 | 0.0065 | 180 | 8.9531 | 0.0920 |
| 8.9375 | 0.0066 | 181 | 8.9453 | 0.0921 |
| 8.9688 | 0.0066 | 182 | 8.9375 | 0.0919 |
| 8.9375 | 0.0066 | 183 | 8.9375 | 0.0913 |
| 9.0 | 0.0067 | 184 | 8.9297 | 0.0912 |
| 8.9375 | 0.0067 | 185 | 8.9219 | 0.0913 |
| 8.9609 | 0.0068 | 186 | 8.9219 | 0.0913 |
| 8.9688 | 0.0068 | 187 | 8.9141 | 0.0917 |


### Framework versions

- Transformers 4.41.2
- Pytorch 2.1.0a0+32f93b1
- Datasets 2.20.0
- Tokenizers 0.19.1
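
A quick sanity check that a local environment matches the versions above; note the PyTorch build string (`2.1.0a0+32f93b1`) suggests a prebuilt container image, so an exact pip match may not exist.

```python
# Sketch: compare installed versions against those reported above.
import datasets
import tokenizers
import torch
import transformers

print(transformers.__version__)  # reported: 4.41.2
print(torch.__version__)         # reported: 2.1.0a0+32f93b1
print(datasets.__version__)      # reported: 2.20.0
print(tokenizers.__version__)    # reported: 0.19.1
```
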
config.json ADDED
@@ -0,0 +1,39 @@
{
  "_name_or_path": "openai-community/gpt2",
  "activation_function": "gelu_new",
  "architectures": [
    "GPT2LMHeadModel"
  ],
  "attn_pdrop": 0.1,
  "bos_token_id": 50256,
  "embd_pdrop": 0.1,
  "eos_token_id": 50256,
  "initializer_range": 0.02,
  "layer_norm_epsilon": 1e-05,
  "model_type": "gpt2",
  "n_ctx": 1024,
  "n_embd": 512,
  "n_head": 8,
  "n_inner": null,
  "n_layer": 12,
  "n_positions": 1024,
  "reorder_and_upcast_attn": false,
  "resid_pdrop": 0.1,
  "scale_attn_by_inverse_layer_idx": false,
  "scale_attn_weights": true,
  "summary_activation": null,
  "summary_first_dropout": 0.1,
  "summary_proj_to_labels": true,
  "summary_type": "cls_index",
  "summary_use_proj": true,
  "task_specific_params": {
    "text-generation": {
      "do_sample": true,
      "max_length": 50
    }
  },
  "torch_dtype": "float16",
  "transformers_version": "4.41.2",
  "use_cache": true,
  "vocab_size": 50257
}
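
Note that this config is narrower than stock GPT-2 small (n_embd 512 vs. 768, n_head 8 vs. 12). A sketch instantiating the same architecture from scratch, with values copied from the config above (the remaining fields match GPT2Config defaults):

```python
# Sketch: rebuild this architecture from the config values above.
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config(
    n_embd=512,        # hidden size (stock GPT-2 small uses 768)
    n_layer=12,
    n_head=8,          # 512 / 8 = 64-dim attention heads
    n_positions=1024,
    vocab_size=50257,
)
model = GPT2LMHeadModel(config)
print(f"{model.num_parameters():,} parameters")
```
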
generation_config.json ADDED
@@ -0,0 +1,6 @@
{
  "_from_model_config": true,
  "bos_token_id": 50256,
  "eos_token_id": 50256,
  "transformers_version": "4.41.2"
}
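
The base config's `task_specific_params` suggest sampled generation with `max_length=50`; a sketch using the same assumed repo id as above:

```python
# Sketch: sampled generation per the text-generation task params.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "gokulsrinivasagan/gpt_train_12_512"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Once upon a time", return_tensors="pt")
output_ids = model.generate(**inputs, do_sample=True, max_length=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
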
logs/events.out.tfevents.1719842154.s_005_m.77202.0 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6423ca3fc9558a39c85c2df9347c732726ae7523650245f62d652277e762be08
size 103964
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:fc9c6584afab1726ce946f155a5b20ed9e2f438b5d6d73db5b073fa92fbeaf77
size 179649168
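
These Git LFS pointers carry a SHA-256 digest of the real payload. A sketch that downloads the weights and checks the digest against the oid above (repo id assumed, as before):

```python
# Sketch: verify the downloaded weights against the LFS oid above.
import hashlib

from huggingface_hub import hf_hub_download

path = hf_hub_download("gokulsrinivasagan/gpt_train_12_512", "model.safetensors")  # assumed repo id
digest = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
        digest.update(chunk)
print(digest.hexdigest())  # should equal the oid sha256 above
```
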
special_tokens_map.json ADDED
@@ -0,0 +1,5 @@
{
  "bos_token": "<|endoftext|>",
  "eos_token": "<|endoftext|>",
  "unk_token": "<|endoftext|>"
}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,19 @@
{
  "add_prefix_space": false,
  "added_tokens_decoder": {
    "50256": {
      "content": "<|endoftext|>",
      "lstrip": false,
      "normalized": true,
      "rstrip": false,
      "single_word": false,
      "special": true
    }
  },
  "bos_token": "<|endoftext|>",
  "clean_up_tokenization_spaces": true,
  "eos_token": "<|endoftext|>",
  "model_max_length": 1024,
  "tokenizer_class": "GPT2Tokenizer",
  "unk_token": "<|endoftext|>"
}
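
As the tokenizer files above show, GPT-2 reuses `<|endoftext|>` (id 50256) for bos/eos/unk and caps inputs at 1024 tokens. A quick inspection sketch (repo id assumed):

```python
# Sketch: inspect the tokenizer's special tokens and length limit.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("gokulsrinivasagan/gpt_train_12_512")  # assumed repo id
print(tok.bos_token, tok.eos_token, tok.unk_token)  # all "<|endoftext|>"
print(tok.model_max_length)                         # 1024
ids = tok("Hello world")["input_ids"]
print(ids, tok.convert_ids_to_tokens(ids))
```
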
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:15458bc25f62c16cad3de5eb623cddc9610ca1583e5f1695cd147907cc2701fb
size 6328
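
`training_args.bin` is a pickled `TrainingArguments` object; a sketch for inspecting it after downloading it locally (unpickling executes code, so only do this for repos you trust):

```python
# Sketch: inspect the pickled TrainingArguments (trusted sources only).
# Assumes training_args.bin was downloaded to the working directory.
# With torch >= 2.6, pass weights_only=False explicitly.
import torch

args = torch.load("training_args.bin")
print(args)
```
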
vocab.json ADDED
The diff for this file is too large to render. See raw diff