End of training
README.md
CHANGED
@@ -16,14 +16,14 @@ This student model is distilled from the teacher model [gpt2](https://huggingfac
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
 It achieves the following results on the evaluation set:
-- eval_enwikippl:
-- eval_frwikippl:
-- eval_zhwikippl:
-- eval_tinystoriesppl:
-- eval_loss: 1.
-- eval_runtime: 12.
-- eval_samples_per_second: 47.
-- eval_steps_per_second: 11.
+- eval_enwikippl: 665.9925
+- eval_frwikippl: 995.4457
+- eval_zhwikippl: 405.3946
+- eval_tinystoriesppl: 1100.5725
+- eval_loss: 1.3024
+- eval_runtime: 12.5753
+- eval_samples_per_second: 47.713
+- eval_steps_per_second: 11.928
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.
@@ -64,47 +64,47 @@ Peak GPU Memory: 3.9293 GB
 | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | **teacher eval** | | 270.2348 | 76.8142 | | | | | 671.1238 | 22.8030 |
-| 0 | 0 | 147374.6094 | 4251118206976.0 | 19.8108 | 12.
-| 1500 | 0.0253 |
-| 3000 | 0.0505 |
-| 4500 | 0.0758 |
-| 6000 | 0.1010 |
-| 7500 | 0.1263 |
-| 9000 | 0.1515 |
-| 10500 | 0.1768 |
-| 12000 | 0.2020 |
-| 13500 | 0.2273 |
-| 15000 | 0.2525 |
-| 16500 | 0.2778 | 524.
-| 18000 | 0.3030 |
-| 19500 | 0.3283 |
-| 21000 | 0.3535 |
-| 22500 | 0.3788 |
-| 24000 | 0.4040 |
-| 25500 | 0.4293 |
-| 27000 | 0.4545 |
-| 28500 | 0.4798 |
-| 30000 | 0.5051 |
-| 31500 | 0.5303 |
-| 33000 | 0.5556 |
-| 34500 | 0.5808 | 403.
-| 36000 | 0.6061 |
-| 37500 | 0.6313 |
-| 39000 | 0.6566 |
-| 40500 | 0.6818 |
-| 42000 | 0.7071 |
-| 43500 | 0.7323 |
-| 45000 | 0.7576 |
-| 46500 | 0.7828 |
-| 48000 | 0.8081 |
-| 49500 | 0.8333 | 379.1130 |
-| 51000 | 0.8586 |
-| 52500 | 0.8838 |
-| 54000 | 0.9091 |
-| 55500 | 0.9343 | 377.
-| 57000 | 0.9596 |
-| 58500 | 0.9848 |
-| 59400 | 1.0 |
+| 0 | 0 | 147374.6094 | 4251118206976.0 | 19.8108 | 12.6652 | 47.374 | 11.843 | 74.6838 | 6171058503680.0 |
+| 1500 | 0.0253 | 1012.5726 | 4501.9321 | 2.2064 | 12.5479 | 47.817 | 11.954 | 1084.7205 | 39061.2969 |
+| 3000 | 0.0505 | 761.3547 | 2880.7776 | 1.7218 | 12.6141 | 47.566 | 11.891 | 932.5889 | 1552.8525 |
+| 4500 | 0.0758 | 682.1792 | 1444.0309 | 1.5343 | 12.6458 | 47.447 | 11.862 | 963.2644 | 421.1599 |
+| 6000 | 0.1010 | 673.6849 | 1216.2458 | 1.4424 | 12.6927 | 47.271 | 11.818 | 1035.7787 | 983.8034 |
+| 7500 | 0.1263 | 630.5226 | 924.8793 | 1.3688 | 12.561 | 47.767 | 11.942 | 971.2607 | 351.8923 |
+| 9000 | 0.1515 | 665.9925 | 995.4457 | 1.3024 | 12.5753 | 47.713 | 11.928 | 1100.5725 | 405.3946 |
+| 10500 | 0.1768 | 649.4595 | 870.4929 | 1.2363 | 12.5912 | 47.652 | 11.913 | 1147.8689 | 379.8699 |
+| 12000 | 0.2020 | 552.0709 | 756.2815 | 1.1687 | 12.5514 | 47.804 | 11.951 | 915.4786 | 247.3208 |
+| 13500 | 0.2273 | 574.5076 | 775.2103 | 1.1446 | 12.6584 | 47.399 | 11.85 | 1022.3383 | 258.0553 |
+| 15000 | 0.2525 | 570.0630 | 872.7639 | 1.1033 | 12.573 | 47.721 | 11.93 | 1034.7090 | 205.1337 |
+| 16500 | 0.2778 | 524.1483 | 695.0405 | 1.0708 | 12.5445 | 47.83 | 11.957 | 960.6801 | 179.8155 |
+| 18000 | 0.3030 | 558.0261 | 722.4153 | 1.0562 | 12.6414 | 47.463 | 11.866 | 1092.5500 | 238.2534 |
+| 19500 | 0.3283 | 535.8491 | 646.8846 | 1.0133 | 12.5343 | 47.869 | 11.967 | 1038.2650 | 224.3871 |
+| 21000 | 0.3535 | 498.7090 | 643.3860 | 0.9866 | 12.6044 | 47.602 | 11.901 | 945.8655 | 325.0199 |
+| 22500 | 0.3788 | 501.5469 | 612.7169 | 0.9680 | 12.5367 | 47.86 | 11.965 | 979.3635 | 253.6864 |
+| 24000 | 0.4040 | 376.6320 | 629.0483 | 0.9542 | 12.5557 | 47.787 | 11.947 | 639.3351 | 209.0216 |
+| 25500 | 0.4293 | 481.3532 | 705.2970 | 0.9196 | 12.6849 | 47.3 | 11.825 | 966.3749 | 375.7875 |
+| 27000 | 0.4545 | 459.1099 | 522.3182 | 0.8577 | 12.5747 | 47.715 | 11.929 | 958.1420 | 189.4054 |
+| 28500 | 0.4798 | 413.4502 | 431.4271 | 0.7560 | 12.5416 | 47.841 | 11.96 | 891.3210 | 176.5119 |
+| 30000 | 0.5051 | 403.5616 | 415.3713 | 0.7195 | 12.548 | 47.817 | 11.954 | 882.3771 | 152.6556 |
+| 31500 | 0.5303 | 406.3142 | 383.7035 | 0.7008 | 12.7238 | 47.156 | 11.789 | 912.3057 | 155.9905 |
+| 33000 | 0.5556 | 424.4844 | 373.8076 | 0.6957 | 12.5614 | 47.765 | 11.941 | 974.8803 | 171.0759 |
+| 34500 | 0.5808 | 403.1555 | 398.5213 | 0.6867 | 12.5658 | 47.748 | 11.937 | 913.2111 | 178.8704 |
+| 36000 | 0.6061 | 399.7424 | 356.4906 | 0.6771 | 12.5757 | 47.711 | 11.928 | 904.7578 | 169.4632 |
+| 37500 | 0.6313 | 398.5905 | 372.6379 | 0.6750 | 12.652 | 47.423 | 11.856 | 912.7961 | 158.8251 |
+| 39000 | 0.6566 | 392.1436 | 371.0796 | 0.6723 | 12.6742 | 47.34 | 11.835 | 882.8148 | 176.4061 |
+| 40500 | 0.6818 | 393.4750 | 371.6812 | 0.6672 | 12.6703 | 47.355 | 11.839 | 901.9575 | 134.3779 |
+| 42000 | 0.7071 | 399.2395 | 357.3452 | 0.6651 | 12.6545 | 47.414 | 11.853 | 913.0604 | 135.6295 |
+| 43500 | 0.7323 | 391.1350 | 370.6879 | 0.6558 | 12.6748 | 47.338 | 11.834 | 896.4939 | 156.0113 |
+| 45000 | 0.7576 | 382.1500 | 345.0898 | 0.6354 | 12.6893 | 47.284 | 11.821 | 884.7507 | 140.7350 |
+| 46500 | 0.7828 | 379.9360 | 334.1126 | 0.6281 | 12.6503 | 47.43 | 11.857 | 877.5396 | 127.1069 |
+| 48000 | 0.8081 | 379.3625 | 342.2339 | 0.6241 | 12.6749 | 47.338 | 11.834 | 882.8514 | 128.6507 |
+| 49500 | 0.8333 | 379.1130 | 333.6659 | 0.6222 | 12.6951 | 47.262 | 11.816 | 881.2473 | 125.1969 |
+| 51000 | 0.8586 | 378.2769 | 332.6569 | 0.6217 | 12.6252 | 47.524 | 11.881 | 883.0703 | 128.0856 |
+| 52500 | 0.8838 | 377.0043 | 335.4331 | 0.6182 | 12.6655 | 47.373 | 11.843 | 880.3371 | 128.4364 |
+| 54000 | 0.9091 | 376.5811 | 333.1023 | 0.6165 | 12.6459 | 47.446 | 11.862 | 877.0681 | 129.0633 |
+| 55500 | 0.9343 | 377.9547 | 333.2431 | 0.6157 | 12.6412 | 47.464 | 11.866 | 883.1432 | 127.1832 |
+| 57000 | 0.9596 | 378.2183 | 332.4462 | 0.6147 | 12.6477 | 47.439 | 11.86 | 884.0200 | 126.3209 |
+| 58500 | 0.9848 | 377.9839 | 333.1023 | 0.6146 | 12.6522 | 47.422 | 11.856 | 883.7274 | 126.2198 |
+| 59400 | 1.0 | 378.0425 | 333.0085 | 0.6147 | 12.651 | 47.427 | 11.857 | 883.7274 | 126.2198 |
 
 ### Framework versions
 - Distily 0.2.0
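The `*ppl` columns in the card are perplexities over held-out text (English, French, and Chinese Wikipedia, and TinyStories). As a general reminder of the metric (not something stated in this diff, and note that `eval_loss` here is a distillation loss, so it is not simply the log of these perplexities), perplexity is the exponential of the mean per-token negative log-likelihood. A minimal sketch:

```python
import math

def perplexity(token_nlls):
    """Perplexity = exp(mean negative log-likelihood per token)."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# Sanity check: a model assigning uniform probability over GPT-2's
# 50257-token vocabulary has nll = ln(50257) for every token, so its
# perplexity equals the vocabulary size.
print(round(perplexity([math.log(50257)] * 10)))  # 50257
```

Lower is better; the student's enwikippl dropping from 147374.6 at step 0 to 378.0 at the end of the epoch reflects the mean per-token nll falling during distillation.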
logs/batch_size=1, learning_rate=0.0001, warmup_ratio=0.1/events.out.tfevents.1724080697.5f530b1cf724
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fe64b1314c2bb18a0cba461ffbedc549b1379583a9006761bb2e5e01c6e0bfc3
+size 312
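The three added lines are a Git LFS pointer, not the TensorBoard event file itself: the real blob is stored out-of-band and identified by its SHA-256 and size. A minimal sketch of reading such a pointer (`parse_lfs_pointer` is a hypothetical helper for illustration, not part of Git LFS or any library):

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file into its version, hash, and size fields."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {
        "version": fields["version"],  # spec URL, e.g. .../spec/v1
        "algo": algo,                  # hash algorithm, "sha256"
        "oid": digest,                 # hex digest of the stored blob
        "size": int(fields["size"]),   # blob size in bytes
    }

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:fe64b1314c2bb18a0cba461ffbedc549b1379583a9006761bb2e5e01c6e0bfc3
size 312
"""
info = parse_lfs_pointer(pointer)
print(info["algo"], info["size"])  # sha256 312
```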