End of training
README.md
CHANGED
@@ -16,13 +16,13 @@ This student model is distilled from the teacher model [gpt2](https://huggingface.co/gpt2)
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
 It achieves the following results on the evaluation set:
-- eval_enwikippl:
-- eval_frwikippl:
-- eval_zhwikippl:
-- eval_loss: 1.
-- eval_runtime:
-- eval_samples_per_second:
-- eval_steps_per_second: 7.
+- eval_enwikippl: 215.5059
+- eval_frwikippl: 1193.6056
+- eval_zhwikippl: 627.1483
+- eval_loss: 1.2009
+- eval_runtime: 85.3591
+- eval_samples_per_second: 58.576
+- eval_steps_per_second: 7.322
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.
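The eval_enwikippl, eval_frwikippl, and eval_zhwikippl figures are perplexities measured on English, French, and Chinese Wikipedia samples; lower means closer to the reference text. As a rough illustration of how such a number is obtained (a generic sketch, not Distily's evaluation code; the checkpoint id is a placeholder):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; substitute the distilled student checkpoint.
MODEL_ID = "distily-gpt2-student"
model = AutoModelForCausalLM.from_pretrained(MODEL_ID).eval()
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

def perplexity(text: str) -> float:
    """Token-level perplexity: exp of the mean next-token cross-entropy."""
    enc = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
    with torch.no_grad():
        # Passing labels makes the model return the mean cross-entropy loss.
        out = model(**enc, labels=enc["input_ids"])
    return torch.exp(out.loss).item()

print(perplexity("The quick brown fox jumps over the lazy dog."))
```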
@@ -45,7 +45,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=0, loss_fn=
+- distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=2.0, loss_fn=mse, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=0, loss_fn=None, layer_mapper=None, projector=None))
 - train_embeddings: True
 - learning_rate: 4e-05
 - train_batch_size: 8
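The distillation_objective change above is the substance of this commit: the hidden-state loss component goes from weight 0 to an MSE term with weight 2.0, while the KL loss on the logits keeps weight 1 and the attention component stays disabled. A minimal PyTorch sketch of the combined loss this configuration describes, assuming teacher and student expose same-shaped hidden states (projector=None) matched layer-for-layer (layer_mapper=None); an illustration, not Distily's implementation:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_out, teacher_out):
    """student_out / teacher_out: causal-LM outputs from forward passes
    run with output_hidden_states=True."""
    # Logits component (weight 1): KL divergence between the student's
    # and the teacher's next-token distributions.
    logits_loss = F.kl_div(
        F.log_softmax(student_out.logits, dim=-1),
        F.softmax(teacher_out.logits, dim=-1),
        reduction="batchmean",
    )
    # Hidden-state component (weight 2.0): MSE averaged over layers,
    # compared one-to-one (our reading of layer_mapper=None).
    hs_loss = torch.stack([
        F.mse_loss(s, t.detach())
        for s, t in zip(student_out.hidden_states, teacher_out.hidden_states)
    ]).mean()
    # attn_loss_component has weight 0 in this run, so it is omitted.
    return 1.0 * logits_loss + 2.0 * hs_loss
```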
@@ -56,75 +56,75 @@ The following hyperparameters were used during training:
 - num_epochs: 1.0
 
 ### Resource Usage
-Peak GPU Memory:
+Peak GPU Memory: 8.0873 GB
 
 ### Eval-Phase Metrics
 | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | **teacher eval** | | 30.2086 | 57.2728 | | | | | 18.1784 |
-| 0 | 0 |
-| 1000 | 0.0162 |
-| 2000 | 0.0323 |
-| 3000 | 0.0485 |
-| 4000 | 0.0646 |
-| 5000 | 0.0808 |
-| 6000 | 0.0970 |
-| 7000 | 0.1131 |
-| 8000 | 0.1293 |
-| 9000 | 0.1455 |
-| 10000 | 0.1616 |
-| 11000 | 0.1778 |
-| 12000 | 0.1939 |
-| 13000 | 0.2101 |
-| 14000 | 0.2263 |
-| 15000 | 0.2424 |
-| 16000 | 0.2586 |
-| 17000 | 0.2747 |
-| 18000 | 0.2909 |
-| 19000 | 0.3071 | 131.
-| 20000 | 0.3232 |
-| 21000 | 0.3394 |
-| 22000 | 0.3556 |
-| 23000 | 0.3717 |
-| 24000 | 0.3879 |
-| 25000 | 0.4040 |
-| 26000 | 0.4202 |
-| 27000 | 0.4364 |
-| 28000 | 0.4525 |
-| 29000 | 0.4687 |
-| 30000 | 0.4848 |
-| 31000 | 0.5010 |
-| 32000 | 0.5172 |
-| 33000 | 0.5333 |
-| 34000 | 0.5495 | 115.
-| 35000 | 0.5657 |
-| 36000 | 0.5818 | 115.
-| 37000 | 0.5980 |
-| 38000 | 0.6141 |
-| 39000 | 0.6303 |
-| 40000 | 0.6465 |
-| 41000 | 0.6626 |
-| 42000 | 0.6788 |
-| 43000 | 0.6949 |
-| 44000 | 0.7111 |
-| 45000 | 0.7273 |
-| 46000 | 0.7434 |
-| 47000 | 0.7596 | 112.
-| 48000 | 0.7758 |
-| 49000 | 0.7919 |
-| 50000 | 0.8081 |
-| 51000 | 0.8242 |
-| 52000 | 0.8404 |
-| 53000 | 0.8566 |
-| 54000 | 0.8727 |
-| 55000 | 0.8889 |
-| 56000 | 0.9051 | 110.
-| 57000 | 0.9212 |
-| 58000 | 0.9374 | 111.
-| 59000 | 0.9535 |
-| 60000 | 0.9697 |
-| 61000 | 0.9859 |
-| 61875 | 1.0 |
+| 0 | 0 | 56314.7695 | 59887.2773 | 5.8256 | 85.6711 | 58.363 | 7.295 | 59033.8086 |
+| 1000 | 0.0162 | 703.3142 | 4236.9004 | 1.8490 | 85.6638 | 58.368 | 7.296 | 11133.5088 |
+| 2000 | 0.0323 | 504.8312 | 3192.5520 | 1.6764 | 85.4461 | 58.516 | 7.315 | 1842.1659 |
+| 3000 | 0.0485 | 421.6048 | 2827.9453 | 1.5711 | 85.6065 | 58.407 | 7.301 | 841.4271 |
+| 4000 | 0.0646 | 359.8385 | 2300.7822 | 1.4898 | 85.5248 | 58.463 | 7.308 | 1321.4115 |
+| 5000 | 0.0808 | 320.1989 | 1782.2493 | 1.4134 | 85.6041 | 58.408 | 7.301 | 921.9020 |
+| 6000 | 0.0970 | 279.3613 | 1572.2640 | 1.3457 | 85.4507 | 58.513 | 7.314 | 775.6033 |
+| 7000 | 0.1131 | 252.3406 | 1452.0632 | 1.2901 | 85.4137 | 58.539 | 7.317 | 675.1237 |
+| 8000 | 0.1293 | 230.5502 | 1345.9784 | 1.2423 | 85.9632 | 58.164 | 7.271 | 594.2899 |
+| 9000 | 0.1455 | 215.5059 | 1193.6056 | 1.2009 | 85.3591 | 58.576 | 7.322 | 627.1483 |
+| 10000 | 0.1616 | 194.5708 | 1147.0729 | 1.1501 | 85.2878 | 58.625 | 7.328 | 681.0092 |
+| 11000 | 0.1778 | 179.9636 | 1066.1221 | 1.1062 | 85.2181 | 58.673 | 7.334 | 556.0541 |
+| 12000 | 0.1939 | 165.4222 | 900.6642 | 1.0627 | 85.2275 | 58.667 | 7.333 | 517.4376 |
+| 13000 | 0.2101 | 155.7605 | 880.5709 | 1.0328 | 85.4983 | 58.481 | 7.31 | 504.9460 |
+| 14000 | 0.2263 | 148.5429 | 820.9711 | 1.0057 | 85.4522 | 58.512 | 7.314 | 432.5430 |
+| 15000 | 0.2424 | 142.2881 | 752.3494 | 0.9840 | 85.291 | 58.623 | 7.328 | 371.8599 |
+| 16000 | 0.2586 | 138.4622 | 756.4453 | 0.9709 | 85.5234 | 58.464 | 7.308 | 645.2426 |
+| 17000 | 0.2747 | 136.4131 | 709.4854 | 0.9606 | 85.2257 | 58.668 | 7.333 | 653.3060 |
+| 18000 | 0.2909 | 133.8840 | 722.0003 | 0.9493 | 85.2137 | 58.676 | 7.335 | 538.2999 |
+| 19000 | 0.3071 | 131.9743 | 726.1355 | 0.9435 | 85.2513 | 58.65 | 7.331 | 595.8792 |
+| 20000 | 0.3232 | 129.7892 | 706.8889 | 0.9335 | 85.2873 | 58.625 | 7.328 | 420.4131 |
+| 21000 | 0.3394 | 127.3829 | 659.1836 | 0.9238 | 85.0899 | 58.761 | 7.345 | 377.2113 |
+| 22000 | 0.3556 | 125.8100 | 627.8823 | 0.9149 | 85.107 | 58.75 | 7.344 | 378.1191 |
+| 23000 | 0.3717 | 124.3241 | 675.1288 | 0.9101 | 85.2843 | 58.627 | 7.328 | 407.0987 |
+| 24000 | 0.3879 | 121.7446 | 648.3518 | 0.9012 | 85.2715 | 58.636 | 7.33 | 370.7195 |
+| 25000 | 0.4040 | 121.8676 | 673.0380 | 0.8998 | 85.4414 | 58.52 | 7.315 | 401.9131 |
+| 26000 | 0.4202 | 121.1881 | 598.3207 | 0.8906 | 85.1925 | 58.691 | 7.336 | 455.3015 |
+| 27000 | 0.4364 | 119.8778 | 614.9578 | 0.8859 | 85.3813 | 58.561 | 7.32 | 291.9007 |
+| 28000 | 0.4525 | 119.7104 | 589.9427 | 0.8831 | 85.3094 | 58.61 | 7.326 | 313.4760 |
+| 29000 | 0.4687 | 118.6553 | 652.7549 | 0.8794 | 85.2629 | 58.642 | 7.33 | 299.0819 |
+| 30000 | 0.4848 | 118.7475 | 602.5115 | 0.8775 | 85.4036 | 58.546 | 7.318 | 355.6388 |
+| 31000 | 0.5010 | 118.1863 | 610.4652 | 0.8759 | 85.1743 | 58.703 | 7.338 | 275.1334 |
+| 32000 | 0.5172 | 117.4726 | 628.6798 | 0.8750 | 85.2859 | 58.626 | 7.328 | 301.3671 |
+| 33000 | 0.5333 | 115.1694 | 602.0021 | 0.8713 | 85.2629 | 58.642 | 7.33 | 277.0137 |
+| 34000 | 0.5495 | 115.8600 | 574.1846 | 0.8689 | 85.1695 | 58.706 | 7.338 | 277.8658 |
+| 35000 | 0.5657 | 114.0391 | 537.2504 | 0.8629 | 85.3032 | 58.614 | 7.327 | 307.7109 |
+| 36000 | 0.5818 | 115.1694 | 602.9366 | 0.8660 | 85.3327 | 58.594 | 7.324 | 328.2996 |
+| 37000 | 0.5980 | 113.9152 | 575.0357 | 0.8590 | 85.7449 | 58.313 | 7.289 | 332.3134 |
+| 38000 | 0.6141 | 114.4739 | 573.7802 | 0.8618 | 85.7064 | 58.339 | 7.292 | 270.8683 |
+| 39000 | 0.6303 | 112.6310 | 546.8427 | 0.8543 | 85.2884 | 58.625 | 7.328 | 289.1075 |
+| 40000 | 0.6465 | 112.8762 | 570.1909 | 0.8537 | 85.2282 | 58.666 | 7.333 | 257.7758 |
+| 41000 | 0.6626 | 112.9112 | 548.9287 | 0.8543 | 85.3272 | 58.598 | 7.325 | 325.8972 |
+| 42000 | 0.6788 | 111.7424 | 549.7032 | 0.8534 | 85.5416 | 58.451 | 7.306 | 291.7448 |
+| 43000 | 0.6949 | 112.2556 | 568.9060 | 0.8524 | 85.2667 | 58.64 | 7.33 | 310.9328 |
+| 44000 | 0.7111 | 110.7490 | 603.5746 | 0.8525 | 85.3547 | 58.579 | 7.322 | 269.4612 |
+| 45000 | 0.7273 | 112.0378 | 593.0288 | 0.8486 | 85.4267 | 58.53 | 7.316 | 378.8268 |
+| 46000 | 0.7434 | 111.5950 | 589.0699 | 0.8492 | 85.0567 | 58.784 | 7.348 | 364.7776 |
+| 47000 | 0.7596 | 112.7010 | 588.7380 | 0.8558 | 85.186 | 58.695 | 7.337 | 446.9284 |
+| 48000 | 0.7758 | 114.3584 | 519.3724 | 0.8590 | 85.1156 | 58.744 | 7.343 | 2148.5159 |
+| 49000 | 0.7919 | 115.1962 | 590.7754 | 0.8648 | 85.1589 | 58.714 | 7.339 | 430.9863 |
+| 50000 | 0.8081 | 114.1809 | 614.9578 | 0.8597 | 85.2319 | 58.663 | 7.333 | 309.8965 |
+| 51000 | 0.8242 | 112.0117 | 593.5725 | 0.8551 | 85.2128 | 58.677 | 7.335 | 423.6820 |
+| 52000 | 0.8404 | 109.5515 | 563.6358 | 0.8457 | 85.2439 | 58.655 | 7.332 | 337.9521 |
+| 53000 | 0.8566 | 109.7388 | 550.0908 | 0.8446 | 85.1744 | 58.703 | 7.338 | 412.3510 |
+| 54000 | 0.8727 | 111.3959 | 551.1781 | 0.8453 | 85.2832 | 58.628 | 7.329 | 368.3508 |
+| 55000 | 0.8889 | 111.1712 | 575.0760 | 0.8450 | 85.1663 | 58.709 | 7.339 | 283.8286 |
+| 56000 | 0.9051 | 110.6545 | 557.3918 | 0.8454 | 85.4688 | 58.501 | 7.313 | 360.2272 |
+| 57000 | 0.9212 | 110.4055 | 604.0854 | 0.8479 | 85.3777 | 58.563 | 7.32 | 420.6379 |
+| 58000 | 0.9374 | 111.5257 | 635.6327 | 0.8466 | 85.6212 | 58.397 | 7.3 | 492.8218 |
+| 59000 | 0.9535 | 109.2372 | 581.6412 | 0.8423 | 85.6034 | 58.409 | 7.301 | 366.9761 |
+| 60000 | 0.9697 | 108.7379 | 565.1876 | 0.8362 | 85.2814 | 58.629 | 7.329 | 331.8257 |
+| 61000 | 0.9859 | 108.9746 | 583.4484 | 0.8370 | 85.1729 | 58.704 | 7.338 | 399.4518 |
+| 61875 | 1.0 | 109.4665 | 569.9899 | 0.8351 | 85.5074 | 58.474 | 7.309 | 290.9278 |
 
 ### Framework versions
 - Distily 0.2.0
logs/hs_loss_fn=mse, hs_weight=2.0/events.out.tfevents.1723754495.5f530b1cf724
ADDED

@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0b095d411875b12c72e4ba66c3a4d371155ebcf13c62eb162da54e90dcb0d3b4
+size 529
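The added file is stored with Git LFS, so the repository records only the three-line pointer above (spec version, sha256 object id, and byte size) rather than the TensorBoard event data itself. A small sketch of reading such a pointer, assuming the plain `key value` layout shown:

```python
def parse_lfs_pointer(path: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    with open(path) as f:
        for line in f:
            key, _, value = line.strip().partition(" ")
            fields[key] = value
    return fields

# e.g. {"version": "https://git-lfs.github.com/spec/v1",
#       "oid": "sha256:0b09...", "size": "529"}
```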