End of training
- README.md +9 -17
- adapter_model.bin +1 -1
- training_args.bin +1 -1
README.md
CHANGED
@@ -15,9 +15,9 @@ should probably proofread and complete it, then remove this comment. -->

This model is a fine-tuned version of [google/t5-v1_1-large](https://huggingface.co/google/t5-v1_1-large) on the None dataset.
It achieves the following results on the evaluation set:
- Train Loss:
- Loss:
- Losses: [

## Model description

@@ -42,23 +42,15 @@ The following hyperparameters were used during training:
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs:

### Training results

| Training Loss | Epoch | Step | Train Loss | Validation Loss | Losses |
|:-------------:|:-----:|:----:|:----------:|:---------------:|:------:|
| 19.
| 19.
| 20.
| 19.1924 | 4.0 | 100 | 12.6693 | 22.6756 | [18, 14, 11, 9, 9, 9, 13, 9, 9, 9, 14, 9, 9, 11, 9, 11, 9, 9, 8, 42, 17, 9, 25, 11, 14, 9, 42, 12, 14, 9, 9, 9, 13, 9, 18, 16, 13, 9, 9, 9, 8, 9, 12, 9, 9, 15, 22, 15, 12, 9, 21, 12, 12, 12, 12, 8, 12, 8, 9, 24, 9, 12, 12, 9, 17, 8, 9, 25, 9, 9, 10, 9, 10, 42, 11, 16, 9, 11, 9, 12, 9, 12, 13, 25, 18, 20, 17, 21, 14, 9, 9, 13, 9, 20, 7, 9, 9, 15, 9, 9, 9, 11, 12, 14, 9, 7, 12, 16, 19, 10, 9, 22, 23, 15, 14, 13, 9, 9, 23, 11, 13, 9, 11, 9, 8, 9, 21, 12, 8, 9, 13, 17, 7, 9, 23, 8, 9, 13, 9, 9, 7, 12, 18, 9, 13, 13, 25, 13, 22, 12, 8, 10, 12, 13, 9, 24, 25, 12, 9, 9, 9, 9, 10, 14, 9, 9, 10, 9, 9, 25, 12, 9, 10, 14, 9, 9, 13, 12, 9, 9, 13, 9, 13, 9, 9, 9, 15, 9, 11, 9, 12, 25, 25, 19, 13, 9, 9, 13, 9, 9, 9, 12, 9, 16, 22, 26, 9, 12, 9, 19, 11, 15, 12, 14, 9, 13, 11, 12, 9, 8, 12, 8, 12, 9, 23, 10, 15, 9, 11, 17, 14, 12, 10, 15, 9, 22, 14, 12, 9, 10, 9, 11, 12, 14, 9, 8, 8, 17, 19, 9, 9, 12, 9, 12, 19, 8, 13, 9, 13, 18, 25, 14, 13, 12, 17, 18, 9, 18, 9, 10, 19, 12, 12, 25, 16, 9, 13, 9, 12, 9, 9, 9, 12, 13, 12, 12, 32, 8, 9, 9, 9, 9, 13, 9, 12, 13, 13, 21, 9, 9, 12, 14, 11, 14, 12, 11, 11, 9, 25, 14, 11, 9, 12, 9, 9, 9, 9, 13, 9, 12, 25, 9, 12, 9, 12, 9, 9, 15, 16, 9, 25, 14, 9, 16, 22, 9, 12, 9, 20, 9, 23, 10, 9, 25, 12, 9, 8, 8, 12, 9, 9, 9, 12, 9, 18, 16, 14, 12, 11, 17, 12, 25, 15, 9, 9, 9, 9, 10, 9, 23, 25, 12, 9, 9, 10] |
| 18.3095 | 5.0 | 125 | 12.4747 | 19.7577 | [18, 18, 9, 9, 9, 9, 13, 9, 15, 9, 14, 9, 9, 11, 9, 11, 9, 9, 8, 42, 17, 9, 25, 11, 14, 9, 42, 12, 14, 9, 9, 9, 13, 9, 18, 16, 12, 9, 9, 9, 8, 9, 12, 9, 9, 15, 22, 15, 12, 9, 21, 12, 12, 12, 12, 9, 12, 8, 20, 24, 9, 12, 12, 9, 17, 8, 9, 25, 9, 9, 10, 9, 15, 42, 9, 16, 9, 11, 9, 12, 9, 12, 13, 25, 18, 20, 17, 21, 9, 9, 9, 13, 9, 15, 7, 9, 9, 15, 8, 9, 9, 17, 12, 14, 9, 7, 12, 16, 19, 10, 9, 22, 10, 15, 14, 13, 9, 9, 9, 11, 13, 9, 10, 9, 8, 8, 21, 12, 8, 9, 13, 17, 7, 9, 19, 8, 9, 13, 9, 9, 7, 12, 9, 9, 13, 13, 25, 13, 22, 12, 8, 15, 12, 13, 9, 25, 25, 12, 9, 9, 8, 25, 10, 14, 9, 9, 10, 9, 9, 25, 12, 9, 10, 14, 9, 9, 9, 12, 9, 9, 9, 8, 12, 9, 9, 9, 9, 9, 9, 9, 17, 12, 25, 19, 13, 9, 9, 13, 9, 9, 9, 9, 9, 16, 22, 23, 9, 12, 9, 19, 11, 13, 12, 9, 9, 13, 11, 12, 9, 8, 12, 8, 8, 9, 23, 10, 15, 9, 11, 17, 14, 12, 10, 15, 9, 22, 10, 25, 9, 10, 9, 11, 12, 14, 9, 8, 8, 18, 17, 9, 9, 12, 9, 12, 19, 9, 13, 9, 9, 18, 25, 14, 13, 12, 13, 18, 9, 18, 9, 15, 19, 12, 12, 25, 16, 9, 13, 9, 12, 9, 9, 9, 12, 13, 12, 12, 13, 7, 8, 9, 9, 9, 14, 9, 9, 10, 12, 21, 9, 9, 12, 14, 16, 9, 12, 11, 11, 9, 25, 9, 11, 9, 12, 9, 9, 9, 9, 13, 9, 9, 25, 9, 12, 9, 12, 9, 9, 15, 16, 9, 25, 14, 9, 16, 22, 9, 12, 9, 15, 9, 23, 16, 9, 25, 12, 9, 9, 8, 12, 9, 9, 9, 12, 9, 18, 22, 14, 12, 11, 23, 12, 25, 9, 12, 9, 9, 9, 10, 9, 23, 17, 12, 9, 9, 10] |
| 16.269 | 6.0 | 150 | 12.8053 | 12.0449 | [18, 18, 15, 11, 9, 9, 13, 9, 9, 9, 14, 9, 9, 11, 9, 9, 25, 9, 12, 42, 17, 9, 25, 11, 14, 9, 42, 12, 34, 9, 9, 9, 13, 9, 18, 16, 9, 9, 9, 9, 8, 9, 12, 9, 9, 15, 22, 15, 12, 9, 7, 12, 12, 12, 12, 9, 25, 8, 20, 10, 9, 12, 12, 9, 17, 9, 9, 28, 9, 9, 10, 9, 8, 42, 9, 16, 9, 11, 9, 12, 19, 12, 13, 25, 18, 20, 17, 21, 21, 9, 9, 13, 20, 15, 8, 9, 9, 15, 20, 9, 9, 17, 16, 14, 9, 7, 12, 16, 19, 10, 9, 15, 15, 15, 13, 9, 9, 9, 9, 11, 13, 9, 12, 9, 9, 9, 21, 12, 8, 9, 13, 17, 12, 9, 19, 8, 9, 13, 9, 14, 7, 12, 8, 12, 13, 13, 25, 13, 22, 12, 8, 8, 12, 9, 9, 11, 25, 12, 12, 9, 20, 25, 10, 14, 9, 9, 10, 9, 20, 25, 12, 9, 10, 14, 9, 9, 11, 12, 9, 9, 11, 9, 12, 9, 9, 9, 12, 9, 9, 18, 17, 12, 25, 19, 13, 9, 12, 9, 9, 9, 15, 9, 9, 10, 22, 19, 9, 12, 10, 19, 11, 20, 12, 9, 9, 13, 11, 12, 9, 9, 12, 8, 8, 9, 23, 10, 15, 9, 11, 17, 14, 12, 10, 15, 9, 22, 17, 25, 9, 10, 9, 11, 12, 14, 9, 8, 12, 18, 17, 9, 14, 12, 9, 12, 9, 7, 13, 9, 9, 18, 25, 14, 13, 13, 9, 18, 9, 18, 9, 8, 19, 12, 12, 25, 16, 8, 13, 9, 25, 12, 9, 9, 12, 9, 12, 12, 25, 8, 23, 9, 9, 8, 9, 9, 13, 10, 13, 17, 9, 9, 12, 14, 14, 12, 12, 11, 9, 9, 25, 9, 11, 8, 12, 9, 9, 8, 9, 13, 9, 12, 25, 9, 12, 9, 12, 9, 12, 15, 16, 8, 25, 14, 9, 16, 22, 9, 12, 9, 17, 9, 19, 15, 9, 25, 12, 9, 9, 9, 12, 9, 9, 9, 12, 9, 18, 16, 14, 12, 13, 23, 12, 25, 15, 12, 9, 12, 9, 10, 9, 23, 8, 12, 9, 9, 10] |
| 14.8649 | 7.0 | 175 | 12.6853 | 10.8788 | [18, 14, 15, 11, 9, 9, 13, 9, 9, 9, 14, 9, 9, 11, 9, 9, 25, 9, 12, 42, 17, 9, 9, 9, 14, 9, 42, 12, 28, 9, 9, 9, 13, 9, 18, 16, 9, 9, 9, 9, 8, 9, 12, 9, 9, 15, 22, 12, 12, 9, 7, 12, 12, 12, 12, 9, 25, 8, 20, 10, 9, 12, 12, 9, 17, 9, 9, 15, 9, 9, 10, 9, 8, 42, 9, 16, 9, 11, 9, 12, 19, 12, 13, 12, 18, 20, 17, 21, 20, 9, 9, 13, 20, 15, 8, 9, 9, 15, 20, 9, 9, 15, 16, 14, 9, 7, 12, 16, 19, 12, 9, 22, 9, 15, 13, 9, 9, 9, 9, 11, 13, 9, 12, 9, 9, 9, 21, 12, 9, 9, 13, 17, 7, 9, 23, 8, 9, 13, 9, 14, 7, 12, 8, 12, 14, 11, 25, 13, 22, 12, 8, 8, 12, 9, 9, 15, 25, 12, 9, 9, 20, 25, 10, 14, 9, 9, 12, 9, 20, 25, 12, 9, 10, 14, 9, 9, 11, 12, 9, 14, 11, 9, 12, 9, 9, 9, 12, 14, 11, 9, 17, 12, 25, 19, 13, 9, 12, 9, 9, 9, 15, 12, 9, 10, 22, 26, 9, 12, 10, 19, 11, 20, 12, 9, 9, 13, 11, 12, 9, 9, 12, 8, 11, 9, 23, 10, 15, 9, 11, 17, 14, 9, 10, 15, 9, 22, 18, 25, 9, 10, 9, 11, 12, 14, 9, 8, 12, 18, 17, 9, 20, 12, 9, 12, 9, 8, 14, 9, 13, 18, 25, 9, 13, 13, 9, 18, 9, 18, 9, 8, 19, 12, 12, 25, 16, 9, 13, 9, 25, 12, 22, 9, 12, 8, 12, 12, 25, 8, 23, 9, 9, 9, 13, 7, 9, 10, 13, 12, 9, 9, 12, 14, 14, 21, 12, 11, 9, 9, 25, 12, 11, 8, 12, 9, 9, 9, 9, 13, 9, 12, 9, 9, 12, 9, 12, 9, 12, 15, 16, 9, 25, 9, 9, 16, 22, 9, 12, 9, 20, 9, 19, 15, 9, 25, 12, 9, 9, 8, 12, 9, 14, 9, 12, 9, 18, 16, 9, 12, 7, 15, 12, 25, 15, 12, 9, 12, 9, 10, 9, 23, 9, 12, 9, 9, 10] |
| 14.7746 | 8.0 | 200 | 12.6853 | 10.8788 | [18, 14, 15, 11, 9, 9, 13, 9, 9, 9, 14, 9, 9, 11, 9, 9, 25, 9, 12, 42, 17, 9, 9, 9, 14, 9, 42, 12, 28, 9, 9, 9, 13, 9, 18, 16, 9, 9, 9, 9, 8, 9, 12, 9, 9, 15, 22, 12, 12, 9, 7, 12, 12, 12, 12, 9, 25, 8, 20, 10, 9, 12, 12, 9, 17, 9, 9, 15, 9, 9, 10, 9, 8, 42, 9, 16, 9, 11, 9, 12, 19, 12, 13, 12, 18, 20, 17, 21, 20, 9, 9, 13, 20, 15, 8, 9, 9, 15, 20, 9, 9, 15, 16, 14, 9, 7, 12, 16, 19, 12, 9, 22, 9, 15, 13, 9, 9, 9, 9, 11, 13, 9, 12, 9, 9, 9, 21, 12, 9, 9, 13, 17, 7, 9, 23, 8, 9, 13, 9, 14, 7, 12, 8, 12, 14, 11, 25, 13, 22, 12, 8, 8, 12, 9, 9, 15, 25, 12, 9, 9, 20, 25, 10, 14, 9, 9, 12, 9, 20, 25, 12, 9, 10, 14, 9, 9, 11, 12, 9, 14, 11, 9, 12, 9, 9, 9, 12, 14, 11, 9, 17, 12, 25, 19, 13, 9, 12, 9, 9, 9, 15, 12, 9, 10, 22, 26, 9, 12, 10, 19, 11, 20, 12, 9, 9, 13, 11, 12, 9, 9, 12, 8, 11, 9, 23, 10, 15, 9, 11, 17, 14, 9, 10, 15, 9, 22, 18, 25, 9, 10, 9, 11, 12, 14, 9, 8, 12, 18, 17, 9, 20, 12, 9, 12, 9, 8, 14, 9, 13, 18, 25, 9, 13, 13, 9, 18, 9, 18, 9, 8, 19, 12, 12, 25, 16, 9, 13, 9, 25, 12, 22, 9, 12, 8, 12, 12, 25, 8, 23, 9, 9, 9, 13, 7, 9, 10, 13, 12, 9, 9, 12, 14, 14, 21, 12, 11, 9, 9, 25, 12, 11, 8, 12, 9, 9, 9, 9, 13, 9, 12, 9, 9, 12, 9, 12, 9, 12, 15, 16, 9, 25, 9, 9, 16, 22, 9, 12, 9, 20, 9, 19, 15, 9, 25, 12, 9, 9, 8, 12, 9, 14, 9, 12, 9, 18, 16, 9, 12, 7, 15, 12, 25, 15, 12, 9, 12, 9, 10, 9, 23, 9, 12, 9, 9, 10] |
| 14.7766 | 9.0 | 225 | 12.6853 | 10.8788 | [18, 14, 15, 11, 9, 9, 13, 9, 9, 9, 14, 9, 9, 11, 9, 9, 25, 9, 12, 42, 17, 9, 9, 9, 14, 9, 42, 12, 28, 9, 9, 9, 13, 9, 18, 16, 9, 9, 9, 9, 8, 9, 12, 9, 9, 15, 22, 12, 12, 9, 7, 12, 12, 12, 12, 9, 25, 8, 20, 10, 9, 12, 12, 9, 17, 9, 9, 15, 9, 9, 10, 9, 8, 42, 9, 16, 9, 11, 9, 12, 19, 12, 13, 12, 18, 20, 17, 21, 20, 9, 9, 13, 20, 15, 8, 9, 9, 15, 20, 9, 9, 15, 16, 14, 9, 7, 12, 16, 19, 12, 9, 22, 9, 15, 13, 9, 9, 9, 9, 11, 13, 9, 12, 9, 9, 9, 21, 12, 9, 9, 13, 17, 7, 9, 23, 8, 9, 13, 9, 14, 7, 12, 8, 12, 14, 11, 25, 13, 22, 12, 8, 8, 12, 9, 9, 15, 25, 12, 9, 9, 20, 25, 10, 14, 9, 9, 12, 9, 20, 25, 12, 9, 10, 14, 9, 9, 11, 12, 9, 14, 11, 9, 12, 9, 9, 9, 12, 14, 11, 9, 17, 12, 25, 19, 13, 9, 12, 9, 9, 9, 15, 12, 9, 10, 22, 26, 9, 12, 10, 19, 11, 20, 12, 9, 9, 13, 11, 12, 9, 9, 12, 8, 11, 9, 23, 10, 15, 9, 11, 17, 14, 9, 10, 15, 9, 22, 18, 25, 9, 10, 9, 11, 12, 14, 9, 8, 12, 18, 17, 9, 20, 12, 9, 12, 9, 8, 14, 9, 13, 18, 25, 9, 13, 13, 9, 18, 9, 18, 9, 8, 19, 12, 12, 25, 16, 9, 13, 9, 25, 12, 22, 9, 12, 8, 12, 12, 25, 8, 23, 9, 9, 9, 13, 7, 9, 10, 13, 12, 9, 9, 12, 14, 14, 21, 12, 11, 9, 9, 25, 12, 11, 8, 12, 9, 9, 9, 9, 13, 9, 12, 9, 9, 12, 9, 12, 9, 12, 15, 16, 9, 25, 9, 9, 16, 22, 9, 12, 9, 20, 9, 19, 15, 9, 25, 12, 9, 9, 8, 12, 9, 14, 9, 12, 9, 18, 16, 9, 12, 7, 15, 12, 25, 15, 12, 9, 12, 9, 10, 9, 23, 9, 12, 9, 9, 10] |
| 14.777 | 10.0 | 250 | 12.6853 | 10.8788 | [18, 14, 15, 11, 9, 9, 13, 9, 9, 9, 14, 9, 9, 11, 9, 9, 25, 9, 12, 42, 17, 9, 9, 9, 14, 9, 42, 12, 28, 9, 9, 9, 13, 9, 18, 16, 9, 9, 9, 9, 8, 9, 12, 9, 9, 15, 22, 12, 12, 9, 7, 12, 12, 12, 12, 9, 25, 8, 20, 10, 9, 12, 12, 9, 17, 9, 9, 15, 9, 9, 10, 9, 8, 42, 9, 16, 9, 11, 9, 12, 19, 12, 13, 12, 18, 20, 17, 21, 20, 9, 9, 13, 20, 15, 8, 9, 9, 15, 20, 9, 9, 15, 16, 14, 9, 7, 12, 16, 19, 12, 9, 22, 9, 15, 13, 9, 9, 9, 9, 11, 13, 9, 12, 9, 9, 9, 21, 12, 9, 9, 13, 17, 7, 9, 23, 8, 9, 13, 9, 14, 7, 12, 8, 12, 14, 11, 25, 13, 22, 12, 8, 8, 12, 9, 9, 15, 25, 12, 9, 9, 20, 25, 10, 14, 9, 9, 12, 9, 20, 25, 12, 9, 10, 14, 9, 9, 11, 12, 9, 14, 11, 9, 12, 9, 9, 9, 12, 14, 11, 9, 17, 12, 25, 19, 13, 9, 12, 9, 9, 9, 15, 12, 9, 10, 22, 26, 9, 12, 10, 19, 11, 20, 12, 9, 9, 13, 11, 12, 9, 9, 12, 8, 11, 9, 23, 10, 15, 9, 11, 17, 14, 9, 10, 15, 9, 22, 18, 25, 9, 10, 9, 11, 12, 14, 9, 8, 12, 18, 17, 9, 20, 12, 9, 12, 9, 8, 14, 9, 13, 18, 25, 9, 13, 13, 9, 18, 9, 18, 9, 8, 19, 12, 12, 25, 16, 9, 13, 9, 25, 12, 22, 9, 12, 8, 12, 12, 25, 8, 23, 9, 9, 9, 13, 7, 9, 10, 13, 12, 9, 9, 12, 14, 14, 21, 12, 11, 9, 9, 25, 12, 11, 8, 12, 9, 9, 9, 9, 13, 9, 12, 9, 9, 12, 9, 12, 9, 12, 15, 16, 9, 25, 9, 9, 16, 22, 9, 12, 9, 20, 9, 19, 15, 9, 25, 12, 9, 9, 8, 12, 9, 14, 9, 12, 9, 18, 16, 9, 12, 7, 15, 12, 25, 15, 12, 9, 12, 9, 10, 9, 23, 9, 12, 9, 9, 10] |
| 14.7424 | 11.0 | 275 | 12.6853 | 10.8788 | [18, 14, 15, 11, 9, 9, 13, 9, 9, 9, 14, 9, 9, 11, 9, 9, 25, 9, 12, 42, 17, 9, 9, 9, 14, 9, 42, 12, 28, 9, 9, 9, 13, 9, 18, 16, 9, 9, 9, 9, 8, 9, 12, 9, 9, 15, 22, 12, 12, 9, 7, 12, 12, 12, 12, 9, 25, 8, 20, 10, 9, 12, 12, 9, 17, 9, 9, 15, 9, 9, 10, 9, 8, 42, 9, 16, 9, 11, 9, 12, 19, 12, 13, 12, 18, 20, 17, 21, 20, 9, 9, 13, 20, 15, 8, 9, 9, 15, 20, 9, 9, 15, 16, 14, 9, 7, 12, 16, 19, 12, 9, 22, 9, 15, 13, 9, 9, 9, 9, 11, 13, 9, 12, 9, 9, 9, 21, 12, 9, 9, 13, 17, 7, 9, 23, 8, 9, 13, 9, 14, 7, 12, 8, 12, 14, 11, 25, 13, 22, 12, 8, 8, 12, 9, 9, 15, 25, 12, 9, 9, 20, 25, 10, 14, 9, 9, 12, 9, 20, 25, 12, 9, 10, 14, 9, 9, 11, 12, 9, 14, 11, 9, 12, 9, 9, 9, 12, 14, 11, 9, 17, 12, 25, 19, 13, 9, 12, 9, 9, 9, 15, 12, 9, 10, 22, 26, 9, 12, 10, 19, 11, 20, 12, 9, 9, 13, 11, 12, 9, 9, 12, 8, 11, 9, 23, 10, 15, 9, 11, 17, 14, 9, 10, 15, 9, 22, 18, 25, 9, 10, 9, 11, 12, 14, 9, 8, 12, 18, 17, 9, 20, 12, 9, 12, 9, 8, 14, 9, 13, 18, 25, 9, 13, 13, 9, 18, 9, 18, 9, 8, 19, 12, 12, 25, 16, 9, 13, 9, 25, 12, 22, 9, 12, 8, 12, 12, 25, 8, 23, 9, 9, 9, 13, 7, 9, 10, 13, 12, 9, 9, 12, 14, 14, 21, 12, 11, 9, 9, 25, 12, 11, 8, 12, 9, 9, 9, 9, 13, 9, 12, 9, 9, 12, 9, 12, 9, 12, 15, 16, 9, 25, 9, 9, 16, 22, 9, 12, 9, 20, 9, 19, 15, 9, 25, 12, 9, 9, 8, 12, 9, 14, 9, 12, 9, 18, 16, 9, 12, 7, 15, 12, 25, 15, 12, 9, 12, 9, 10, 9, 23, 9, 12, 9, 9, 10] |
### Framework versions

@@ -15,9 +15,9 @@ should probably proofread and complete it, then remove this comment. -->

This model is a fine-tuned version of [google/t5-v1_1-large](https://huggingface.co/google/t5-v1_1-large) on the None dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.9595
- Loss: 23.6406
- Losses: [0.9, 0.875, 1.0, 1.0, 1.0, 1.0, 0.9285714285714286, 0.8181818181818182, 1.0, 1.0, 0.875, 1.0, 1.0, 0.8461538461538461, 0.9, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9444444444444444, 1.0, 0.9615384615384616, 0.7857142857142857, 0.9333333333333333, 1.0, 1.0, 1.0, 0.9090909090909091, 1.0, 1.0, 1.0, 1.0, 0.9, 1.0, 0.8888888888888888, 0.9285714285714286, 1.0, 1.0, 1.0, 0.8888888888888888, 1.0, 1.0, 1.0, 1.0, 0.9375, 0.9565217391304348, 0.8823529411764706, 1.0, 0.9, 0.9130434782608695, 1.0, 1.0, 1.0, 1.0, 0.9, 1.0, 1.0, 1.0, 0.96, 1.0, 1.0, 1.0, 1.0, 0.9444444444444444, 0.8888888888888888, 1.0, 0.9615384615384616, 1.0, 1.0, 0.9090909090909091, 1.0, 1.0, 1.0, 0.7857142857142857, 0.8421052631578947, 1.0, 0.9166666666666666, 1.0, 1.0, 1.0, 1.0, 0.9285714285714286, 0.9615384615384616, 1.0, 0.9523809523809523, 0.9230769230769231, 0.9130434782608695, 1.0, 1.0, 1.0, 0.8125, 1.0, 0.8823529411764706, 1.0, 1.0, 1.0, 0.8823529411764706, 1.0, 1.0, 1.0, 0.9090909090909091, 0.9230769230769231, 1.0, 1.0, 1.0, 1.0, 0.8888888888888888, 0.8461538461538461, 0.8333333333333334, 1.0, 0.88, 0.9230769230769231, 0.9375, 0.9166666666666666, 0.9285714285714286, 0.9, 1.0, 0.8846153846153846, 0.9166666666666666, 0.8666666666666667, 1.0, 0.9166666666666666, 1.0, 1.0, 1.0, 0.9545454545454546, 1.0, 0.9333333333333333, 1.0, 0.9285714285714286, 0.85, 1.0, 1.0, 0.8846153846153846, 1.0, 1.0, 0.9285714285714286, 1.0, 1.0, 1.0, 1.0, 0.9473684210526315, 1.0, 0.8666666666666667, 0.8666666666666667, 0.9615384615384616, 0.8666666666666667, 1.0, 1.0, 1.0, 1.0, 1.0, 0.8666666666666667, 1.0, 0.9230769230769231, 0.9615384615384616, 1.0, 0.9, 1.0, 1.0, 1.0, 0.8333333333333334, 0.9333333333333333, 1.0, 1.0, 0.9090909090909091, 1.0, 1.0, 0.9615384615384616, 1.0, 1.0, 1.0, 0.9333333333333333, 1.0, 1.0, 0.8125, 1.0, 1.0, 1.0, 0.8125, 1.0, 0.9285714285714286, 1.0, 1.0, 1.0, 0.9375, 1.0, 0.9090909090909091, 1.0, 0.9230769230769231, 0.9, 0.9615384615384616, 0.8461538461538461, 0.9285714285714286, 1.0, 1.0, 0.8666666666666667, 1.0, 1.0, 1.0, 1.0, 1.0, 0.8888888888888888, 0.88, 0.896551724137931, 1.0, 1.0, 1.0, 0.9047619047619048, 0.9166666666666666, 0.8823529411764706, 1.0, 1.0, 0.8181818181818182, 0.9285714285714286, 0.9166666666666666, 1.0, 1.0, 0.8888888888888888, 1.0, 1.0, 0.8888888888888888, 1.0, 0.92, 0.8333333333333334, 0.8823529411764706, 1.0, 0.9375, 0.8947368421052632, 0.875, 1.0, 0.9090909090909091, 0.9375, 1.0, 0.9565217391304348, 1.0, 1.0, 1.0, 0.8333333333333334, 0.9285714285714286, 1.0, 1.0, 0.9333333333333333, 1.0, 0.8888888888888888, 1.0, 0.9444444444444444, 0.8947368421052632, 1.0, 1.0, 0.8571428571428571, 0.9, 1.0, 1.0, 1.0, 0.8666666666666667, 1.0, 0.8666666666666667, 1.0, 0.9615384615384616, 0.875, 1.0, 0.9285714285714286, 0.8947368421052632, 0.9473684210526315, 1.0, 0.9473684210526315, 1.0, 1.0, 0.8461538461538461, 0.8571428571428571, 1.0, 0.9615384615384616, 0.8888888888888888, 1.0, 1.0, 1.0, 0.9230769230769231, 1.0, 1.0, 1.0, 1.0, 0.8666666666666667, 1.0, 1.0, 0.9411764705882353, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9285714285714286, 0.9285714285714286, 0.9545454545454546, 1.0, 1.0, 1.0, 0.9333333333333333, 0.9411764705882353, 0.8571428571428571, 1.0, 0.9166666666666666, 0.9166666666666666, 1.0, 0.9615384615384616, 1.0, 0.9166666666666666, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9615384615384616, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.8823529411764706, 0.8888888888888888, 1.0, 1.0, 0.875, 1.0, 0.9375, 0.88, 1.0, 1.0, 1.0, 1.0, 1.0, 0.92, 0.9090909090909091, 1.0, 0.9615384615384616, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 
0.8571428571428571, 1.0, 0.9, 0.9411764705882353, 0.9285714285714286, 1.0, 0.8666666666666667, 0.8947368421052632, 1.0, 0.9615384615384616, 0.9375, 1.0, 1.0, 1.0, 1.0, 0.9090909090909091, 1.0, 0.9583333333333334, 0.8928571428571429, 1.0, 1.0, 1.0, 0.8333333333333334]
## Model description

@@ -42,23 +42,15 @@ The following hyperparameters were used during training:
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
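As an illustration only, the hyperparameters visible in this hunk would typically be expressed through the `transformers` Trainer API roughly as below. This is a sketch under the assumption that a standard `Seq2SeqTrainingArguments` setup was used; values not shown in the diff (learning rate, batch sizes, and so on) are deliberately omitted rather than guessed, and `output_dir` is a placeholder.

```python
# Sketch: mapping the listed hyperparameters onto Seq2SeqTrainingArguments.
# Only values visible in this diff hunk are filled in; the rest stay at defaults.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="outputs",        # placeholder path
    seed=42,                     # - seed: 42
    lr_scheduler_type="linear",  # - lr_scheduler_type: linear
    num_train_epochs=3.0,        # - num_epochs: 3.0
    adam_beta1=0.9,              # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,           # and epsilon=1e-08
)
```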

### Training results

| Training Loss | Epoch | Step | Train Loss | Validation Loss | Losses |
|:-------------:|:-----:|:----:|:----------:|:---------------:|:------:|
| 19.9737 | 1.0 | 25 | 0.9595 | 24.0825 | [0.9, 0.875, 0.8333333333333334, 1.0, 1.0, 1.0, 0.9285714285714286, 1.0, 1.0, 1.0, 0.875, 1.0, 1.0, 0.8461538461538461, 0.9, 1.0, 0.9230769230769231, 1.0, 1.0, 1.0, 0.9444444444444444, 1.0, 0.9615384615384616, 0.7857142857142857, 0.9333333333333333, 1.0, 1.0, 1.0, 0.9090909090909091, 1.0, 1.0, 1.0, 1.0, 0.9, 1.0, 0.8888888888888888, 0.9285714285714286, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9375, 0.9565217391304348, 0.8823529411764706, 1.0, 0.9, 0.9130434782608695, 1.0, 1.0, 1.0, 1.0, 0.9, 1.0, 1.0, 0.9230769230769231, 0.96, 1.0, 1.0, 1.0, 1.0, 0.9444444444444444, 0.8888888888888888, 1.0, 0.9615384615384616, 1.0, 1.0, 0.9090909090909091, 1.0, 1.0, 1.0, 0.7857142857142857, 0.8421052631578947, 1.0, 0.9166666666666666, 1.0, 1.0, 1.0, 1.0, 0.9285714285714286, 0.9615384615384616, 1.0, 0.9523809523809523, 0.85, 0.9130434782608695, 0.9090909090909091, 1.0, 1.0, 0.8125, 1.0, 0.8695652173913043, 1.0, 1.0, 1.0, 0.8823529411764706, 1.0, 1.0, 1.0, 0.9090909090909091, 0.9230769230769231, 1.0, 1.0, 1.0, 1.0, 0.8888888888888888, 0.8461538461538461, 0.8333333333333334, 1.0, 0.88, 0.9230769230769231, 0.9375, 0.9166666666666666, 0.9285714285714286, 0.9, 0.9166666666666666, 1.0, 0.9166666666666666, 0.8666666666666667, 1.0, 0.9166666666666666, 1.0, 1.0, 1.0, 0.9545454545454546, 1.0, 0.9333333333333333, 1.0, 0.9285714285714286, 0.85, 1.0, 1.0, 0.8846153846153846, 1.0, 1.0, 0.9285714285714286, 1.0, 1.0, 1.0, 1.0, 0.9473684210526315, 1.0, 0.8666666666666667, 0.8461538461538461, 0.9615384615384616, 0.8666666666666667, 0.9166666666666666, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9230769230769231, 0.9615384615384616, 1.0, 0.9, 1.0, 1.0, 1.0, 0.8333333333333334, 0.9333333333333333, 1.0, 1.0, 0.9090909090909091, 1.0, 1.0, 0.9615384615384616, 1.0, 1.0, 1.0, 0.9333333333333333, 1.0, 1.0, 0.8125, 1.0, 1.0, 1.0, 0.8125, 1.0, 0.9285714285714286, 1.0, 1.0, 1.0, 0.9375, 1.0, 0.9090909090909091, 1.0, 0.9230769230769231, 0.9615384615384616, 0.9615384615384616, 0.8461538461538461, 0.9285714285714286, 1.0, 1.0, 0.8666666666666667, 1.0, 1.0, 1.0, 1.0, 1.0, 0.8888888888888888, 0.88, 0.896551724137931, 1.0, 1.0, 1.0, 0.9047619047619048, 0.9166666666666666, 0.8823529411764706, 1.0, 1.0, 0.8181818181818182, 0.9285714285714286, 0.9166666666666666, 1.0, 1.0, 0.8888888888888888, 1.0, 1.0, 0.9375, 1.0, 0.92, 0.8333333333333334, 0.8823529411764706, 1.0, 0.9375, 0.8947368421052632, 0.875, 1.0, 0.9090909090909091, 0.9375, 1.0, 0.9565217391304348, 1.0, 1.0, 1.0, 0.8333333333333334, 1.0, 1.0, 1.0, 0.9333333333333333, 1.0, 1.0, 1.0, 0.9444444444444444, 0.8947368421052632, 1.0, 1.0, 0.8571428571428571, 0.9, 1.0, 1.0, 1.0, 0.8666666666666667, 1.0, 0.8666666666666667, 1.0, 0.9615384615384616, 0.875, 1.0, 0.9285714285714286, 0.8947368421052632, 0.9473684210526315, 1.0, 0.9473684210526315, 1.0, 1.0, 0.8461538461538461, 0.8571428571428571, 1.0, 0.9615384615384616, 0.8888888888888888, 1.0, 0.9285714285714286, 1.0, 0.9230769230769231, 1.0, 1.0, 1.0, 1.0, 0.8666666666666667, 1.0, 1.0, 0.9411764705882353, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9285714285714286, 0.9285714285714286, 0.9545454545454546, 1.0, 1.0, 1.0, 1.0, 0.9411764705882353, 0.8571428571428571, 1.0, 0.9166666666666666, 0.9166666666666666, 1.0, 0.9615384615384616, 1.0, 0.9166666666666666, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9615384615384616, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.8823529411764706, 0.8888888888888888, 1.0, 1.0, 0.875, 1.0, 0.9375, 0.88, 1.0, 0.8571428571428571, 1.0, 1.0, 1.0, 0.92, 0.9090909090909091, 1.0, 0.9615384615384616, 1.0, 
1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.8571428571428571, 1.0, 0.9, 0.9411764705882353, 0.9285714285714286, 1.0, 0.8666666666666667, 0.8947368421052632, 1.0, 0.9615384615384616, 0.9375, 1.0, 1.0, 1.0, 1.0, 0.9090909090909091, 1.0, 0.9583333333333334, 0.8928571428571429, 1.0, 1.0, 1.0, 0.8333333333333334] |
| 19.8295 | 2.0 | 50 | 0.9596 | 23.9408 | [0.9, 0.875, 0.8333333333333334, 1.0, 1.0, 1.0, 0.9285714285714286, 1.0, 1.0, 1.0, 0.875, 1.0, 1.0, 0.8461538461538461, 0.9, 1.0, 0.9230769230769231, 1.0, 1.0, 1.0, 0.9444444444444444, 1.0, 0.9615384615384616, 0.7857142857142857, 0.9333333333333333, 1.0, 1.0, 1.0, 0.9411764705882353, 1.0, 1.0, 1.0, 1.0, 0.9, 1.0, 0.8888888888888888, 0.9285714285714286, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9375, 0.9565217391304348, 0.8823529411764706, 1.0, 0.9, 0.9130434782608695, 1.0, 1.0, 1.0, 1.0, 0.9, 1.0, 1.0, 0.9230769230769231, 0.96, 1.0, 1.0, 1.0, 1.0, 0.9444444444444444, 0.8888888888888888, 1.0, 0.9615384615384616, 1.0, 1.0, 0.9090909090909091, 1.0, 1.0, 1.0, 0.7857142857142857, 0.8421052631578947, 1.0, 0.9166666666666666, 1.0, 1.0, 1.0, 1.0, 0.9285714285714286, 0.9615384615384616, 1.0, 0.9523809523809523, 0.85, 0.9130434782608695, 0.9090909090909091, 1.0, 1.0, 0.8125, 1.0, 1.0, 1.0, 1.0, 1.0, 0.8823529411764706, 1.0, 1.0, 1.0, 0.9090909090909091, 0.9230769230769231, 1.0, 1.0, 1.0, 1.0, 0.8888888888888888, 0.8461538461538461, 0.8333333333333334, 1.0, 0.88, 0.9230769230769231, 0.9375, 0.9166666666666666, 0.9285714285714286, 0.9, 1.0, 1.0, 0.9166666666666666, 0.8666666666666667, 1.0, 0.9166666666666666, 1.0, 1.0, 1.0, 0.9545454545454546, 1.0, 0.9333333333333333, 1.0, 0.9285714285714286, 0.85, 1.0, 1.0, 0.8846153846153846, 1.0, 1.0, 0.9285714285714286, 1.0, 0.8235294117647058, 1.0, 1.0, 1.0, 1.0, 0.8666666666666667, 0.8461538461538461, 0.9615384615384616, 0.8666666666666667, 0.9166666666666666, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9230769230769231, 0.9615384615384616, 1.0, 0.9, 1.0, 1.0, 1.0, 0.8333333333333334, 0.9333333333333333, 1.0, 1.0, 0.9090909090909091, 1.0, 1.0, 0.9615384615384616, 1.0, 1.0, 1.0, 0.9333333333333333, 1.0, 1.0, 0.8125, 1.0, 1.0, 1.0, 0.8125, 1.0, 0.9285714285714286, 1.0, 1.0, 1.0, 0.9375, 1.0, 0.9090909090909091, 1.0, 0.9230769230769231, 0.9, 0.9615384615384616, 0.8461538461538461, 0.9285714285714286, 1.0, 1.0, 0.8666666666666667, 1.0, 1.0, 1.0, 1.0, 1.0, 0.8888888888888888, 0.88, 0.896551724137931, 1.0, 1.0, 1.0, 0.9047619047619048, 0.9166666666666666, 0.8823529411764706, 1.0, 1.0, 0.8181818181818182, 0.9285714285714286, 0.9166666666666666, 1.0, 1.0, 0.8888888888888888, 1.0, 1.0, 1.0, 1.0, 0.92, 0.8333333333333334, 0.8823529411764706, 1.0, 0.9375, 0.8947368421052632, 0.875, 1.0, 0.9090909090909091, 0.9375, 1.0, 0.9565217391304348, 1.0, 1.0, 1.0, 0.8333333333333334, 1.0, 1.0, 1.0, 0.9333333333333333, 1.0, 1.0, 1.0, 0.9444444444444444, 0.8947368421052632, 1.0, 1.0, 0.8571428571428571, 0.9, 1.0, 1.0, 1.0, 0.8666666666666667, 1.0, 0.8666666666666667, 1.0, 0.9615384615384616, 0.875, 1.0, 0.9285714285714286, 0.8947368421052632, 0.9473684210526315, 1.0, 0.9473684210526315, 1.0, 1.0, 0.8461538461538461, 0.8571428571428571, 1.0, 0.9615384615384616, 0.8888888888888888, 1.0, 0.9285714285714286, 1.0, 0.9230769230769231, 1.0, 1.0, 1.0, 1.0, 0.8666666666666667, 1.0, 1.0, 0.9411764705882353, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9285714285714286, 0.9285714285714286, 0.9545454545454546, 1.0, 1.0, 1.0, 0.9333333333333333, 0.9411764705882353, 0.8571428571428571, 1.0, 0.9166666666666666, 0.9166666666666666, 1.0, 0.9615384615384616, 1.0, 0.9166666666666666, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9615384615384616, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.8823529411764706, 0.8888888888888888, 1.0, 1.0, 0.875, 1.0, 0.9411764705882353, 0.88, 1.0, 0.8571428571428571, 1.0, 1.0, 1.0, 0.92, 0.9090909090909091, 1.0, 0.9615384615384616, 1.0, 1.0, 1.0, 1.0, 1.0, 
1.0, 1.0, 1.0, 0.8571428571428571, 1.0, 0.9, 0.9411764705882353, 0.9285714285714286, 1.0, 0.8666666666666667, 0.8947368421052632, 1.0, 0.9615384615384616, 0.9375, 1.0, 1.0, 1.0, 1.0, 0.9090909090909091, 1.0, 0.9583333333333334, 0.8928571428571429, 1.0, 1.0, 1.0, 0.8333333333333334] |
| 20.1206 | 3.0 | 75 | 0.9595 | 23.6406 | [0.9, 0.875, 1.0, 1.0, 1.0, 1.0, 0.9285714285714286, 0.8181818181818182, 1.0, 1.0, 0.875, 1.0, 1.0, 0.8461538461538461, 0.9, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9444444444444444, 1.0, 0.9615384615384616, 0.7857142857142857, 0.9333333333333333, 1.0, 1.0, 1.0, 0.9090909090909091, 1.0, 1.0, 1.0, 1.0, 0.9, 1.0, 0.8888888888888888, 0.9285714285714286, 1.0, 1.0, 1.0, 0.8888888888888888, 1.0, 1.0, 1.0, 1.0, 0.9375, 0.9565217391304348, 0.8823529411764706, 1.0, 0.9, 0.9130434782608695, 1.0, 1.0, 1.0, 1.0, 0.9, 1.0, 1.0, 1.0, 0.96, 1.0, 1.0, 1.0, 1.0, 0.9444444444444444, 0.8888888888888888, 1.0, 0.9615384615384616, 1.0, 1.0, 0.9090909090909091, 1.0, 1.0, 1.0, 0.7857142857142857, 0.8421052631578947, 1.0, 0.9166666666666666, 1.0, 1.0, 1.0, 1.0, 0.9285714285714286, 0.9615384615384616, 1.0, 0.9523809523809523, 0.9230769230769231, 0.9130434782608695, 1.0, 1.0, 1.0, 0.8125, 1.0, 0.8823529411764706, 1.0, 1.0, 1.0, 0.8823529411764706, 1.0, 1.0, 1.0, 0.9090909090909091, 0.9230769230769231, 1.0, 1.0, 1.0, 1.0, 0.8888888888888888, 0.8461538461538461, 0.8333333333333334, 1.0, 0.88, 0.9230769230769231, 0.9375, 0.9166666666666666, 0.9285714285714286, 0.9, 1.0, 0.8846153846153846, 0.9166666666666666, 0.8666666666666667, 1.0, 0.9166666666666666, 1.0, 1.0, 1.0, 0.9545454545454546, 1.0, 0.9333333333333333, 1.0, 0.9285714285714286, 0.85, 1.0, 1.0, 0.8846153846153846, 1.0, 1.0, 0.9285714285714286, 1.0, 1.0, 1.0, 1.0, 0.9473684210526315, 1.0, 0.8666666666666667, 0.8666666666666667, 0.9615384615384616, 0.8666666666666667, 1.0, 1.0, 1.0, 1.0, 1.0, 0.8666666666666667, 1.0, 0.9230769230769231, 0.9615384615384616, 1.0, 0.9, 1.0, 1.0, 1.0, 0.8333333333333334, 0.9333333333333333, 1.0, 1.0, 0.9090909090909091, 1.0, 1.0, 0.9615384615384616, 1.0, 1.0, 1.0, 0.9333333333333333, 1.0, 1.0, 0.8125, 1.0, 1.0, 1.0, 0.8125, 1.0, 0.9285714285714286, 1.0, 1.0, 1.0, 0.9375, 1.0, 0.9090909090909091, 1.0, 0.9230769230769231, 0.9, 0.9615384615384616, 0.8461538461538461, 0.9285714285714286, 1.0, 1.0, 0.8666666666666667, 1.0, 1.0, 1.0, 1.0, 1.0, 0.8888888888888888, 0.88, 0.896551724137931, 1.0, 1.0, 1.0, 0.9047619047619048, 0.9166666666666666, 0.8823529411764706, 1.0, 1.0, 0.8181818181818182, 0.9285714285714286, 0.9166666666666666, 1.0, 1.0, 0.8888888888888888, 1.0, 1.0, 0.8888888888888888, 1.0, 0.92, 0.8333333333333334, 0.8823529411764706, 1.0, 0.9375, 0.8947368421052632, 0.875, 1.0, 0.9090909090909091, 0.9375, 1.0, 0.9565217391304348, 1.0, 1.0, 1.0, 0.8333333333333334, 0.9285714285714286, 1.0, 1.0, 0.9333333333333333, 1.0, 0.8888888888888888, 1.0, 0.9444444444444444, 0.8947368421052632, 1.0, 1.0, 0.8571428571428571, 0.9, 1.0, 1.0, 1.0, 0.8666666666666667, 1.0, 0.8666666666666667, 1.0, 0.9615384615384616, 0.875, 1.0, 0.9285714285714286, 0.8947368421052632, 0.9473684210526315, 1.0, 0.9473684210526315, 1.0, 1.0, 0.8461538461538461, 0.8571428571428571, 1.0, 0.9615384615384616, 0.8888888888888888, 1.0, 1.0, 1.0, 0.9230769230769231, 1.0, 1.0, 1.0, 1.0, 0.8666666666666667, 1.0, 1.0, 0.9411764705882353, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9285714285714286, 0.9285714285714286, 0.9545454545454546, 1.0, 1.0, 1.0, 0.9333333333333333, 0.9411764705882353, 0.8571428571428571, 1.0, 0.9166666666666666, 0.9166666666666666, 1.0, 0.9615384615384616, 1.0, 0.9166666666666666, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.9615384615384616, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.8823529411764706, 0.8888888888888888, 1.0, 1.0, 0.875, 1.0, 0.9375, 0.88, 1.0, 1.0, 1.0, 1.0, 1.0, 0.92, 0.9090909090909091, 1.0, 0.9615384615384616, 1.0, 
1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.8571428571428571, 1.0, 0.9, 0.9411764705882353, 0.9285714285714286, 1.0, 0.8666666666666667, 0.8947368421052632, 1.0, 0.9615384615384616, 0.9375, 1.0, 1.0, 1.0, 1.0, 0.9090909090909091, 1.0, 0.9583333333333334, 0.8928571428571429, 1.0, 1.0, 1.0, 0.8333333333333334] |
### Framework versions
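For context, `adapter_model.bin` is the only weight file this commit touches, which is the artifact the PEFT library writes for parameter-efficient adapters. Below is a minimal, hypothetical loading sketch under that assumption; the adapter repository id is a placeholder and the input text is arbitrary, since the card does not state the task or dataset.

```python
# Sketch: loading the adapter on top of the base checkpoint (assumes a PEFT adapter).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import PeftModel

BASE_ID = "google/t5-v1_1-large"
ADAPTER_REPO = "user/adapter-repo"  # placeholder for this repository's id

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
base_model = AutoModelForSeq2SeqLM.from_pretrained(BASE_ID)

# Attach the adapter weights (adapter_model.bin) on top of the frozen base model.
model = PeftModel.from_pretrained(base_model, ADAPTER_REPO)
model.eval()

inputs = tokenizer("example input text", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```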

adapter_model.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:66aca39e272f5850d22712f69ca086d0db34ebd55e9b8118c23c57f3ce1eb060
 size 9543690

training_args.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:c0cc9808c03a1ed4d621aee7ff9988cc330b900055adaf3521923d7e529486b9
 size 6136
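Both `.bin` entries above are Git LFS pointer files rather than the binaries themselves: `oid sha256:` records the checksum of the stored object and `size` its length in bytes. A small sketch, assuming the real files have been downloaded locally, of checking them against the values recorded in this commit:

```python
# Verify a downloaded LFS object against the sha256 and size from its pointer file.
import hashlib
import os

def verify_lfs_object(path: str, expected_sha256: str, expected_size: int) -> bool:
    """Return True if the local file matches the pointer's oid and size."""
    if os.path.getsize(path) != expected_size:
        return False
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256

# Values taken from the new pointer files in this commit.
print(verify_lfs_object(
    "adapter_model.bin",
    "66aca39e272f5850d22712f69ca086d0db34ebd55e9b8118c23c57f3ce1eb060",
    9543690,
))
print(verify_lfs_object(
    "training_args.bin",
    "c0cc9808c03a1ed4d621aee7ff9988cc330b900055adaf3521923d7e529486b9",
    6136,
))
```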