---
license: apache-2.0
base_model: google/t5-v1_1-large
tags:
- generated_from_trainer
model-index:
- name: SChem5Labels-google-t5-v1_1-large-inter_model-sorted-model_annots_str
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# SChem5Labels-google-t5-v1_1-large-inter_model-sorted-model_annots_str

This model is a fine-tuned version of [google/t5-v1_1-large](https://huggingface.co/google/t5-v1_1-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8994
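
A minimal inference sketch, assuming the checkpoint is published under a Hugging Face namespace (`your-org` below is a placeholder for the actual owner, and the prompt format used during fine-tuning is not documented in this card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical repo id: the owning namespace is not stated in this card.
model_id = "your-org/SChem5Labels-google-t5-v1_1-large-inter_model-sorted-model_annots_str"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder input: the exact input formatting expected by this
# fine-tune is not documented here.
inputs = tokenizer("example input text", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```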

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 200
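
For reference, a sketch of how the values above might be expressed as `Seq2SeqTrainingArguments`; only the hyperparameters listed here are taken from the run, while `output_dir`, the single-device batch-size assumption, and the per-epoch evaluation strategy are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./results",            # placeholder path
    learning_rate=1e-4,
    per_device_train_batch_size=128,   # assumes a single device, so this equals train_batch_size
    per_device_eval_batch_size=128,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    evaluation_strategy="epoch",       # assumption, consistent with the per-epoch results table below
)
```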

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 20.7228       | 1.0   | 25   | 24.7182         |
| 19.7481       | 2.0   | 50   | 23.4091         |
| 19.3221       | 3.0   | 75   | 20.7043         |
| 17.2262       | 4.0   | 100  | 15.0872         |
| 14.7272       | 5.0   | 125  | 10.1222         |
| 12.1186       | 6.0   | 150  | 9.0762          |
| 10.3305       | 7.0   | 175  | 8.7090          |
| 8.8199        | 8.0   | 200  | 8.4070          |
| 8.2346        | 9.0   | 225  | 8.2048          |
| 7.9113        | 10.0  | 250  | 8.1018          |
| 7.6524        | 11.0  | 275  | 8.0398          |
| 7.6476        | 12.0  | 300  | 7.9791          |
| 7.4487        | 13.0  | 325  | 7.8957          |
| 7.3635        | 14.0  | 350  | 7.7393          |
| 7.2677        | 15.0  | 375  | 7.4303          |
| 7.0316        | 16.0  | 400  | 7.1862          |
| 6.7999        | 17.0  | 425  | 7.0031          |
| 6.6811        | 18.0  | 450  | 6.8875          |
| 6.6207        | 19.0  | 475  | 6.8224          |
| 6.4587        | 20.0  | 500  | 6.7708          |
| 6.3888        | 21.0  | 525  | 6.7248          |
| 6.3971        | 22.0  | 550  | 6.6744          |
| 6.3969        | 23.0  | 575  | 6.3850          |
| 0.91          | 24.0  | 600  | 0.7331          |
| 0.7237        | 25.0  | 625  | 0.6588          |
| 0.6831        | 26.0  | 650  | 0.6276          |
| 0.6785        | 27.0  | 675  | 0.6266          |
| 0.6673        | 28.0  | 700  | 0.6269          |
| 0.6728        | 29.0  | 725  | 0.6230          |
| 0.6643        | 30.0  | 750  | 0.6204          |
| 0.662         | 31.0  | 775  | 0.6187          |
| 0.6664        | 32.0  | 800  | 0.6195          |
| 0.6568        | 33.0  | 825  | 0.6180          |
| 0.6453        | 34.0  | 850  | 0.6187          |
| 0.6619        | 35.0  | 875  | 0.6260          |
| 0.6539        | 36.0  | 900  | 0.6168          |
| 0.6468        | 37.0  | 925  | 0.6188          |
| 0.6567        | 38.0  | 950  | 0.6221          |
| 0.6521        | 39.0  | 975  | 0.6172          |
| 0.6403        | 40.0  | 1000 | 0.6141          |
| 0.6505        | 41.0  | 1025 | 0.6147          |
| 0.6419        | 42.0  | 1050 | 0.6174          |
| 0.6436        | 43.0  | 1075 | 0.6174          |
| 0.6381        | 44.0  | 1100 | 0.6130          |
| 0.6527        | 45.0  | 1125 | 0.6144          |
| 0.6489        | 46.0  | 1150 | 0.6129          |
| 0.6395        | 47.0  | 1175 | 0.6141          |
| 0.6483        | 48.0  | 1200 | 0.6174          |
| 0.6331        | 49.0  | 1225 | 0.6165          |
| 0.6454        | 50.0  | 1250 | 0.6141          |
| 0.6356        | 51.0  | 1275 | 0.6140          |


### Framework versions

- Transformers 4.34.0
- Pytorch 2.1.0+cu121
- Datasets 2.6.1
- Tokenizers 0.14.1