---
base_model: google/pegasus-large
tags:
- generated_from_trainer
metrics:
- rouge
- bleu
model-index:
- name: HealthSciencePegasusLargeModel
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# HealthSciencePegasusLargeModel

This model is a fine-tuned version of [google/pegasus-large](https://huggingface.co/google/pegasus-large) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 5.0998
- Rouge1: 51.1109
- Rouge2: 19.0065
- Rougel: 35.0665
- Rougelsum: 46.3738
- Bertscore Precision: 79.4711
- Bertscore Recall: 82.7557
- Bertscore F1: 81.0748
- Bleu: 0.1452
- Gen Len: 231.4938
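
For reference, a minimal inference sketch is shown below. The repo id is a placeholder (the actual Hub path of this checkpoint is not stated in this card), and the generation settings are assumptions rather than the settings used for evaluation:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical repo id -- substitute the actual Hub path of this checkpoint.
model_id = "your-org/HealthSciencePegasusLargeModel"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

article = "Replace this with the health-science text you want to summarize."
inputs = tokenizer(article, truncation=True, return_tensors="pt")

# Gen Len above averages ~231 tokens, so allow a comparable output budget;
# beam size is an assumption, not a documented evaluation setting.
summary_ids = model.generate(**inputs, max_new_tokens=256, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```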

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1
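
The list above corresponds roughly to the following `Seq2SeqTrainingArguments`. This is a hedged reconstruction, not the original training script; `output_dir` and anything not listed above are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="HealthSciencePegasusLargeModel",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=16,  # effective train batch size: 1 * 16 = 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=1,
    # Adam defaults already match the card: betas=(0.9, 0.999), epsilon=1e-08.
)
```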

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Bertscore Precision | Bertscore Recall | Bertscore F1 | Bleu   | Gen Len  |
|:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------------------:|:----------------:|:------------:|:------:|:--------:|
| 6.6062        | 0.0826 | 100  | 6.1946          | 41.0175 | 12.1872 | 26.8664 | 36.8972   | 76.5225             | 80.4867          | 78.4475      | 0.0925 | 231.4938 |
| 6.0566        | 0.1653 | 200  | 5.8019          | 45.7736 | 15.3675 | 30.7082 | 41.4411   | 77.7511             | 81.5573          | 79.6024      | 0.1196 | 231.4938 |
| 5.8921        | 0.2479 | 300  | 5.6555          | 45.6004 | 15.5854 | 31.2233 | 41.4395   | 77.7394             | 81.6428          | 79.6366      | 0.1203 | 231.4938 |
| 5.824         | 0.3305 | 400  | 5.5047          | 47.3353 | 17.0337 | 32.5302 | 42.994    | 78.2323             | 82.0751          | 80.1012      | 0.1318 | 231.4938 |
| 5.6546        | 0.4131 | 500  | 5.3968          | 48.8031 | 17.9059 | 33.4654 | 44.3006   | 78.5911             | 82.3105          | 80.4016      | 0.1377 | 231.4938 |
| 5.5794        | 0.4958 | 600  | 5.2980          | 49.3037 | 18.2072 | 33.8712 | 44.6912   | 78.6863             | 82.3772          | 80.4831      | 0.1396 | 231.4938 |
| 5.5792        | 0.5784 | 700  | 5.2361          | 49.4211 | 18.2373 | 34.1262 | 44.8449   | 78.7086             | 82.4391          | 80.5245      | 0.1401 | 231.4938 |
| 5.5137        | 0.6610 | 800  | 5.1859          | 49.9024 | 18.4281 | 34.402  | 45.3215   | 79.0156             | 82.5476          | 80.7374      | 0.1413 | 231.4938 |
| 5.3983        | 0.7436 | 900  | 5.1471          | 50.4151 | 18.6752 | 34.688  | 45.8432   | 79.2355             | 82.6237          | 80.8887      | 0.1427 | 231.4938 |
| 5.3874        | 0.8263 | 1000 | 5.1214          | 50.9831 | 18.9709 | 34.9595 | 46.2721   | 79.3533             | 82.7398          | 81.0059      | 0.1449 | 231.4938 |
| 5.33          | 0.9089 | 1100 | 5.1074          | 51.1059 | 19.0707 | 35.0338 | 46.2997   | 79.4385             | 82.7671          | 81.0633      | 0.1456 | 231.4938 |
| 5.4559        | 0.9915 | 1200 | 5.0998          | 51.1109 | 19.0065 | 35.0665 | 46.3738   | 79.4711             | 82.7557          | 81.0748      | 0.1452 | 231.4938 |
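
The metric families reported above (ROUGE, BLEU, BERTScore) can be computed with the `evaluate` library. The sketch below is an assumption about how such a computation might look, not the exact `compute_metrics` used during training; note that the card appears to report ROUGE and BERTScore scaled by 100, while BLEU is left as a fraction:

```python
import evaluate
import numpy as np

rouge = evaluate.load("rouge")
bleu = evaluate.load("bleu")
bertscore = evaluate.load("bertscore")

def compute_metrics(predictions, references):
    """Hedged sketch: computes the metric families reported above from
    already-decoded prediction/reference strings."""
    # evaluate returns ROUGE as fractions; the table reports them * 100.
    results = {k: v * 100 for k, v in
               rouge.compute(predictions=predictions, references=references).items()}
    results["bleu"] = bleu.compute(predictions=predictions,
                                   references=references)["bleu"]
    bs = bertscore.compute(predictions=predictions, references=references,
                           lang="en")
    results["bertscore_precision"] = float(np.mean(bs["precision"])) * 100
    results["bertscore_recall"] = float(np.mean(bs["recall"])) * 100
    results["bertscore_f1"] = float(np.mean(bs["f1"])) * 100
    return results
```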


### Framework versions

- Transformers 4.41.2
- Pytorch 2.1.2
- Datasets 2.2.1
- Tokenizers 0.19.1