Update README.md
---
tags:
- generated_from_keras_callback
model-index:
- name: pegasus_indonesian_base-finetune
  results: []
license: apache-2.0
datasets:
- csebuetnlp/xlsum
language:
- id
metrics:
- rouge
pipeline_tag: summarization
---

# pegasus_indonesian_base-finetune

GitHub: [PegasusAnthony](https://github.com/nicholaswilven/PEGASUSAnthony/tree/master)

This model is a fine-tuned version of [pegasus_indonesian_base-pretrained](https://huggingface.co/thonyyy/pegasus_indonesian_base-pretrained), trained on [Indosum](https://paperswithcode.com/dataset/indosum), [Liputan6](https://paperswithcode.com/dataset/liputan6) and [XLSum](https://huggingface.co/datasets/csebuetnlp/xlsum).

It achieves the following results on the evaluation set:
- Train Loss: 1.6196
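As a quick sanity check on the figure above — assuming the reported train loss is a mean per-token cross-entropy (the Keras default for language-model training, an assumption not stated on this card) — it corresponds to a token-level perplexity of exp(loss):

```python
import math

train_loss = 1.6196  # train loss reported above
# Perplexity is exp(cross-entropy); this is only meaningful if the loss
# is a mean per-token cross-entropy, which is an assumption here.
perplexity = math.exp(train_loss)
print(round(perplexity, 2))  # ≈ 5.05
```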

```python
# Load model and tokenizer
from transformers import TFPegasusForConditionalGeneration, PegasusTokenizerFast

model_name = "thonyyy/pegasus_indonesian_base-finetune"
model = TFPegasusForConditionalGeneration.from_pretrained(model_name)
tokenizer = PegasusTokenizerFast.from_pretrained(model_name)
```
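The card's metadata lists ROUGE as the evaluation metric. As a minimal, self-contained sketch — an illustration of the idea, not the official `rouge_score` implementation — ROUGE-1 F1 is the F-measure over unigram overlap between a reference and a candidate summary:

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: F-measure over unigram overlap (whitespace tokenization)."""
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    # Clipped overlap: each unigram counts at most as often as in either text
    overlap = sum((ref_counts & cand_counts).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)

print(round(rouge1_f1("the cat sat on the mat", "the cat on the mat"), 3))  # 0.909
```

Production evaluation should use a maintained package such as `rouge_score`, which also handles stemming and ROUGE-2/ROUGE-L.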
- Tokenizers 0.13.3

### Special Thanks

Research supported with Cloud TPUs from Google’s TPU Research Cloud (TRC)