José Ángel González committed
Commit: 0121c8c
Parent(s): d9bae58
Update README.md
README.md CHANGED
@@ -10,4 +10,4 @@ widget:
News Abstractive Summarization for Spanish (NASES) is a Transformer encoder-decoder model, with the same hyper-parameters as BART, for summarization of Spanish news articles. It is pre-trained on a combination of several self-supervised tasks that help increase the abstractivity of the generated summaries. Four pre-training tasks were combined: sentence permutation, text infilling, Gap Sentence Generation, and Next Segment Generation. Spanish newspapers and Wikipedia articles in Spanish were used to pre-train the model (21 GB of raw text, 8.5 million documents).
- NASES is finetuned for the summarization task on 1.802.919
+ NASES is finetuned for the summarization task on 1.802.919 (document, summary) pairs from the Dataset for Automatic summarization of Catalan and Spanish newspaper Articles (DACSA).
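
This revision of the model card does not include a usage snippet, so the following is only a minimal sketch of how a BART-style summarization checkpoint like this is typically loaded with the transformers library. The repository id and the generation parameters are placeholders, not taken from the source.

```python
# Minimal usage sketch (assumptions: the checkpoint is on the Hugging Face Hub and
# works with the standard seq2seq auto classes; the repo id below is a placeholder).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "user/NASES"  # placeholder, substitute the actual Hub repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

article = "Texto completo de una noticia en español..."
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=1024)

# Illustrative generation settings; beam search is a common default for news summarization.
summary_ids = model.generate(**inputs, num_beams=4, max_length=128, early_stopping=True)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```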