kdvisdjf rkeogkw committed on
Commit d140c47 · 1 Parent(s): 1ef3e9a

Update README.md

Files changed (1)
  1. README.md +40 -7
README.md CHANGED
@@ -4,12 +4,46 @@ datasets:
  - midas/inspec
  language:
  - en
-
  widget:
- - text: "<|KEYPHRASES|> In this paper, we investigate cross-domain limitations of keyphrase generation using the models for abstractive text summarization. We present an evaluation of BART fine-tuned for keyphrase generation across three types of texts, namely scientific texts from computer science and biomedical domains and news texts. We explore the role of transfer learning between different domains to improve the model performance on small text corpora."
- - text: "<|TITLE|> In this paper, we investigate cross-domain limitations of keyphrase generation using the models for abstractive text summarization. We present an evaluation of BART fine-tuned for keyphrase generation across three types of texts, namely scientific texts from computer science and biomedical domains and news texts. We explore the role of transfer learning between different domains to improve the model performance on small text corpora."
- - text: "<|KEYPHRASES|> Relevance has traditionally been linked with feature subset selection, but formalization of this link has not been attempted. In this paper, we propose two axioms for feature subset selection sufficiency axiom and necessity axiombased on which this link is formalized: The expected feature subset is the one which maximizes relevance. Finding the expected feature subset turns out to be NP-hard. We then devise a heuristic algorithm to find the expected subset which has a polynomial time complexity. The experimental results show that the algorithm finds good enough subset of features which, when presented to C4.5, results in better prediction accuracy."
- - text: "<|TITLE|> Relevance has traditionally been linked with feature subset selection, but formalization of this link has not been attempted. In this paper, we propose two axioms for feature subset selection sufficiency axiom and necessity axiombased on which this link is formalized: The expected feature subset is the one which maximizes relevance. Finding the expected feature subset turns out to be NP-hard. We then devise a heuristic algorithm to find the expected subset which has a polynomial time complexity. The experimental results show that the algorithm finds good enough subset of features which, when presented to C4.5, results in better prediction accuracy."
+ - text: >-
+     <|KEYPHRASES|> In this paper, we investigate cross-domain limitations of
+     keyphrase generation using the models for abstractive text summarization. We
+     present an evaluation of BART fine-tuned for keyphrase generation across
+     three types of texts, namely scientific texts from computer science and
+     biomedical domains and news texts. We explore the role of transfer learning
+     between different domains to improve the model performance on small text
+     corpora.
+ - text: >-
+     <|TITLE|> In this paper, we investigate cross-domain limitations of
+     keyphrase generation using the models for abstractive text summarization. We
+     present an evaluation of BART fine-tuned for keyphrase generation across
+     three types of texts, namely scientific texts from computer science and
+     biomedical domains and news texts. We explore the role of transfer learning
+     between different domains to improve the model performance on small text
+     corpora.
+ - text: >-
+     <|KEYPHRASES|> Relevance has traditionally been linked with feature subset
+     selection, but formalization of this link has not been attempted. In this
+     paper, we propose two axioms for feature subset selection sufficiency axiom
+     and necessity axiombased on which this link is formalized: The expected
+     feature subset is the one which maximizes relevance. Finding the expected
+     feature subset turns out to be NP-hard. We then devise a heuristic algorithm
+     to find the expected subset which has a polynomial time complexity. The
+     experimental results show that the algorithm finds good enough subset of
+     features which, when presented to C4.5, results in better prediction
+     accuracy.
+ - text: >-
+     <|TITLE|> Relevance has traditionally been linked with feature subset
+     selection, but formalization of this link has not been attempted. In this
+     paper, we propose two axioms for feature subset selection sufficiency axiom
+     and necessity axiombased on which this link is formalized: The expected
+     feature subset is the one which maximizes relevance. Finding the expected
+     feature subset turns out to be NP-hard. We then devise a heuristic algorithm
+     to find the expected subset which has a polynomial time complexity. The
+     experimental results show that the algorithm finds good enough subset of
+     features which, when presented to C4.5, results in better prediction
+     accuracy.
+ library_name: transformers
  ---

  # BART fine-tuned for keyphrase generation
@@ -66,5 +100,4 @@ The following hyperparameters were used during training:
  journal={arXiv preprint arXiv:2209.03791},
  year={2022}
  }
- ```
-
+ ```
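
For reference, a minimal usage sketch of the prompt format shown in the widget examples, assuming the card's checkpoint is a standard BART seq2seq model loadable through the transformers API. The repo id used below is a hypothetical placeholder, not a name taken from this commit.

```python
# Minimal sketch, not part of the commit: assumes a standard BART seq2seq
# checkpoint; the repo id below is a hypothetical placeholder.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "username/bart-keyphrase-generation"  # placeholder, substitute the real repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

abstract = (
    "In this paper, we investigate cross-domain limitations of keyphrase "
    "generation using the models for abstractive text summarization."
)

# The widget examples prepend a task prefix to the abstract:
# <|KEYPHRASES|> to generate keyphrases, <|TITLE|> to generate a title.
inputs = tokenizer("<|KEYPHRASES|> " + abstract, return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```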