kdvisdjf rkeogkw committed
Commit 2980cdc
Parent(s): cc4f373
Update README.md

README.md CHANGED
```diff
@@ -21,28 +21,6 @@ widget:
       biomedical domains and news texts. We explore the role of transfer learning
       between different domains to improve the model performance on small text
       corpora.
-  - text: >-
-      <|KEYPHRASES|> Relevance has traditionally been linked with feature subset
-      selection, but formalization of this link has not been attempted. In this
-      paper, we propose two axioms for feature subset selection, sufficiency axiom
-      and necessity axiom, based on which this link is formalized: The expected
-      feature subset is the one which maximizes relevance. Finding the expected
-      feature subset turns out to be NP-hard. We then devise a heuristic algorithm
-      to find the expected subset which has a polynomial time complexity. The
-      experimental results show that the algorithm finds good enough subset of
-      features which, when presented to C4.5, results in better prediction
-      accuracy.
-  - text: >-
-      <|TITLE|> Relevance has traditionally been linked with feature subset
-      selection, but formalization of this link has not been attempted. In this
-      paper, we propose two axioms for feature subset selection, sufficiency axiom
-      and necessity axiom, based on which this link is formalized: The expected
-      feature subset is the one which maximizes relevance. Finding the expected
-      feature subset turns out to be NP-hard. We then devise a heuristic algorithm
-      to find the expected subset which has a polynomial time complexity. The
-      experimental results show that the algorithm finds good enough subset of
-      features which, when presented to C4.5, results in better prediction
-      accuracy.
 library_name: transformers
 pipeline_tag: text2text-generation
 ---
```
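The removed widget entries illustrate how the model was meant to be prompted: an abstract is prefixed with a control token (`<|KEYPHRASES|>` or `<|TITLE|>`) and fed to a `text2text-generation` pipeline. A minimal sketch of that prompt construction, with the pipeline call left as a commented placeholder since the diff does not name the checkpoint's repo id:

```python
def build_prompt(task_token: str, abstract: str) -> str:
    """Prefix an abstract with a task control token, as the widget examples did.

    task_token is one of the tokens seen in the diff: <|KEYPHRASES|> or <|TITLE|>.
    """
    return f"{task_token} {abstract}"

prompt = build_prompt(
    "<|KEYPHRASES|>",
    "Relevance has traditionally been linked with feature subset selection, "
    "but formalization of this link has not been attempted.",
)

# Assumed usage with the transformers library (repo id is a placeholder,
# not stated in the diff):
# from transformers import pipeline
# generator = pipeline("text2text-generation", model="<repo-id>")
# print(generator(prompt)[0]["generated_text"])
print(prompt)
```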