Modalities: Text
Formats: json
Languages: Chinese
Libraries: Datasets, pandas
xywang1 committed adb3857 (1 parent: 93f4d2d)

Upload data
Files changed (8):
  1. .gitattributes +1 -0
  2. LICENSE +60 -0
  3. README.md +98 -0
  4. dev.txt +272 -0
  5. dialog_release.json +3 -0
  6. document_url_release.json +3 -0
  7. test.txt +272 -0
  8. train.txt +0 -0
.gitattributes CHANGED
@@ -56,3 +56,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  # Video files - compressed
  *.mp4 filter=lfs diff=lfs merge=lfs -text
  *.webm filter=lfs diff=lfs merge=lfs -text
+ *.json filter=lfs diff=lfs merge=lfs -text
LICENSE CHANGED
@@ -0,0 +1,60 @@
+ Tencent AI Lab NaturalConv Dataset Terms and Conditions
+
+ Thank you for your interest in the Tencent AI Lab NaturalConv Dataset.
+
+ PLEASE READ THE FOLLOWING TERMS CAREFULLY:
+
+ BY DOWNLOADING OR OTHERWISE ACCESSING OR USING THE TENCENT AI LAB NATURALCONV
+ DATASET (THE “DATASET”), YOU AGREE THAT YOU HAVE READ AND UNDERSTOOD, AND, AS A
+ CONDITION TO YOUR USE OF THE DATASET, YOU AGREE TO BE BOUND BY, THE FOLLOWING
+ TERMS AND CONDITIONS, INCLUDING TENCENT’S PRIVACY POLICY [1] AND THE TENCENT SERVICE
+ AGREEMENT [2] (TOGETHER, THESE “TERMS”). THE PRIVACY POLICY AND THE SERVICE
+ AGREEMENT ARE INCORPORATED BY THIS REFERENCE INTO, AND MADE A PART OF, THESE
+ TERMS. IF YOU ARE NOT ELIGIBLE, OR DO NOT AGREE TO THE TERMS, THEN YOU DO NOT HAVE
+ OUR PERMISSION TO USE THE DATASET. YOUR USE OF THE DATASET, AND TENCENT’S
+ PROVISION OF THE DATASET TO YOU, CONSTITUTES AN AGREEMENT BY TENCENT AND BY YOU
+ TO BE BOUND BY THESE TERMS.
+
+ 1. Dataset Contents. The Dataset consists of annotated dialogues (each, a “Sequence”)
+ and hyperlinks to news articles (each, an “Article”). All references to the Dataset also
+ refer to each of the Sequences and Articles.
+
+ 2. Ownership. The Sequences were created and annotated by or on behalf of Tencent
+ International Service Pte. Ltd. or its affiliates (“Tencent”). The Dataset is the property of
+ Tencent and is protected by intellectual property and other laws. Tencent reserves all
+ rights to the Dataset not granted expressly in these Terms.
+
+ 3. Non-Commercial License Grant. Subject to your complete and ongoing compliance with
+ these Terms, Tencent grants you, solely for your personal, non-commercial use, a
+ limited, non-exclusive, non-transferable, non-sublicensable, revocable license to
+ download, reproduce, and make derivative works of the Dataset, solely for the following
+ permitted uses:
+ a. Scientific research, development, and testing; and
+ b. Publication in academic papers and display at technology research and
+ development events.
+
+ 4. License Restrictions. Except as expressly permitted in these Terms, you may not:
+ a. Reproduce, distribute, publicly display, or publicly perform, or otherwise use
+ any portion of the Dataset or any derivative work of the Dataset;
+ b. Download the Dataset or make the Dataset or any portion of the Dataset
+ available from any source other than Tencent;
+ c. Resell or distribute any portion of the Dataset or any derivative work of the
+ Dataset;
+ d. Use any portion of the Dataset or any derivative work of the Dataset in any
+ publication, product brochure, or for other advertising or marketing purpose;
+ e. Use any portion of the Dataset or any derivative work of the Dataset in any
+ other written work;
+ f. Use any portion of the Dataset if you are prohibited under applicable law
+ from doing so, or use any portion of the Dataset in violation of applicable
+ law;
+ g. Attempt to do any of the acts described above in (a) through (f) or assist or
+ permit any person in engaging in any of the acts described above in (a)
+ through (f).
+
+ 5. Disclaimers. The Dataset is provided free of charge. Accordingly, Tencent and its
+ licensors provide the Dataset AS-IS, without warranty of any kind, express or implied.
+ The views and opinions expressed in the Dataset (including the content of the Articles)
+ do not necessarily reflect those of Tencent.
+
+ [1] https://www.tencent.com/en-us/privacy-policy.html
+ [2] https://www.tencent.com/en-us/service-agreement.html
README.md CHANGED
@@ -2,4 +2,102 @@
  license: other
  license_name: tencent-ai-lab-naturalconv-dataset-terms-and-conditions
  license_link: LICENSE
+ task_categories:
+ - text-generation
+ language:
+ - zh
+ tags:
+ - dialogue
+ - multi-turn
+ - topic-driven
+ - document
+ - news
+ - conversation
+ size_categories:
+ - 10K<n<100K
+ configs:
+ - config_name: default
+   data_files: dialog_release.json
  ---
+
+ # NaturalConv: A Chinese Dialogue Dataset Towards Multi-turn Topic-driven Conversation
+
+ ## Introduction
+
+ This dataset is described in the paper [NaturalConv: A Chinese Dialogue Dataset Towards Multi-turn Topic-driven Conversation](https://arxiv.org/abs/2103.02548). The entire dataset consists of 5 data files.
+
+ ### 1. dialog_release.json:
+
+ It is a JSON file containing a list of dictionaries.
+
+ After loading it in Python this way:
+
+ ```python
+ import json
+ import codecs
+ dialog_list = json.loads(codecs.open("dialog_release.json", "r", "utf-8").read())
+ ```
+
+ dialog_list is a list whose elements are dictionaries.
+
+ Each dictionary contains three keys: "dialog_id", "document_id", and "content":
+ "dialog_id" is a unique id for this dialogue.
+ "document_id" indicates which document this dialogue is grounded on.
+ "content" is a list of the utterances of the whole dialogue session.
+
+ Altogether there are 19,919 dialogues, with approximately 400K dialogue utterances.
+
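+ As a quick sanity check, each record can be inspected like this (a minimal sketch, assuming dialog_list has been loaded as shown above):
+
+ ```python
+ example = dialog_list[0]
+ print(example["dialog_id"])      # unique id of this dialogue
+ print(example["document_id"])    # id of the document this dialogue is grounded on
+ print(len(example["content"]))   # number of utterances in this session
+ print(example["content"][0])     # first utterance (Chinese text)
+ ```
+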
+ ### 2. document_url_release.json:
+
+ It is a JSON file containing a list of dictionaries.
+
+ After loading it in Python this way:
+
+ ```python
+ import json
+ import codecs
+ document_list = json.loads(codecs.open("document_url_release.json", "r", "utf-8").read())
+ ```
+
+ document_list is a list whose elements are dictionaries.
+
+ Each dictionary contains three keys: "document_id", "topic", and "url":
+ "document_id" is a unique id for this document.
+ "topic" indicates which topic this document comes from.
+ "url" is the URL of the original document.
+
+ Altogether there are 6,500 documents.
+
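+ Because each dialogue references its grounding document via "document_id", the two files can be joined with a simple lookup (a minimal sketch, assuming dialog_list and document_list have been loaded as above and that the two id fields use the same representation):
+
+ ```python
+ # Index documents by id, then fetch the grounding document of one dialogue.
+ doc_by_id = {doc["document_id"]: doc for doc in document_list}
+ dialog = dialog_list[0]
+ doc = doc_by_id[dialog["document_id"]]
+ print(doc["topic"], doc["url"])
+ ```
+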
+ ### 3, 4, and 5. train.txt, dev.txt, and test.txt:
+
+ Each file contains the "dialog_id" values for the train, dev, and test splits, respectively.
+
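+ One way to materialize the splits is to filter dialog_list by these id files (a minimal sketch, assuming the three txt files and dialog_release.json sit in the working directory):
+
+ ```python
+ def read_ids(path):
+     # One dialog_id per line, e.g. "2887_0".
+     with open(path, encoding="utf-8") as f:
+         return {line.strip() for line in f if line.strip()}
+
+ dev_ids = read_ids("dev.txt")
+ dev_dialogs = [d for d in dialog_list if d["dialog_id"] in dev_ids]
+ print(len(dev_dialogs))
+ ```
+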
+ ## Document Downloading
+
+ For research purposes only, you can refer to the code shared in this [repository](https://github.com/naturalconv/NaturalConvDataSet) to download the document texts through the released URLs in the document_url_release.json file.
+
+
+ ## Citation
+
+ Please kindly cite our paper if you find this dataset useful:
+
+ ```
+ @inproceedings{aaai-2021-naturalconv,
+   title={NaturalConv: A Chinese Dialogue Dataset Towards Multi-turn Topic-driven Conversation},
+   author={Wang, Xiaoyang and Li, Chen and Zhao, Jianqiao and Yu, Dong},
+   booktitle={Proceedings of the 35th AAAI Conference on Artificial Intelligence (AAAI-21)},
+   year={2021}
+ }
+ ```
+
+ ## License
+
+ The dataset is released for non-commercial usage only. By downloading, you agree to the terms and conditions in our [LICENSE](https://huggingface.co/datasets/xywang1/NaturalConv/blob/main/LICENSE). For authorization of commercial usage, please contact [email protected] for details.
+
+
+ ## Disclaimers
+
+ The dataset is provided AS-IS, without warranty of any kind, express or implied. The views and opinions expressed in the dataset (including the documents and the dialogues) do not necessarily reflect those of Tencent or the authors of the above paper.
dev.txt ADDED
@@ -0,0 +1,272 @@
+ 2887_0
+ 2887_1
+ 2887_2
+ 2887_3
+ 3436_0
+ 3436_1
+ 3436_2
+ 3436_3
+ 2572_0
+ 2572_1
+ 2572_2
+ 2572_3
+ 719_0
+ 719_1
+ 719_2
+ 719_3
+ 3238_0
+ 3238_1
+ 3238_2
+ 3238_3
+ 3324_0
+ 3324_1
+ 3324_2
+ 3324_3
+ 1315_0
+ 1315_1
+ 1315_2
+ 1315_3
+ 2357_0
+ 2357_1
+ 2357_2
+ 2357_3
+ 410_0
+ 410_1
+ 410_2
+ 410_3
+ 2628_0
+ 2628_1
+ 2628_2
+ 2628_3
+ 1309_0
+ 1309_1
+ 1309_2
+ 1309_3
+ 288_0
+ 288_1
+ 288_2
+ 288_3
+ 178_0
+ 178_1
+ 178_2
+ 178_3
+ 3305_0
+ 3305_1
+ 3305_2
+ 3305_3
+ 831_0
+ 831_1
+ 831_2
+ 831_3
+ 3468_0
+ 3468_1
+ 3468_2
+ 3468_3
+ 2714_0
+ 2714_1
+ 2714_2
+ 2714_3
+ 3224_0
+ 3224_1
+ 3224_2
+ 3224_3
+ 236_0
+ 236_1
+ 236_2
+ 236_3
+ 2704_0
+ 2704_1
+ 2704_2
+ 2704_3
+ 1407_0
+ 1407_1
+ 1407_2
+ 1407_3
+ 278_0
+ 278_1
+ 278_2
+ 278_3
+ 2313_0
+ 2313_1
+ 2313_2
+ 2313_3
+ 2755_0
+ 2755_1
+ 2755_2
+ 2755_3
+ 522_0
+ 522_1
+ 522_2
+ 522_3
+ 1540_0
+ 1540_1
+ 1540_2
+ 1540_3
+ 1590_0
+ 1590_1
+ 1590_2
+ 1590_3
+ 2610_0
+ 2610_1
+ 2610_2
+ 2610_3
+ 2820_0
+ 2820_1
+ 2820_2
+ 2820_3
+ 1917_0
+ 1917_1
+ 1917_2
+ 1917_3
+ 2418_0
+ 2418_1
+ 2418_2
+ 2418_3
+ 3100_0
+ 3100_1
+ 3100_2
+ 3100_3
+ 3326_0
+ 3326_1
+ 3326_2
+ 3326_3
+ 430_0
+ 430_1
+ 430_2
+ 430_3
+ 2608_0
+ 2608_1
+ 2608_2
+ 2608_3
+ 1398_0
+ 1398_1
+ 1398_2
+ 1398_3
+ 2617_0
+ 2617_1
+ 2617_2
+ 2617_3
+ 3487_0
+ 3487_1
+ 3487_2
+ 3487_3
+ 2168_0
+ 2168_1
+ 2168_2
+ 2168_3
+ 445_0
+ 445_1
+ 445_2
+ 445_3
+ 1585_0
+ 1585_1
+ 1585_2
+ 1585_3
+ 1783_0
+ 1783_1
+ 1783_2
+ 1783_3
+ 3133_0
+ 3133_1
+ 3133_2
+ 3133_3
+ 1474_0
+ 1474_1
+ 1474_2
+ 1474_3
+ 1941_0
+ 1941_1
+ 1941_2
+ 1941_3
+ 1333_0
+ 1333_1
+ 1333_2
+ 1333_3
+ 929_0
+ 929_1
+ 929_2
+ 929_3
+ 545_0
+ 545_1
+ 545_2
+ 545_3
+ 2024_0
+ 2024_1
+ 2024_2
+ 2024_3
+ 3221_0
+ 3221_1
+ 3221_2
+ 3221_3
+ 3488_0
+ 3488_1
+ 3488_2
+ 3488_3
+ 2155_0
+ 2155_1
+ 2155_2
+ 2155_3
+ 94_0
+ 94_1
+ 94_2
+ 94_3
+ 2379_0
+ 2379_1
+ 2379_2
+ 2379_3
+ 3372_0
+ 3372_1
+ 3372_2
+ 3372_3
+ 1153_0
+ 1153_1
+ 1153_2
+ 1153_3
+ 469_0
+ 469_1
+ 469_2
+ 469_3
+ 2991_0
+ 2991_1
+ 2991_2
+ 2991_3
+ 492_0
+ 492_1
+ 492_2
+ 492_3
+ 3134_0
+ 3134_1
+ 3134_2
+ 3134_3
+ 1835_0
+ 1835_1
+ 1835_2
+ 1835_3
+ 1516_0
+ 1516_1
+ 1516_2
+ 1516_3
+ 2067_0
+ 2067_1
+ 2067_2
+ 2067_3
+ 1971_0
+ 1971_1
+ 1971_2
+ 1971_3
+ 2043_0
+ 2043_1
+ 2043_2
+ 2043_3
+ 100_0
+ 100_1
+ 100_2
+ 100_3
+ 1552_0
+ 1552_1
+ 1552_2
+ 1552_3
+ 265_0
+ 265_1
+ 265_2
+ 265_3
dialog_release.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0aed101b0f295f87da8d27c3fb4eb62955488b7116c9a9c63c929f65656c195a
+ size 51726524
document_url_release.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6ba5ded35e15183a339ce806a10d2d019b3aec641a812fae034810155dd0e4f4
+ size 739488
test.txt ADDED
@@ -0,0 +1,272 @@
+ 2637_0
+ 2637_1
+ 2637_2
+ 2637_3
+ 1974_0
+ 1974_1
+ 1974_2
+ 1974_3
+ 570_0
+ 570_1
+ 570_2
+ 570_3
+ 1458_0
+ 1458_1
+ 1458_2
+ 1458_3
+ 2434_0
+ 2434_1
+ 2434_2
+ 2434_3
+ 1147_0
+ 1147_1
+ 1147_2
+ 1147_3
+ 2231_0
+ 2231_1
+ 2231_2
+ 2231_3
+ 110_0
+ 110_1
+ 110_2
+ 110_3
+ 1098_0
+ 1098_1
+ 1098_2
+ 1098_3
+ 2676_0
+ 2676_1
+ 2676_2
+ 2676_3
+ 1721_0
+ 1721_1
+ 1721_2
+ 1721_3
+ 1085_0
+ 1085_1
+ 1085_2
+ 1085_3
+ 551_0
+ 551_1
+ 551_2
+ 551_3
+ 1633_0
+ 1633_1
+ 1633_2
+ 1633_3
+ 2998_0
+ 2998_1
+ 2998_2
+ 2998_3
+ 2423_0
+ 2423_1
+ 2423_2
+ 2423_3
+ 426_0
+ 426_1
+ 426_2
+ 426_3
+ 852_0
+ 852_1
+ 852_2
+ 852_3
+ 285_0
+ 285_1
+ 285_2
+ 285_3
+ 1388_0
+ 1388_1
+ 1388_2
+ 1388_3
+ 3043_0
+ 3043_1
+ 3043_2
+ 3043_3
+ 3211_0
+ 3211_1
+ 3211_2
+ 3211_3
+ 841_0
+ 841_1
+ 841_2
+ 841_3
+ 3289_0
+ 3289_1
+ 3289_2
+ 3289_3
+ 2648_0
+ 2648_1
+ 2648_2
+ 2648_3
+ 1250_0
+ 1250_1
+ 1250_2
+ 1250_3
+ 2490_0
+ 2490_1
+ 2490_2
+ 2490_3
+ 2776_0
+ 2776_1
+ 2776_2
+ 2776_3
+ 2224_0
+ 2224_1
+ 2224_2
+ 2224_3
+ 1890_0
+ 1890_1
+ 1890_2
+ 1890_3
+ 364_0
+ 364_1
+ 364_2
+ 364_3
+ 1065_0
+ 1065_1
+ 1065_2
+ 1065_3
+ 2424_0
+ 2424_1
+ 2424_2
+ 2424_3
+ 3407_0
+ 3407_1
+ 3407_2
+ 3407_3
+ 745_0
+ 745_1
+ 745_2
+ 745_3
+ 1563_0
+ 1563_1
+ 1563_2
+ 1563_3
+ 678_0
+ 678_1
+ 678_2
+ 678_3
+ 2766_0
+ 2766_1
+ 2766_2
+ 2766_3
+ 3410_0
+ 3410_1
+ 3410_2
+ 3410_3
+ 502_0
+ 502_1
+ 502_2
+ 502_3
+ 3270_0
+ 3270_1
+ 3270_2
+ 3270_3
+ 789_0
+ 789_1
+ 789_2
+ 789_3
+ 1548_0
+ 1548_1
+ 1548_2
+ 1548_3
+ 3489_0
+ 3489_1
+ 3489_2
+ 3489_3
+ 947_0
+ 947_1
+ 947_2
+ 947_3
+ 1448_0
+ 1448_1
+ 1448_2
+ 1448_3
+ 782_0
+ 782_1
+ 782_2
+ 782_3
+ 1316_0
+ 1316_1
+ 1316_2
+ 1316_3
+ 1763_0
+ 1763_1
+ 1763_2
+ 1763_3
+ 3481_0
+ 3481_1
+ 3481_2
+ 3481_3
+ 3376_0
+ 3376_1
+ 3376_2
+ 3376_3
+ 628_0
+ 628_1
+ 628_2
+ 628_3
+ 1264_0
+ 1264_1
+ 1264_2
+ 1264_3
+ 1429_0
+ 1429_1
+ 1429_2
+ 1429_3
+ 2512_0
+ 2512_1
+ 2512_2
+ 2512_3
+ 3179_0
+ 3179_1
+ 3179_2
+ 3179_3
+ 2931_0
+ 2931_1
+ 2931_2
+ 2931_3
+ 921_0
+ 921_1
+ 921_2
+ 921_3
+ 2864_0
+ 2864_1
+ 2864_2
+ 2864_3
+ 891_0
+ 891_1
+ 891_2
+ 891_3
+ 1209_0
+ 1209_1
+ 1209_2
+ 1209_3
+ 2576_0
+ 2576_1
+ 2576_2
+ 2576_3
+ 564_0
+ 564_1
+ 564_2
+ 564_3
+ 2025_0
+ 2025_1
+ 2025_2
+ 2025_3
+ 1660_0
+ 1660_1
+ 1660_2
+ 1660_3
+ 1715_0
+ 1715_1
+ 1715_2
+ 1715_3
+ 417_0
+ 417_1
+ 417_2
+ 417_3
+ 319_0
+ 319_1
+ 319_2
+ 319_3
train.txt ADDED
The diff for this file is too large to render. See raw diff