davidmasip committed
Commit · 00dfba4
1 Parent(s): 1629d8a
Reorg
- model/config.json → config.json +0 -0
- model/pytorch_model.bin → pytorch_model.bin +0 -0
- model/special_tokens_map.json → special_tokens_map.json +0 -0
- model/tokenizer.json → tokenizer.json +0 -0
- tokenizer/special_tokens_map.json +0 -1
- tokenizer/tokenizer.json +0 -0
- tokenizer/tokenizer_config.json +0 -1
- model/tokenizer_config.json → tokenizer_config.json +0 -0
- model/training_args.bin → training_args.bin +0 -0
model/config.json → config.json
RENAMED
File without changes
model/pytorch_model.bin → pytorch_model.bin
RENAMED
File without changes
model/special_tokens_map.json → special_tokens_map.json
RENAMED
File without changes
model/tokenizer.json → tokenizer.json
RENAMED
File without changes
tokenizer/special_tokens_map.json
DELETED
@@ -1 +0,0 @@
- {"bos_token": "<s>", "eos_token": "</s>", "unk_token": "<unk>", "sep_token": "</s>", "pad_token": "<pad>", "cls_token": "<s>", "mask_token": "<mask>"}
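The deleted map is plain JSON pairing each special-token role with its literal string in a RoBERTa-style vocabulary; note that <s> serves as both bos_token and cls_token, and </s> as both eos_token and sep_token. A minimal sketch of reading those values back, using only the content shown in the hunk above:

```python
import json

# Content of the deleted tokenizer/special_tokens_map.json, copied from the hunk above.
special_tokens = json.loads(
    '{"bos_token": "<s>", "eos_token": "</s>", "unk_token": "<unk>", '
    '"sep_token": "</s>", "pad_token": "<pad>", "cls_token": "<s>", "mask_token": "<mask>"}'
)

# <s> doubles as the classification token and </s> as the separator.
print(special_tokens["cls_token"], special_tokens["sep_token"])  # -> <s> </s>
```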
tokenizer/tokenizer.json
DELETED
The diff for this file is too large to render.
tokenizer/tokenizer_config.json
DELETED
@@ -1 +0,0 @@
- {"bos_token": "<s>", "eos_token": "</s>", "sep_token": "</s>", "cls_token": "<s>", "unk_token": "<unk>", "pad_token": "<pad>", "mask_token": "<mask>", "special_tokens_map_file": "models/twerto-base-uncased/special_tokens_map.json", "name_or_path": "pysentimiento/robertuito-sentiment-analysis", "tokenizer_class": "PreTrainedTokenizerFast"}
model/tokenizer_config.json → tokenizer_config.json
RENAMED
File without changes
model/training_args.bin → training_args.bin
RENAMED
File without changes
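With this reorg, config.json, pytorch_model.bin and the tokenizer files sit at the repository root, which is the layout from_pretrained() expects, so the checkpoint can be loaded with a single call. A minimal sketch, assuming the repo hosts a fine-tuned sequence-classification model; the repo id below is a placeholder, not taken from this page:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder repo id (assumption); substitute the actual Hub repository.
repo_id = "davidmasip/example-repo"

# After the reorg, the files formerly under model/ and tokenizer/ are at the
# repo root, so both objects resolve without pointing at a subfolder.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("un ejemplo de texto", return_tensors="pt")
logits = model(**inputs).logits
print(logits)
```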