bigscience-1.3B-de-tokenizer / special_tokens_map.json
{"bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "unk_token": "<|endoftext|>"}