Mann-E_Dreams/tokenizer/added_tokens.json
{
"<|endoftext|>": 49407,
"<|startoftext|>": 49406
}
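For context, the two entries above are the CLIP tokenizer's special markers, <|startoftext|> (49406) and <|endoftext|> (49407). Below is a minimal sketch of reading this mapping locally; the relative path "tokenizer/added_tokens.json" is an assumption and should be adjusted to wherever the tokenizer directory is checked out.

import json

# Load the added-token mapping shipped alongside the tokenizer files.
# NOTE: the path below is assumed; point it at your local copy of the repo.
with open("tokenizer/added_tokens.json", "r", encoding="utf-8") as f:
    added_tokens = json.load(f)

# Print each special token with its ID, lowest ID first.
for token, token_id in sorted(added_tokens.items(), key=lambda kv: kv[1]):
    print(f"{token} -> {token_id}")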