bert_trained / tokenizer.json
{
  "model_type": "bert",
  "do_lower_case": true,
  "strip_accents": false,
  "pad_token_id": 0,
  "wordpieces_prefix": "##"
}
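
For reference, a minimal sketch (not part of the repository) of how these settings map onto the transformers BertTokenizer constructor. The "vocab.txt" path is a placeholder assumption for the model's WordPiece vocabulary file; "##" is the standard WordPiece continuation prefix for BERT tokenizers, and "[PAD]" conventionally sits at index 0 of a BERT vocabulary, matching "pad_token_id": 0.

# Hypothetical usage sketch: wiring the config values above into a BertTokenizer.
# "vocab.txt" is an assumed placeholder; point it at the repository's actual vocabulary file.
from transformers import BertTokenizer

tokenizer = BertTokenizer(
    vocab_file="vocab.txt",   # assumed path to the WordPiece vocabulary
    do_lower_case=True,       # "do_lower_case": true
    strip_accents=False,      # "strip_accents": false
)

# "pad_token_id": 0 corresponds to "[PAD]" at index 0 in a standard BERT vocab,
# and "wordpieces_prefix": "##" is the usual continuation prefix for WordPiece subwords.
print(tokenizer.tokenize("An example sentence for WordPiece tokenization."))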