nanogpt-enwik8-compressed-working / tokenizer_config.json
{
"tokenizer_class": "CharacterLevelTokenizer",
"vocab_size": 6060,
"model_max_length": 1024,
"clean_up_tokenization_spaces": false
}
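
A minimal sketch, assuming a stand-alone Python re-implementation, of how a character-level tokenizer might consume this config. The class name SimpleCharTokenizer and the inline corpus sample are hypothetical; the repo's actual CharacterLevelTokenizer class is custom code not included in this file, so this is only an illustration of the fields above, not the repo's implementation.

import json

# Sketch: a character-level tokenizer driven by tokenizer_config.json.
# Assumption: the real CharacterLevelTokenizer builds its vocabulary from
# the enwik8 character set; here a tiny sample string stands in for it.
class SimpleCharTokenizer:
    def __init__(self, config_path: str, corpus_sample: str):
        with open(config_path, "r", encoding="utf-8") as f:
            cfg = json.load(f)
        self.model_max_length = cfg["model_max_length"]
        chars = sorted(set(corpus_sample))
        self.stoi = {ch: i for i, ch in enumerate(chars)}
        self.itos = {i: ch for ch, i in self.stoi.items()}

    def encode(self, text: str) -> list[int]:
        # Map each known character to its id, truncating to model_max_length.
        ids = [self.stoi[ch] for ch in text if ch in self.stoi]
        return ids[: self.model_max_length]

    def decode(self, ids: list[int]) -> str:
        return "".join(self.itos[i] for i in ids)


if __name__ == "__main__":
    # Write the config shown above so the sketch runs without the repo files.
    with open("tokenizer_config.json", "w", encoding="utf-8") as f:
        json.dump(
            {
                "tokenizer_class": "CharacterLevelTokenizer",
                "vocab_size": 6060,
                "model_max_length": 1024,
                "clean_up_tokenization_spaces": False,
            },
            f,
        )
    tok = SimpleCharTokenizer("tokenizer_config.json", "hello world")
    ids = tok.encode("hello")
    print(ids, tok.decode(ids))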