BabyBabelLM (Multilingual BabyLM) with GPT-BERT Architecture
This repository contains checkpoints for the multilingual (all) variant of BabyBabelLM.
Files in this repository:

- `*_15_16.bin` – main model weights
- `*_15_16_ema.bin` – EMA-smoothed weights
- `*_15_16_state_dict.bin` – PyTorch state dict
- `pytorch_model.bin` – extracted EMA weights (for `AutoModel`)

Quickstart with the Transformers `Auto` classes:

```python
from transformers import AutoModel, AutoTokenizer

repo = "suchirsalhan/babybabellm-multi-all"

# Load the tokenizer and the extracted EMA weights (pytorch_model.bin).
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModel.from_pretrained(repo)

# Run a forward pass on a sample input.
inputs = tokenizer("Hello world!", return_tensors="pt")
outputs = model(**inputs)
```
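As a follow-up, here is a minimal sketch of turning the encoder output into a sentence embedding by mean-pooling, assuming the model returns a standard `last_hidden_state` tensor:

```python
import torch
from transformers import AutoModel, AutoTokenizer

repo = "suchirsalhan/babybabellm-multi-all"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModel.from_pretrained(repo)

inputs = tokenizer("Hello world!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the final hidden states into one sentence vector,
# ignoring padding positions via the attention mask.
hidden = outputs.last_hidden_state                     # (batch, seq, dim)
mask = inputs["attention_mask"].unsqueeze(-1).float()  # (batch, seq, 1)
embedding = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embedding.shape)  # (1, hidden_size)
```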
The `multi-all` suffix in the repository name indicates the language/config variant (here: multilingual, all languages).
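If you want to work with the raw checkpoints listed above rather than the extracted `pytorch_model.bin`, the sketch below downloads the EMA file and loads it into the model manually. The exact filename is an assumption (the repo uses a `*_15_16_ema.bin` pattern, so check the actual prefix against the repo contents), and `strict=False` tolerates key-name differences between the raw checkpoint and the `AutoModel` layout:

```python
import torch
from huggingface_hub import hf_hub_download
from transformers import AutoModel

repo = "suchirsalhan/babybabellm-multi-all"

# NOTE: the filename below is an assumption based on the *_15_16_ema.bin
# pattern listed above; adjust the prefix to match the repo contents.
ckpt_path = hf_hub_download(repo_id=repo, filename="model_15_16_ema.bin")

model = AutoModel.from_pretrained(repo)
state_dict = torch.load(ckpt_path, map_location="cpu")

# strict=False reports, rather than fails on, any mismatched keys.
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
```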