# SAM1 Hybrid Model

## Architecture
- Transformer + CNN + RNN
- Parameters: 253,748,736 (~253.7M); a quick verification sketch follows this list
- 24 layers × 768 hidden dim × 12 attention heads
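
The parameter count above can be double-checked after loading the checkpoint. This is a minimal sketch, assuming the model loads through `AutoModelForCausalLM`; `trust_remote_code=True` is included only on the assumption that the hybrid architecture ships custom modeling code.

```python
from transformers import AutoModelForCausalLM

# "path/to/model" is a placeholder for the actual checkpoint location.
# trust_remote_code=True is an assumption: a custom hybrid architecture
# often ships its own modeling code alongside the weights.
model = AutoModelForCausalLM.from_pretrained("path/to/model", trust_remote_code=True)

# Sum every parameter tensor; this should report roughly 253.7M.
total = sum(p.numel() for p in model.parameters())
print(f"Total parameters: {total:,} (~{total / 1e6:.1f}M)")
```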
## Usage
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model (replace "path/to/model" with the actual checkpoint path).
tokenizer = AutoTokenizer.from_pretrained("path/to/model")
model = AutoModelForCausalLM.from_pretrained("path/to/model")

# Chat-style prompt: the model answers as "Sam".
prompt = "User: Hello!\nSam:"
inputs = tokenizer(prompt, return_tensors="pt")

# Note: max_length counts the prompt tokens plus the generated tokens.
outputs = model.generate(**inputs, max_length=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
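
Greedy decoding (the default for `generate` without sampling flags) can sound repetitive for chat-style prompts. The variant below is a sketch using sampling; the temperature and top-p values are illustrative assumptions, not documented defaults for this model.

```python
# Sampling-based generation; reuses tokenizer, model, and inputs from above.
outputs = model.generate(
    **inputs,
    max_new_tokens=100,                   # cap on newly generated tokens only
    do_sample=True,                       # sample instead of greedy decoding
    temperature=0.8,                      # illustrative value, not a tuned default
    top_p=0.95,                           # illustrative value, not a tuned default
    pad_token_id=tokenizer.eos_token_id,  # silences the missing-pad-token warning
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```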