BFloat16 is not supported on MPS
#25
opened by To-the-north-pole
Even though I tried the code below, I still get the error message.
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # forcing float16 instead of bfloat16
    device_map="auto",
)
```
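A minimal sketch of a workaround worth trying, assuming the error comes from the dtype being resolved from the model's config (which may default to bfloat16) rather than from the `torch_dtype` argument; `pick_safe_dtype` is a hypothetical helper, not part of any library:

```python
# Hypothetical helper: choose a dtype the current backend can handle.
# MPS (Apple Silicon) does not support bfloat16 on some PyTorch/macOS
# versions, so fall back to float16 there.
def pick_safe_dtype(device: str) -> str:
    if device == "mps":
        return "float16"  # bfloat16 raises "BFloat16 is not supported on MPS"
    return "bfloat16"

print(pick_safe_dtype("mps"))   # dtype string to pass as torch_dtype
print(pick_safe_dtype("cuda"))
```

One could then pass the result explicitly (e.g. `torch_dtype=pick_safe_dtype("mps")`) and move the model with `.to("mps")` instead of relying on `device_map="auto"`; this is only a guess at the cause, not a confirmed fix.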