loubnabnl (HF Staff) committed
Commit ea89fd7 · verified · 1 Parent(s): 27b5dbe

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -62,7 +62,7 @@ outputs = model.generate(inputs)
 print(tokenizer.decode(outputs[0]))
 ```
 
-For local inference, you can use `llama.cpp`, `MLX` and `MLC`. you can find quantized checkpoints in this collection [TODO].
+For local inference, you can use `llama.cpp`, `ONNX`, `MLX` and `MLC`. You can find quantized checkpoints in this collection [TODO].
 
 ## Evaluation
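
For reference, a minimal local-inference sketch using the `llama-cpp-python` bindings for `llama.cpp`; the GGUF filename below is a placeholder, since the quantized-checkpoint collection is still marked [TODO] in the README:

```python
# Sketch: local inference with llama-cpp-python (Python bindings for llama.cpp).
# "model.Q4_K_M.gguf" is a placeholder path; swap in a quantized GGUF checkpoint
# from the collection once it is linked in the README.
from llama_cpp import Llama

llm = Llama(model_path="model.Q4_K_M.gguf", n_ctx=2048)

output = llm("Gravity is", max_tokens=64)
print(output["choices"][0]["text"])
```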