Update README.md

README.md (changed):

You can now also run Devstral using these (alphabetically ordered) frameworks:

- [`llama.cpp`](https://github.com/ggml-org/llama.cpp): to use community GGUFs such as [Unsloth's](https://huggingface.co/unsloth/Devstral-2-123B-Instruct-2512-GGUF) or [Bartowski's](https://huggingface.co/bartowski/mistralai_Devstral-2-123B-Instruct-2512-GGUF), make sure to use the changes from this [PR](https://github.com/ggml-org/llama.cpp/pull/17945).
- [`Ollama`](https://ollama.com/): https://ollama.com/library/devstral-2

If you notice subpar performance with local serving, please open an issue with the relevant framework so it can be fixed; in the meantime, we advise using the Mistral AI API.
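As a sketch, local serving with the frameworks above might look like the following. The Hugging Face repo name is taken from the Unsloth link and the Ollama tag from the library page; the exact quantization file, ports, and flags are assumptions that depend on your setup (and your `llama.cpp` build must include the changes from the linked PR):

```shell
# Serve a community GGUF with llama.cpp's built-in server
# (downloads from the Hugging Face repo on first run).
llama-server -hf unsloth/Devstral-2-123B-Instruct-2512-GGUF --port 8080

# Or run it through Ollama (model tag assumed from the library page):
ollama run devstral-2
```

Both expose an OpenAI-compatible HTTP API locally, so existing clients can be pointed at `localhost` instead of a hosted endpoint.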
#### vLLM (recommended)

<details>