Update README.md

README.md CHANGED

@@ -126,7 +126,7 @@ We recommend using this model with [vLLM](https://github.com/vllm-project/vllm).
 
 #### Installation
 
-Make sure to install most recent vllm:
+Make sure to install the most recent vllm:
 
 ```
 uv pip install -U vllm \
@@ -161,7 +161,7 @@ Additional flags:
 
 #### Usage of the model
 
-Here we
+Here we assume that the model `mistralai/Ministral-3-14B-Base-2512` is served and that you can reach it at `localhost` on port `8000`, the default for vLLM.
 
 <details>
 <summary>Test Base</summary>