Update README.md
README.md CHANGED

````diff
@@ -66,7 +66,11 @@ Make sure to install the latest version of `transformers` or `vllm`, eventually
 pip install git+https://github.com/huggingface/transformers.git
 ```
 
-
+For vLLM, make sure to install `vllm>=0.9.0`:
+
+```bash
+pip install "vllm>=0.9.0"
+```
 
 ### 🤗 transformers
 
@@ -92,7 +96,7 @@ model = AutoModelForCausalLM.from_pretrained(
 For vLLM, simply start a server by executing the command below:
 
 ```
-# pip install vllm
+# pip install vllm>=0.9.0
 vllm serve tiiuae/Falcon-H1-1B-Instruct --tensor-parallel-size 2 --data-parallel-size 1
 ```
````
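Once `vllm serve` is running, it exposes an OpenAI-compatible API (by default at `http://localhost:8000/v1`). The snippet below is a minimal sketch of the chat-completion request body a client would send to that server; the host, port, and `max_tokens` value are assumptions, not part of the README change, and only the model name comes from the diff above.

```python
import json

# OpenAI-compatible chat-completions payload for a vLLM server started with
# `vllm serve tiiuae/Falcon-H1-1B-Instruct ...`. Host/port below are vLLM's
# defaults; adjust them if the server was launched with --host/--port.
url = "http://localhost:8000/v1/chat/completions"
payload = {
    "model": "tiiuae/Falcon-H1-1B-Instruct",
    "messages": [{"role": "user", "content": "Hello!"}],
    "max_tokens": 64,  # illustrative value, not from the README
}

body = json.dumps(payload)
# Send it with any HTTP client, e.g.:
#   requests.post(url, data=body, headers={"Content-Type": "application/json"})
print(body)
```

The same request can be issued from the command line with `curl` against the same URL, or through the official `openai` Python client pointed at the server's base URL.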