gargamit committed (verified)
Commit c4b3741 · 1 Parent(s): 9db0932

Update README.md

Files changed (1): README.md +1 -2
README.md CHANGED
@@ -26,7 +26,6 @@ The model belongs to the Phi-4 model family and supports 128K token context length
  🏡 [Phi Portal](https://azure.microsoft.com/en-us/products/phi) <br>
  🖥️ Try It [Azure](https://aka.ms/phi4-mini-reasoning/azure) <br>
 
- 🚀 [Model paper](https://huggingface.co/papers/2503.01743)
 
  🎉**Phi-4 models**: [[Phi-4-reasoning](https://huggingface.co/microsoft/Phi-4-reasoning)] | [[multimodal-instruct](https://huggingface.co/microsoft/Phi-4-multimodal-instruct) | [onnx](https://huggingface.co/microsoft/Phi-4-multimodal-instruct-onnx)];
  [[mini-instruct](https://huggingface.co/microsoft/Phi-4-mini-instruct) | [onnx](https://huggingface.co/microsoft/Phi-4-mini-instruct-onnx)]
@@ -158,7 +157,7 @@ print(outputs[0])
  + **Context length:** 128K tokens<br>
  + **GPUs:** 128 H100-80G<br>
  + **Training time:** 2 days<br>
- + **Training data:** 150 tokens<br>
+ + **Training data:** 150B tokens<br>
  + **Outputs:** Generated text<br>
  + **Dates:** Trained in February 2024<br>
  + **Status:** This is a static model trained on offline datasets with the cutoff date of February 2025 for publicly available data.<br>
 