Following LUFFY, we change rope_theta from 10000 to 40000 and extend the context window to 16k.
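The change above amounts to patching two fields in the model's configuration. A minimal sketch, assuming the Hugging Face Transformers field names `rope_theta` and `max_position_embeddings` (the base values below are illustrative):

```python
def patch_config(cfg: dict) -> dict:
    """Apply the rope_theta / context-window change described above."""
    cfg = dict(cfg)  # avoid mutating the caller's config
    cfg["rope_theta"] = 40000.0             # raised from 10000
    cfg["max_position_embeddings"] = 16384  # 16k context window
    return cfg

# Illustrative base config (not the actual released config.json)
base = {"rope_theta": 10000.0, "max_position_embeddings": 4096}
patched = patch_config(base)
```

In practice the same edit can be made directly in the checkpoint's `config.json` before loading the model.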
