---
license: apache-2.0
base_model: abeja/ABEJA-Qwen2.5-7b-Japanese-v0.1
base_model_relation: quantized
language:
- ja
---
# ABEJA-Qwen2.5-7b-Japanese-v0.1-exl2
- Model creator: abeja
- Original model: ABEJA-Qwen2.5-7b-Japanese-v0.1

Using turboderp's ExLlamaV2 v0.2.8 for quantization.

Available quantizations:
- 2.2bpw
- 3.0bpw
- 4.0bpw
- 5.0bpw
- 6.0bpw
- 7.0bpw
- 8.0bpw

## Calibration Dataset
TFMC/imatrix-dataset-for-japanese-llm
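
As a usage illustration, here is a minimal sketch of loading one of these quantizations with the exllamav2 Python API and running a short generation. The local directory name, the choice of the 4.0bpw variant, and the example prompt are illustrative assumptions, not part of this card; it assumes the chosen quantization has already been downloaded locally (for example with `huggingface-cli download`).

```python
# Minimal ExLlamaV2 loading/generation sketch. The model_dir path and the
# 4.0bpw choice are placeholders; point it at whichever quantization you
# downloaded from this repository.

from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

model_dir = "./ABEJA-Qwen2.5-7b-Japanese-v0.1-exl2-4.0bpw"  # hypothetical local path

config = ExLlamaV2Config(model_dir)       # reads config.json and the exl2 quantization metadata
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # KV cache, allocated as layers are loaded
model.load_autosplit(cache)               # split weights across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
# (pass paged=False above if flash-attn is not installed)

prompt = "日本で一番高い山は"
print(generator.generate(prompt=prompt, max_new_tokens=100, add_bos=True))
```

Higher bitrates stay closer to the original model's quality at the cost of VRAM; pick the largest bpw that fits your GPU alongside the context length you need.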