Llama-3-3x8B-multilingual

This model is an MoE (Mixture of Experts) merge of the following three models:

  1. namespace-Pt/Llama-3-8B-Instruct-80K-QLoRA-Merged
  2. meta-llama/Meta-Llama-3-8B-Instruct
  3. lightblue/suzume-llama-3-8B-multilingual

The model has a context length of 80k tokens.
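
Below is a minimal usage sketch showing how the merged model could be loaded and prompted with Hugging Face Transformers. It assumes the model is published as Souvik3333/Llama-3-3x8B-multilingual and follows the standard Llama 3 Instruct chat template; adjust the repo ID and generation settings as needed.

```python
# Minimal sketch: load the MoE merge in BF16 and run a short multilingual prompt.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Souvik3333/Llama-3-3x8B-multilingual"  # assumed Hub repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Translate 'good morning' into Japanese."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```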

Model size: 19B parameters
Tensor type: BF16 (safetensors)