DopeorNope/Ko-Mixtral-v1.3-MoE-7Bx2
Text Generation · Transformers · Safetensors · Korean · English · mixtral · Mixture of Experts · text-generation-inference
License: cc-by-nc-sa-4.0
Ko-Mixtral-v1.3-MoE-7Bx2 / generation_config.json (132 Bytes)
Commit c662637 (verified) — "Upload MixtralForCausalLM" by DopeorNope, almost 2 years ago
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "pad_token_id": 2,
  "transformers_version": "4.36.2"
}
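For reference, the file's few fields can be inspected with Python's standard `json` module. The sketch below parses the config exactly as shown above and checks the token-id relationships; the comment about `pad_token_id` reusing the EOS id is an interpretation of a common convention for models shipped without a dedicated pad token, not something stated in the file itself.

```python
import json

# The generation_config.json shown above, verbatim.
config_text = """
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "pad_token_id": 2,
  "transformers_version": "4.36.2"
}
"""

config = json.loads(config_text)

# bos/eos mark sequence boundaries during generation; pad_token_id reuses
# the EOS id, a common choice when the tokenizer has no dedicated pad token.
assert config["_from_model_config"] is True
assert config["bos_token_id"] == 1
assert config["eos_token_id"] == config["pad_token_id"] == 2

# Version of transformers that serialized this file.
print(config["transformers_version"])  # → 4.36.2
```

In the `transformers` library, this file is what `GenerationConfig.from_pretrained("DopeorNope/Ko-Mixtral-v1.3-MoE-7Bx2")` would load (a network-dependent call, so it is not shown here).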