Model Composition
- NurtureAI/neural-chat-7b-v3-16k: weight 30%
- xDAN-AI/xDAN-L1-Chat-RL-v1: weight 30%
- rwitz/go-bruins-v2: weight 30%
- segmed/MedMistral-7B-v0.1: weight 10%
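Conceptually, a weighted merge of this kind combines the parameters of architecture-compatible checkpoints as a convex combination, with each model's tensors scaled by its weight. The sketch below only illustrates that idea; it is not the LM-Cocktail implementation, and the `weighted_merge` helper is hypothetical.

```python
# Illustrative sketch of weighted parameter averaging (hypothetical helper,
# not the LM-Cocktail internals): merged_theta = sum_i w_i * theta_i.
import torch

def weighted_merge(state_dicts: list[dict], weights: list[float]) -> dict:
    """Average matching tensors from architecture-compatible checkpoints."""
    assert abs(sum(weights) - 1.0) < 1e-6, "weights should sum to 1.0"
    merged = {}
    for name in state_dicts[0]:
        merged[name] = sum(w * sd[name].to(torch.float32)
                           for w, sd in zip(weights, state_dicts))
    return merged
```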
Code Snippet for Model Merging
The following Python code demonstrates how to create this mixed model using the LM-Cocktail approach:
```python
from LM_Cocktail import mix_models_by_layers

# Merge the four source models into a single checkpoint; the weights
# correspond to the composition listed above.
model = mix_models_by_layers(
    model_names_or_paths=[
        "NurtureAI/neural-chat-7b-v3-16k",
        "xDAN-AI/xDAN-L1-Chat-RL-v1",
        "rwitz/go-bruins-v2",
        "segmed/MedMistral-7B-v0.1"
    ],
    model_type='decoder',
    weights=[0.3, 0.3, 0.3, 0.1],
    output_path='./mixed_llm'
)
```
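Once the merge completes, the checkpoint written to `./mixed_llm` can be loaded like any other local model directory. The snippet below is a minimal usage sketch and assumes the merged output is a standard Transformers-compatible directory:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the merged checkpoint produced above (path is the output_path
# passed to the merge call).
tokenizer = AutoTokenizer.from_pretrained("./mixed_llm")
model = AutoModelForCausalLM.from_pretrained("./mixed_llm")

prompt = "Summarize the key symptoms of influenza."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```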
License: apache-2.0