---
base_model:
- Vortex5/Moonlit-Shadow-12B
- yamatazen/NeonMaid-12B-v2
- Vortex5/Vermilion-Sage-12B
library_name: transformers
tags:
- mergekit
- merge
- roleplay
---

# 🩵 LunaMaid-12B
This is a multi-stage merge of pre-trained language models created with [mergekit](https://github.com/arcee-ai/mergekit).
## 🧬 Merge Overview
**LunaMaid-12B** was produced through a two-stage merge: a Slerp blend of two parent models, followed by a Karcher mean merge with a third.
Each stage fuses models with complementary linguistic and stylistic traits to create a cohesive, emotionally nuanced personality.
### 🩵 **Stage 1 — Slerp Merge (Intermediate Model `First`)**
- **Base Model:** [Vortex5/Vermilion-Sage-12B](https://huggingface.co/Vortex5/Vermilion-Sage-12B)
- **Merged With:** [yamatazen/NeonMaid-12B-v2](https://huggingface.co/yamatazen/NeonMaid-12B-v2)
- **Method:** Spherical Linear Interpolation (Slerp)
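
For reference (this is the standard Slerp formula, not anything specific to this merge), each pair of corresponding weight tensors is interpolated along the great-circle arc between them, with the per-layer `t` schedule in the configuration below controlling how strongly NeonMaid-12B-v2 blends in at each depth:

$$
\mathrm{slerp}(t;\,\theta_1,\theta_2)\;=\;\frac{\sin\!\big((1-t)\,\Omega\big)}{\sin\Omega}\,\theta_1\;+\;\frac{\sin\!\big(t\,\Omega\big)}{\sin\Omega}\,\theta_2,
\qquad
\cos\Omega=\frac{\theta_1\cdot\theta_2}{\lVert\theta_1\rVert\,\lVert\theta_2\rVert}
$$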
<details>
<summary><b>Stage 1 Configuration</b></summary>

```yaml
name: First
base_model: Vortex5/Vermilion-Sage-12B
models:
  - model: yamatazen/NeonMaid-12B-v2
merge_method: slerp
dtype: bfloat16
parameters:
  normalize: true
  t: [0.25, 0.35, 0.45, 0.55, 0.65, 0.75, 0.6, 0.5, 0.6, 0.6]
```
</details>
### 🌑 **Stage 2 — Karcher Mean Merge (Final Model)**
- **Base Model:** Intermediate output from Stage 1 (`./intermediates/First`)
- **Merged With:** [Vortex5/Moonlit-Shadow-12B](https://huggingface.co/Vortex5/Moonlit-Shadow-12B)
- **Method:** [Karcher Mean](https://en.wikipedia.org/wiki/Karcher_mean) (Riemannian barycenter)
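
For context (general definition, not mergekit-specific notation), the Karcher mean of the parents' weights is the point that minimizes the summed squared geodesic distance to them; `max_iter` and `tol` in the configuration below bound the iterative solver that approximates it:

$$
\mu^{*}\;=\;\arg\min_{\mu}\;\sum_{i=1}^{N} d\!\left(\mu,\;\theta_i\right)^{2}
$$

where the \(\theta_i\) are corresponding tensors from the input models and \(d(\cdot,\cdot)\) is the geodesic distance used by the merge implementation.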
<details>
<summary><b>Stage 2 Configuration</b></summary>

```yaml
dtype: bfloat16
merge_method: karcher
modules:
  default:
    slices:
      - sources:
          - layer_range: [0, 40]
            model: ./intermediates/First
          - layer_range: [0, 40]
            model: Vortex5/Moonlit-Shadow-12B
parameters:
  max_iter: 9999
  tol: 1e-9
```
</details>
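
Both stages can be reproduced with mergekit's `mergekit-yaml` entry point, e.g. `mergekit-yaml stage1.yaml ./intermediates/First` followed by `mergekit-yaml stage2.yaml ./LunaMaid-12B` (the YAML file names here are illustrative; point them at wherever you saved the configurations above).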
### Models Merged
The following models were included in the merge:
* [Vortex5/Moonlit-Shadow-12B](https://huggingface.co/Vortex5/Moonlit-Shadow-12B)
* [Vortex5/Vermilion-Sage-12B](https://huggingface.co/Vortex5/Vermilion-Sage-12B)
* [yamatazen/NeonMaid-12B-v2](https://huggingface.co/yamatazen/NeonMaid-12B-v2)
* ./intermediates/First
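
### 💻 Example Usage
A minimal inference sketch with 🤗 Transformers is shown below. The repo id `Vortex5/LunaMaid-12B` is assumed from the model name, and the chat template is whatever the merged tokenizer provides; adjust dtype and device settings to your hardware.

```python
# Minimal inference sketch -- the repo id below is assumed from the model name
# and may differ from the actual published location.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vortex5/LunaMaid-12B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge itself was performed in bfloat16
    device_map="auto",
)

messages = [{"role": "user", "content": "Introduce yourself in one short paragraph."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```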