# Abyssal-Seraph-12B
*Where the light of the divine meets the poetry of the abyss.*
## Overview
Abyssal-Seraph-12B is a multi-stage creative merge designed for expressive storytelling, emotional depth, and lyrical dialogue.
It was crafted through a layered fusion using MergeKit:
- **LunaMaid × Vermilion-Sage**, merged via NearSwap (t=0.0008) to unify LunaMaid's balanced composure with Vermilion-Sage's radiant prose.
- **Dark-Quill × Mag-Mell-R1**, merged via NearSwap (t=0.0008) to draw forth mysticism, poetic darkness, and a sense of dreamlike gravity.
- Both intermediate results combined with the **Karcher Mean**, a geometric blend ensuring harmony between light and shadow.
## Model Essence
| Trait | Description |
|---|---|
| Core Nature | Philosophical, poetic, emotionally resonant |
| Style | Fluid prose, vivid imagery, articulate reflection |
| Tone | Dreamlike, balanced between divine warmth and abyssal calm |
| Best For | Roleplay, character dialogue, introspection, lore writing, creative prose |
## Merge Overview
Abyssal-Seraph-12B was created through a multi-stage, precision merge designed to blend expressive prose with poetic balance while maintaining model stability.
### Stage 1
- **Method:** NearSwap (t = 0.0008)
- **Base:** Vortex5/LunaMaid-12B
- **Secondary:** Vortex5/Vermilion-Sage-12B
**Stage 1 Configuration**

```yaml
name: First
models:
  - model: Vortex5/Vermilion-Sage-12B
merge_method: nearswap
base_model: Vortex5/LunaMaid-12B
parameters:
  t: 0.0008
dtype: bfloat16
tokenizer:
  source: Vortex5/LunaMaid-12B
```
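The very small `t` makes NearSwap a conservative blend. As a toy sketch (an assumption based on the commonly described NearSwap rule, where the interpolation strength is `t / |base - secondary|` clamped to 1, so near-identical weights are swapped almost fully while divergent weights stay close to the base; mergekit's actual implementation may differ in detail):

```python
import numpy as np

def nearswap(base, secondary, t):
    """Toy sketch of a NearSwap-style blend (illustrative, not mergekit's code).

    Each base weight moves toward the matching secondary weight with
    strength t / |base - secondary|, clamped to [0, 1]: weights that
    already agree are swapped almost entirely, divergent weights barely move.
    """
    base = np.asarray(base, dtype=np.float64)
    secondary = np.asarray(secondary, dtype=np.float64)
    diff = np.abs(base - secondary)
    # Identical weights get the full swap (weight 1); the tiny floor in the
    # denominator only guards the division and never changes the clamp result.
    weight = np.where(diff > 0.0,
                      np.minimum(t / np.maximum(diff, 1e-30), 1.0),
                      1.0)
    return weight * secondary + (1.0 - weight) * base
```

With t = 0.0008, a base weight 0.0 paired with a secondary weight 1.0 moves only to 0.0008, which is why the merge preserves the base model's overall character.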
### Stage 2
- **Method:** NearSwap (t = 0.0008)
- **Base:** Vortex5/Dark-Quill-12B
- **Secondary:** inflatebot/MN-12B-Mag-Mell-R1
**Stage 2 Configuration**

```yaml
name: Second
models:
  - model: inflatebot/MN-12B-Mag-Mell-R1
merge_method: nearswap
base_model: Vortex5/Dark-Quill-12B
parameters:
  t: 0.0008
dtype: bfloat16
```
### Stage 3: Final Merge
- **Method:** Karcher Mean (tol = 1e-9, max_iter = 20000)
- **Inputs:** First + Second
- **Purpose:** geometrically fuse both intermediates for coherence.
**Final Merge Configuration**

```yaml
models:
  - model: First
  - model: Second
merge_method: karcher
dtype: bfloat16
parameters:
  tol: 1e-9
  max_iter: 20000
tokenizer:
  source: First
```
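The Karcher mean is the Riemannian generalization of the arithmetic mean: the `tol` and `max_iter` parameters above bound an iterative fixed-point search for the geodesic barycenter. A minimal sketch on unit vectors, assuming the standard log-map/exp-map iteration on the sphere (mergekit applies the same barycenter idea to full weight tensors, so the names and the 2-D setting here are purely illustrative):

```python
import numpy as np

def karcher_mean(vectors, tol=1e-9, max_iter=20000):
    """Toy Karcher (Riemannian) mean of unit vectors on the sphere.

    Repeatedly maps all points into the tangent space at the current
    estimate (log map), averages them, and steps along the resulting
    geodesic (exp map) until the update is smaller than `tol`.
    """
    vs = [np.asarray(v, dtype=np.float64) for v in vectors]
    vs = [v / np.linalg.norm(v) for v in vs]
    mean = vs[0].copy()
    for _ in range(max_iter):
        tangents = []
        for v in vs:
            cos = np.clip(np.dot(mean, v), -1.0, 1.0)
            theta = np.arccos(cos)
            if theta < 1e-12:
                tangents.append(np.zeros_like(v))  # point coincides with mean
            else:
                # Log map: direction and geodesic distance from mean to v.
                tangents.append(theta * (v - cos * mean) / np.sin(theta))
        step = np.mean(tangents, axis=0)
        norm = np.linalg.norm(step)
        if norm < tol:
            break  # converged: mean tangent is (almost) zero
        # Exp map: walk distance `norm` along the averaged direction.
        mean = np.cos(norm) * mean + np.sin(norm) * (step / norm)
        mean /= np.linalg.norm(mean)
    return mean
```

In practice these YAML configs would be run through mergekit's `mergekit-yaml` command-line tool, one stage at a time, feeding each output directory into the next stage.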
## Acknowledgements
- mradermacher, for static and imatrix quantization
- DeathGodlike, for EXL3 quants
- All original model authors and contributors whose work made this model possible.