
🌌 Abyssal-Seraph-12B

Where the light of the divine meets the poetry of the abyss.


🜂 Overview

Abyssal-Seraph-12B is a multi-stage creative merge designed for expressive storytelling, emotional depth, and lyrical dialogue.
It was crafted through a layered fusion using MergeKit:

  1. 🌙 LunaMaid × Vermilion-Sage: merged via NearSwap (t = 0.0008) to unify LunaMaid's balanced composure with Vermilion-Sage's radiant prose.
  2. 🕯️ Dark-Quill × Mag-Mell-R1: merged via NearSwap (t = 0.0008) to draw forth mysticism, poetic darkness, and a sense of dreamlike gravity.
  3. ✨ Both intermediate results combined with the Karcher Mean, a geometric blend ensuring harmony between light and shadow.

🩶 Model Essence

🧠 Core Nature: Philosophical, poetic, emotionally resonant
💬 Style: Fluid prose, vivid imagery, articulate reflection
💫 Tone: Dreamlike, balanced between divine warmth and abyssal calm
🎭 Best For: Roleplay, character dialogue, introspection, lore writing, creative prose

🧬 Merge Overview

Abyssal-Seraph-12B was created through a multi-stage, precision merge designed to blend expressive prose with poetic balance while maintaining model stability.

🌙 Stage 1

✨ Method: NearSwap (t = 0.0008)
🩵 Base: Vortex5/LunaMaid-12B
💮 Secondary: Vortex5/Vermilion-Sage-12B

Stage 1 Configuration

name: First
models:
  - model: Vortex5/Vermilion-Sage-12B
merge_method: nearswap
base_model: Vortex5/LunaMaid-12B
parameters:
  t: 0.0008
dtype: bfloat16
tokenizer:
  source: Vortex5/LunaMaid-12B
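NearSwap keeps the base model's weights and only swaps toward the secondary model where the two tensors already nearly agree. As a conceptual illustration (a minimal NumPy sketch of the commonly cited NearSwap rule, not MergeKit's actual code), elements within t of each other take the secondary value outright, while distant elements shift by at most t:

```python
import numpy as np

def nearswap(t: float, base: np.ndarray, secondary: np.ndarray) -> np.ndarray:
    """Interpolate toward `secondary` only where it is close to `base`.

    Each element's interpolation weight is t / |base - secondary|,
    clipped to [0, 1]: near-identical elements are swapped entirely,
    distant elements move by at most t toward the secondary value.
    """
    diff = np.abs(base - secondary)
    with np.errstate(divide="ignore"):
        weight = np.where(diff > 0, t / diff, 0.0)
    weight = np.clip(weight, 0.0, 1.0)
    return (1.0 - weight) * base + weight * secondary

base = np.array([0.0, 1.0])
secondary = np.array([0.0005, 3.0])
out = nearswap(0.0008, base, secondary)
# out[0] == 0.0005  (|diff| < t, so the element is fully swapped)
# out[1] == 1.0008  (far apart, so base is nudged by exactly t)
```

This explains why such a tiny t (0.0008) still matters: it acts as a distance threshold, not a blend ratio, so the merge mostly preserves the base model while quietly adopting the secondary where the two already converge.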

🩶 Stage 2

⚙️ Method: NearSwap (t = 0.0008)
🖤 Base: Vortex5/Dark-Quill-12B
💫 Secondary: inflatebot/MN-12B-Mag-Mell-R1

Stage 2 Configuration

name: Second
models:
  - model: inflatebot/MN-12B-Mag-Mell-R1
merge_method: nearswap
base_model: Vortex5/Dark-Quill-12B
parameters:
  t: 0.0008
dtype: bfloat16

🌌 Stage 3 – Final Merge

⚖️ Method: Karcher Mean (tol = 1e-9, max_iter = 20000)
🜂 Inputs: First + Second
💎 Purpose: To geometrically fuse both intermediate merges into a coherent whole.

Final Merge Configuration

models:
  - model: First
  - model: Second
merge_method: karcher
dtype: bfloat16
parameters:
  tol: 1e-9
  max_iter: 20000
tokenizer:
  source: First
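The Karcher (Fréchet) mean generalizes averaging to curved spaces: instead of an arithmetic mean, it iteratively finds the point minimizing squared geodesic distance to all inputs, which is why the config carries a tolerance and an iteration cap. As a conceptual sketch only (the classic fixed-point iteration on the unit sphere, not MergeKit's tensor-level implementation), using the same tol/max_iter stopping rule:

```python
import numpy as np

def log_map(mu, x):
    # Tangent vector at mu pointing toward x along the sphere's geodesic.
    cos_theta = np.clip(np.dot(mu, x), -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < 1e-12:
        return np.zeros_like(x)
    return theta * (x - cos_theta * mu) / np.sin(theta)

def exp_map(mu, v):
    # Walk from mu along tangent vector v, staying on the unit sphere.
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return mu
    return np.cos(norm) * mu + np.sin(norm) * v / norm

def karcher_mean(points, tol=1e-9, max_iter=20000):
    mu = points[0] / np.linalg.norm(points[0])
    for _ in range(max_iter):
        # Average the geodesic directions toward every input point.
        tangent = np.mean(
            [log_map(mu, p / np.linalg.norm(p)) for p in points], axis=0
        )
        if np.linalg.norm(tangent) < tol:
            break  # converged: mu is the geodesic "center of mass"
        mu = exp_map(mu, tangent)
    return mu

points = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
mu = karcher_mean(points)
# converges to the normalized bisector [0.7071..., 0.7071...]
```

The geometric intuition matches the card's framing: rather than naively averaging "First" and "Second" (which can wash out both), the Karcher mean finds a balanced point equidistant from each along the underlying geometry.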

🌑🜂 Acknowledgements 🜂🌑

  • ⚙️ mradermacher – for static and imatrix quantization
  • 🜛 DeathGodlike – for EXL3 quants
  • 🩶 All original model authors and contributors whose work made this model possible.
