# Celestial-Queen-12B

## Overview

Celestial-Queen-12B was created through a multi-stage merge combining Crimson-Constellation-12B, Strawberry_Smoothie-12B-Model_Stock, MN-12B-Mag-Mell-R1, LunaMaid-12B, Mahou-1.5-mistral-nemo-12B, MN-12B-Celeste-V1.9, Omega-Darker_The-Final-Directive-12B, and MegaMoon-Karcher-12B. MN-12B-Mag-Mell-R1 is used twice: once in the first stage and once again in the second nearswap stage.

## Multi-stage merge configuration

```yaml
name: First
models:
  - model: Vortex5/Crimson-Constellation-12B
  - model: DreadPoor/Strawberry_Smoothie-12B-Model_Stock
  - model: inflatebot/MN-12B-Mag-Mell-R1
  - model: Vortex5/LunaMaid-12B
merge_method: saef
parameters:
  paradox: 0.40
  strength: 0.88
  boost: 0.28
  modes: 2
dtype: float32
tokenizer:
  source: Vortex5/LunaMaid-12B
---
name: Second
models:
  - model: flammenai/Mahou-1.5-mistral-nemo-12B
  - model: nothingiisreal/MN-12B-Celeste-V1.9
  - model: ReadyArt/Omega-Darker_The-Final-Directive-12B
merge_method: saef
parameters:
  paradox: 0.54
  strength: 0.9
  boost: 0.6
  modes: 2
dtype: float32
tokenizer:
  source: union
---
name: Nearswap1
models:
  - model: Vortex5/MegaMoon-Karcher-12B
merge_method: nearswap
base_model: First
parameters:
  t: 0.0008
dtype: float32
tokenizer:
  source: First
---
name: Nearswap2
models:
  - model: inflatebot/MN-12B-Mag-Mell-R1
merge_method: nearswap
base_model: Second
parameters:
  t: 0.0008
dtype: float32
tokenizer:
  source: Second
---
models:
  - model: Nearswap1
  - model: Nearswap2
merge_method: karcher
chat_template: auto
dtype: float32
out_dtype: bfloat16
parameters:
  tol: 1e-9
  max_iter: 1000
tokenizer:
  source: Vortex5/LunaMaid-12B
```
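To give an intuition for the `t: 0.0008` threshold in the two nearswap stages: nearswap keeps the base model's weights largely intact, swapping in the secondary model's values only where the two models already nearly agree. The sketch below is a minimal per-tensor illustration of that idea (the function name `nearswap` and the exact interpolation rule are assumptions for illustration; mergekit's actual implementation may differ in detail).

```python
import numpy as np

def nearswap(base: np.ndarray, other: np.ndarray, t: float) -> np.ndarray:
    """Illustrative nearswap: where |base - other| <= t, adopt the other
    model's weight outright; elsewhere interpolate with strength t/|diff|,
    so weights that disagree strongly are left almost untouched."""
    diff = np.abs(base - other)
    # per-element interpolation weight, clipped to [0, 1]
    w = np.clip(t / np.maximum(diff, 1e-12), 0.0, 1.0)
    return base + w * (other - base)

base = np.array([1.0, 1.0, 1.0])
other = np.array([1.0005, 1.5, 3.0])
merged = nearswap(base, other, t=0.0008)
# merged[0] swaps fully to 1.0005 (diff below t);
# merged[1] and merged[2] stay within ~0.001 of the base value.
```

With a threshold as small as 0.0008, the result is effectively the base model with a thin layer of the secondary model blended in where the two already agree.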
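The final stage's `karcher` method combines Nearswap1 and Nearswap2 via a Karcher (Riemannian) mean, i.e. a barycenter on the unit sphere rather than a plain average, iterated until the update norm falls below `tol` or `max_iter` is reached. The toy sketch below shows the iteration on small unit vectors; it is a conceptual illustration using the config's `tol`/`max_iter` parameters, not mergekit's actual code path over model tensors.

```python
import numpy as np

def karcher_mean(points, tol=1e-9, max_iter=1000):
    """Iterative Karcher mean of unit vectors on the sphere:
    repeatedly average the log maps in the tangent space at the
    current estimate, then step along the geodesic (exp map)."""
    pts = [p / np.linalg.norm(p) for p in points]
    mean = pts[0].copy()
    for _ in range(max_iter):
        tangents = []
        for p in pts:
            cos = np.clip(np.dot(mean, p), -1.0, 1.0)
            theta = np.arccos(cos)
            if theta < 1e-12:
                tangents.append(np.zeros_like(p))
            else:
                # log map of p into the tangent space at `mean`
                tangents.append(theta * (p - cos * mean) / np.sin(theta))
        step = np.mean(tangents, axis=0)
        norm = np.linalg.norm(step)
        if norm < tol:
            break
        # exp map: move along the geodesic in the step direction
        mean = np.cos(norm) * mean + np.sin(norm) * step / norm
        mean /= np.linalg.norm(mean)
    return mean

# For two inputs the Karcher mean is the spherical midpoint:
m = karcher_mean([np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])])
```

For two models this reduces to a slerp-like midpoint; the iterative form matters when averaging more than two inputs or when high precision (`tol: 1e-9`) is requested.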

## Intended Use

- **Storytelling**: long-form narrative
- **Roleplay**: emotion-forward interaction
- **Creative Writing**: atmospheric fiction