Original model: Eclipsed-Prism-12B by Vortex5


Available ExLlamaV3 (release v0.0.18) quantizations

Requirements: a Python installation with the huggingface-hub module installed to use the CLI.
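As a minimal sketch of how a quantization can be fetched with the huggingface_hub Python API: the repository ID below is this repo (DeathGodlike/Eclipsed-Prism-12B_EXL3), while the revision name is a hypothetical placeholder — substitute one of the quantization branches actually published in the repository.

# Minimal download sketch using huggingface_hub's snapshot_download.
from huggingface_hub import snapshot_download

repo_id = "DeathGodlike/Eclipsed-Prism-12B_EXL3"  # this quantization repo

# Hypothetical branch name; replace with an available quantization branch.
revision = "4.0bpw"

local_dir = snapshot_download(
    repo_id=repo_id,
    revision=revision,
    local_dir="Eclipsed-Prism-12B_EXL3",
)
print(f"Model files downloaded to {local_dir}")

The equivalent CLI call is: huggingface-cli download DeathGodlike/Eclipsed-Prism-12B_EXL3 --revision <branch> --local-dir <dir>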

Licensing: the license for the provided quantized models is inherited from the original model (detected license: unknown). For additional information, see the original model's page above or, if it becomes unavailable, the file and page backups below.


Backups

Original files

Original page

Eclipsed-Prism-12B

Overview

Eclipsed-Prism-12B was created through a multi-stage merge of Starlit-Shadow-12B, Shining-Prism-12B, EtherealAurora-12B, EsotericSage-12B, and Hollow-Aether-12B using custom merge methods.

Multi-stage merge configuration
name: First
merge_method: acl
base_model: Vortex5/Starlit-Shadow-12B
models:
  - model: Vortex5/Shining-Prism-12B
  - model: yamatazen/EtherealAurora-12B
parameters:
  strength: 0.75
  selectivity: 0.95
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
---
name: Second
merge_method: amsf
models:
  - model: First
  - model: yamatazen/EsotericSage-12B
  - model: Vortex5/Hollow-Aether-12B
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
---
name: Third
merge_method: saef
models:
  - model: Second
  - model: Vortex5/Shining-Prism-12B
  - model: yamatazen/EtherealAurora-12B
parameters:
  paradox: 0.45
  strength: 0.9
  boost: 0.55
  modes: 2
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
---
# no name needed for the final model
merge_method: sm2f
base_model: Third
models:
  - model: Vortex5/Starlit-Shadow-12B
parameters:
  focus: 0.55
  trust: 0.60
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
      

Intended Use

🌒 Storytelling
🎭 Roleplay
✨ Creative writing