|
--- |
|
base_model: |
|
- yvvki/Erotophobia-24B-v2.0 |
|
base_model_relation: quantized |
|
quantized_by: ArtusDev |
|
library_name: transformers |
|
tags: |
|
- mergekit |
|
- merge |
|
|
|
--- |
|
# Erotophobia-24B-v2.0
|
|
|
 |
|
|
|
My second merge! Yayay! |
|
*Technically my third but we'll ignore the first failed model :(* |
|
|
|
This model was just a headache to make, it really was! It cost me $20 trying to fit darkc0de/BlackXorDolphTronGOAT because it's stored in `float32`!
|
I didn't know I could downcast the model to `bfloat16` (to match the merge dtype) first; I only figured that out at the very end.
|
But the downcast weights are available in the [df16](df16) directory for your pleasure.
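
In case it's useful, here's a minimal sketch of that downcast with `transformers` (untested as written; the `./df16` output path is just illustrative, and `bfloat16` is assumed to match the merge dtype):

```python
# Downcast a float32 checkpoint to bfloat16 before merging.
# This roughly halves the disk and memory footprint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "darkc0de/BlackXorDolphTronGOAT"

model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,  # cast weights as they load
    low_cpu_mem_usage=True,      # avoid a second full-precision copy in RAM
)
tokenizer = AutoTokenizer.from_pretrained(repo)

model.save_pretrained("./df16")
tokenizer.save_pretrained("./df16")
```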
|
|
|
> Still testing and awaiting GGUF quants. Please be patient. Thank you <3
|
|
|
## Philosophy |
|
|
|
### Mind |
|
|
|
A Fusion of darkc0de's merge-of-merges and Arcee's abliterated DeepSeek distill.
|
|
|
I think this makes a good base: Xortron is a top performer on the UGI leaderboard, and Arcee-Blitz brings the DeepSeek distill and more up-to-date world knowledge.
|
|
|
### Heart |
|
|
|
A Karcher mean over models that all share the 2503 base, using the updated Dans and Eurydice, and reintroducing Pantheon.
|
|
|
I feel like this will improve the roleplay. Each of the models has unique characteristics of its own, and hopefully the Karcher mean can find a nice center between them.
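
For the curious, the [Karcher mean](https://en.wikipedia.org/wiki/Karcher_mean) is exactly that center: the point minimizing the summed squared distances to all inputs,

$$
\mu \;=\; \arg\min_{x} \sum_{i=1}^{n} d(x, \theta_i)^2,
$$

where the \\(\theta_i\\) are the weights of the three models. In flat Euclidean space this reduces to the plain average; on a curved space it yields a true geometric midpoint.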
|
|
|
### Soul |
|
|
|
Model Stock on Sleep's Omega Gaslight and Broken Tutu, since both are based on Cydonia 2.1 and include the BlackSheep model in their merges.
|
|
|
I'm horny... |
|
|
|
## Merge Details |
|
|
|
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). |
|
|
|
### Merge Method |
|
|
|
This model was merged using a multi-stage merging process:
|
|
|
- **Mind** was merged with the [Arcee Fusion](https://www.arcee.ai/blog/meet-mergekit-v0-1-arcee-fusion-expanded-model-support-multi-gpu-acceleration) merge method, using huihui-ai/Arcee-Blitz-abliterated as the base.

- **Heart** was merged with the [Karcher Mean](https://en.wikipedia.org/wiki/Karcher_mean) merge method.

- **Soul** was merged with the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, using TheDrummer/Cydonia-24B-v2.1 as the base.
|
|
|
Finally, this model was merged using the [DELLA](https://arxiv.org/abs/2406.11617) merge method, with **Mind** as the base.
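
My rough paraphrase of the paper (not necessarily mergekit's exact implementation): DELLA ranks each model's delta parameters \\(\Delta_i = \theta_i - \theta_{\text{base}}\\) by magnitude, randomly keeps about a `density` fraction of them (with keep probabilities spread over a ±`epsilon` window so larger deltas survive more often), rescales the survivors by their inverse keep probability to stay unbiased, and scales the merged deltas by `lambda`:

$$
\theta_{\text{final}} \;=\; \theta_{\text{base}} + \lambda \sum_i w_i \,\frac{m_i \odot \Delta_i}{p_i},
$$

where \\(m_i\\) is the random keep mask and \\(w_i\\) the per-model weights from the config.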
|
|
|
The partial merge results (**Mind**, **Heart**, and **Soul**) are available inside the [intermediates](intermediates) directory. |
|
|
|
### Models Merged |
|
|
|
The following models were included in the merge: |
|
* **Mind**

  - darkc0de/BlackXorDolphTronGOAT

* **Heart**

  - PocketDoc/Dans-PersonalityEngine-V1.3.0-24b

  - aixonlab/Eurydice-24b-v3.5

  - Gryphe/Pantheon-RP-1.8-24b-Small-3.1

* **Soul**

  - ReadyArt/Broken-Tutu-24B

  - ReadyArt/Omega-Darker-Gaslight_The-Final-Forgotten-Fever-Dream-24B
|
|
|
### Configuration |
|
|
|
The following YAML configuration was used to produce this model (using `mergekit-multi`): |
|
|
|
```yaml
name: Mind
merge_method: arcee_fusion
dtype: bfloat16
tokenizer:
  source: union
chat_template: auto
base_model: huihui-ai/Arcee-Blitz-abliterated
models:
  - model: darkc0de/BlackXorDolphTronGOAT
---
name: Heart
merge_method: karcher
tokenizer:
  source: union
chat_template: auto
parameters:
  max_iter: 1000
models:
  - model: PocketDoc/Dans-PersonalityEngine-V1.3.0-24b
  - model: aixonlab/Eurydice-24b-v3.5
  - model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
---
name: Soul
merge_method: model_stock
tokenizer:
  source: union
chat_template: auto
base_model: TheDrummer/Cydonia-24B-v2.1
models:
  - model: ReadyArt/Broken-Tutu-24B
  - model: ReadyArt/Omega-Darker-Gaslight_The-Final-Forgotten-Fever-Dream-24B
---
merge_method: della
tokenizer:
  source: union
chat_template: auto
base_model: Mind
models:
  - model: Mind
  - model: Heart
    parameters:
      weight: 0.6
  - model: Soul
    parameters:
      weight: 0.4
parameters:
  density: 0.7
  epsilon: 0.2
  lambda: 1.1
```
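
And while we all await the GGUFs, here's a minimal `transformers` quick-start sketch (untested; assumes `accelerate` is installed and you have enough VRAM for a 24B model in `bfloat16`):

```python
# Quick-start sketch; adjust dtype/device_map for your hardware.
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="yvvki/Erotophobia-24B-v2.0",
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires accelerate
)

out = pipe("Describe your character in one paragraph.", max_new_tokens=128)
print(out[0]["generated_text"])
```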
|
|