---
base_model:
- yamatazen/NeonMaid-12B-v2
- yamatazen/LorablatedStock-12B
library_name: transformers
tags:
- mergekit
- merge
---
# Neona-12B
![image/png](https://huggingface.co/kyx0r/Neona-12B/resolve/main/neona_final.png?download=true)
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [NearSwap](https://huggingface.co/alchemonaut/QuartetAnemoi-70B-t0.0001) merge method, with [yamatazen/NeonMaid-12B-v2](https://huggingface.co/yamatazen/NeonMaid-12B-v2) as the base.
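The card linked above describes NearSwap as nudging each base tensor toward the secondary model only where the two models' weights are already close, with `t` controlling how aggressive the swap is. Below is a minimal sketch of that idea, not mergekit's actual implementation; the function name and exact formula are assumptions for illustration.
```python
import torch

def nearswap(base: torch.Tensor, secondary: torch.Tensor, t: float) -> torch.Tensor:
    # Illustrative sketch (assumed formulation): the interpolation weight grows
    # as the two tensors get closer, saturating at 1 where they nearly match,
    # and shrinking toward 0 where they differ strongly.
    diff = (base - secondary).abs()
    weight = (t / diff).nan_to_num(nan=1.0, posinf=1.0).clamp(0.0, 1.0)
    # Blend base toward secondary element-wise by that weight.
    return torch.lerp(base, secondary, weight)
```
With the small `t` values used here (on the order of 1e-3), only near-identical weights are swapped, so the result stays very close to the base model.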
### Models Merged
The following models were included in the merge:
* [yamatazen/LorablatedStock-12B](https://huggingface.co/yamatazen/LorablatedStock-12B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: ../LorablatedStock-12B-frank
merge_method: nearswap
base_model: ../NeonMaid-12B-v2-frank
parameters:
  t: [0.0005, 0.0008, 0.0013, 0.0008, 0.0005]
dtype: bfloat16
chat_template: "chatml"
tokenizer:
  source: "base"
```
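Since the config bundles a ChatML chat template and the card lists `library_name: transformers`, the model should work with the standard Transformers chat workflow. The sketch below is a hedged usage example; the repo ID is taken from this card, and the prompt and generation settings are arbitrary choices.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kyx0r/Neona-12B"  # repo ID assumed from this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype
    device_map="auto",           # requires accelerate
)

# apply_chat_template uses the ChatML template shipped with the tokenizer.
messages = [{"role": "user", "content": "Write a short scene set on a neon-lit street."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```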