---
base_model:
- PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
- lars1234/Mistral-Small-24B-Instruct-2501-writer
- mistralai/Mistral-Small-24B-Instruct-2501
- trashpanda-org/Llama3-24B-Mullein-v1
- unsloth/Mistral-Small-24B-Base-2501
- arcee-ai/Arcee-Blitz
- allura-org/Mistral-Small-24b-Sertraline-0304
library_name: transformers
tags:
- mergekit
- mergekitty
- merge

---
# v0a

This is a merge of pre-trained language models created using [mergekitty](https://github.com/allura-org/mergekitty).

## Merge Details
### Merge Method

This model was merged using the [SCE](https://arxiv.org/abs/2408.07990) merge method using [unsloth/Mistral-Small-24B-Base-2501](https://huggingface.co/unsloth/Mistral-Small-24B-Base-2501) as a base.
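SCE roughly follows a select-calculate-erase pipeline over task vectors (the per-tensor differences between each donor model and the base). The snippet below is a loose single-tensor sketch based on our reading of the paper, not the mergekitty implementation; the function and variable names are invented for illustration.

```python
import numpy as np

def sce_merge_tensor(base, donors, select_topk=0.5):
    """Loose sketch of SCE fusion for one flattened parameter tensor.

    base:        (d,) base-model weights
    donors:      list of (d,) fine-tuned weights
    select_topk: fraction of positions kept, ranked by cross-model variance
    """
    deltas = np.stack([m - base for m in donors])  # task vectors, shape (M, d)

    # Select: keep only the top-k fraction of positions where donors disagree most
    variance = deltas.var(axis=0)
    k = max(1, int(select_topk * variance.size))
    mask = np.zeros(variance.size, dtype=bool)
    mask[np.argsort(variance)[-k:]] = True
    deltas = deltas * mask

    # Calculate: weight each donor by the energy of its surviving delta
    weights = (deltas ** 2).sum(axis=1)
    weights = weights / weights.sum()

    # Erase: drop elements whose sign conflicts with the elementwise majority
    majority = np.sign(np.sign(deltas).sum(axis=0))
    deltas = np.where(np.sign(deltas) == majority, deltas, 0.0)

    # Fuse: apply the weighted sum of the surviving deltas to the base
    return base + np.einsum("m,md->d", weights, deltas)
```

In the actual merge, mergekitty applies a per-model `select_topk` (as in the configuration below) rather than the single shared fraction used in this sketch.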

### Models Merged

The following models were included in the merge:
* [PocketDoc/Dans-PersonalityEngine-V1.2.0-24b](https://huggingface.co/PocketDoc/Dans-PersonalityEngine-V1.2.0-24b)
* [lars1234/Mistral-Small-24B-Instruct-2501-writer](https://huggingface.co/lars1234/Mistral-Small-24B-Instruct-2501-writer)
* [mistralai/Mistral-Small-24B-Instruct-2501](https://huggingface.co/mistralai/Mistral-Small-24B-Instruct-2501)
* [trashpanda-org/Llama3-24B-Mullein-v1](https://huggingface.co/trashpanda-org/Llama3-24B-Mullein-v1)
* [arcee-ai/Arcee-Blitz](https://huggingface.co/arcee-ai/Arcee-Blitz)
* [allura-org/Mistral-Small-24b-Sertraline-0304](https://huggingface.co/allura-org/Mistral-Small-24b-Sertraline-0304)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: unsloth/Mistral-Small-24B-Base-2501
merge_method: sce
dtype: float32
out_dtype: bfloat16
models:
  - model: allura-org/Mistral-Small-24b-Sertraline-0304
    parameters:
      select_topk: 0.50
  - model: lars1234/Mistral-Small-24B-Instruct-2501-writer
    parameters:
      select_topk: 0.20
  - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
    parameters:
      select_topk: 0.20
  - model: trashpanda-org/Llama3-24B-Mullein-v1
    parameters:
      select_topk: 0.175
  - model: arcee-ai/Arcee-Blitz
    parameters:
      select_topk: 0.15
  - model: mistralai/Mistral-Small-24B-Instruct-2501
    parameters:
      select_topk: 0.15

```

### Reproduction Steps

The merge was run with the following environment setup and commands:

```shell
apt install git nano -y
uv tool install mergekitty --with hf_transfer
uv tool install https://github.com/aphrodite-engine/aphrodite-engine/releases/download/v0.6.7/aphrodite_engine-0.6.7-cp38-abi3-manylinux1_x86_64.whl --with aphrodite-engine --with setuptools --with hf_transfer
uv tool install huggingface_hub
huggingface-cli login
nano merge.yml
mergekitty-yaml --cuda --lazy-unpickle merge.yml v0a
```
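For rough intuition on the `select_topk` values in the configuration: each fraction caps the share of that donor's task-vector elements that can survive SCE's selection step. The snippet below (toy tensor size, abbreviated model names) makes the arithmetic concrete.

```python
# select_topk fractions from the merge configuration, keyed by shortened
# donor names; n is a hypothetical flattened-tensor size, not a real layer.
select_topk = {
    "Sertraline-0304": 0.50,
    "Instruct-2501-writer": 0.20,
    "Dans-PersonalityEngine": 0.20,
    "Mullein-v1": 0.175,
    "Arcee-Blitz": 0.15,
    "Instruct-2501": 0.15,
}
n = 100_000  # elements in a hypothetical flattened tensor
for name, frac in select_topk.items():
    k = int(round(frac * n))
    print(f"{name}: at most {k:,} of {n:,} elements retained")
```

Note that the fractions sum to 1.375, so the donors necessarily compete for overlapping positions; on our reading, the weighting and erase steps arbitrate those conflicts.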