---
base_model:
- darkc0de/ConsciousCrimininalComputing-BlAcKxOrDoLpHtRoN
- darkc0de/ConsciousCrimininalComputing-BlackXorDolphTronGOAT
- darkc0de/XortronCriminalComputingConfig
- cognitivecomputations/Dolphin-Mistral-24B-Venice-Edition
- TroyDoesAI/BlackSheep-24B
- darkc0de/Xortron2025
library_name: transformers
tags:
- mergekit
- merge
- uncensored
- harmful
license: wtfpl
pipeline_tag: text-generation
---


![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/6540a02d1389943fef4d2640/nhYZrmlwlvEzQna3kzsng.jpeg)

This is a personal experiment in stacking and shuffling multiple models through several rounds of **mergekit** merges, using merge_method: **arcee_fusion**.
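
A merge like this is driven by a small YAML config passed to `mergekit-yaml`. The sketch below is a hypothetical example of one such round, not the exact recipe used for this model; the model choices and `dtype` here are assumptions for illustration.

```yaml
# Hypothetical single round of an arcee_fusion merge (not the actual recipe).
merge_method: arcee_fusion
base_model: darkc0de/Xortron2025
models:
  - model: darkc0de/Xortron2025
  - model: TroyDoesAI/BlackSheep-24B
dtype: bfloat16
```

Running `mergekit-yaml config.yml ./output-dir` produces the merged checkpoint; repeating the process with the output as a new input is what "stacking and shuffling several times" refers to.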