---
base_model:
- darkc0de/ConsciousCrimininalComputing-BlAcKxOrDoLpHtRoN
- darkc0de/ConsciousCrimininalComputing-BlackXorDolphTronGOAT
- darkc0de/XortronCriminalComputingConfig
- cognitivecomputations/Dolphin-Mistral-24B-Venice-Edition
- TroyDoesAI/BlackSheep-24B
- darkc0de/Xortron2025
library_name: transformers
tags:
- mergekit
- merge
- uncensored
- harmful
license: wtfpl
pipeline_tag: text-generation
---
|
|
|
|
|
|
|
|
This is a personal experiment in stacking and shuffling multiple models through several passes of **mergekit**, using `merge_method: arcee_fusion`.
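
For a rough idea of what a single pass looks like, a mergekit YAML config along these lines could be used. The specific model pairing, dtype, and output path below are illustrative assumptions, not the exact recipe used for this model; since `arcee_fusion` fuses a base model with one other model, a stacked merge like this would repeat passes with different pairs, feeding the output of one pass back in as an input to the next.

```yaml
# Hypothetical single-pass arcee_fusion merge (illustrative only --
# the actual pairings and settings used for this model may differ).
models:
  - model: darkc0de/XortronCriminalComputingConfig
merge_method: arcee_fusion
base_model: cognitivecomputations/Dolphin-Mistral-24B-Venice-Edition
dtype: bfloat16
```

A config like this would be run with `mergekit-yaml config.yml ./merged-output`, and the resulting checkpoint could then serve as the base or fusion candidate in the next pass.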