---
base_model:
- darkc0de/ConsciousCrimininalComputing-BlAcKxOrDoLpHtRoN
- darkc0de/ConsciousCrimininalComputing-BlackXorDolphTronGOAT
- darkc0de/XortronCriminalComputingConfig
- cognitivecomputations/Dolphin-Mistral-24B-Venice-Edition
- TroyDoesAI/BlackSheep-24B
- darkc0de/Xortron2025
library_name: transformers
tags:
- mergekit
- merge
- uncensored
- harmful
license: wtfpl
pipeline_tag: text-generation
---

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/6540a02d1389943fef4d2640/nhYZrmlwlvEzQna3kzsng.jpeg)

This is a personal experiment in stacking and shuffling multiple models several times through **mergekit**, using merge_method: **arcee_fusion**.
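
For reference, a single pass of such a merge can be written as a **mergekit** YAML config like the sketch below. The exact configurations used here were not published, so the model pairing, dtype, and output path are assumptions chosen only to illustrate the `arcee_fusion` method, which fuses one donor model into a base model per pass.

```yaml
# Hypothetical single-pass arcee_fusion config -- the real pairings and settings
# used for this merge were not published. Repeating passes like this one with
# different base/donor pairs gives the "stacking and shuffling" described above.
merge_method: arcee_fusion
base_model: darkc0de/XortronCriminalComputingConfig
models:
  - model: cognitivecomputations/Dolphin-Mistral-24B-Venice-Edition
dtype: bfloat16
```

A config like this would typically be run with something like `mergekit-yaml config.yml ./merged-model`, with the output of one pass fed back in as the base or donor of the next.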