---
base_model:
- darkc0de/XortronCriminalComputing
- TroyDoesAI/BlackSheep-24B
- darkc0de/XortronCriminalComputingConfig
library_name: transformers
tags:
- mergekit
- merge
- uncensored
- harmful
license: wtfpl
language:
- en
pipeline_tag: text-generation
---

![image/gif](https://s14.gifyu.com/images/bsuJQ.gif)

This model turned out really well: intelligent, knowledgeable, and, of course, state-of-the-art **Uncensored** performance.

Please use **responsibly**, or at least **discreetly**. This model will help you do anything and everything you probably shouldn't be doing.

As of this writing, this model tops the **UGI Leaderboard** for models under 70 billion parameters in both the **UGI** and **W10** categories.

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/6540a02d1389943fef4d2640/_proSyYDP1-HxfHC3fc-Q.jpeg)

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [darkc0de/XortronCriminalComputing](https://huggingface.co/darkc0de/XortronCriminalComputing) as the base.

### Models Merged

The following models were included in the merge:

* [TroyDoesAI/BlackSheep-24B](https://huggingface.co/TroyDoesAI/BlackSheep-24B)
* [darkc0de/XortronCriminalComputing](https://huggingface.co/darkc0de/XortronCriminalComputing)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: darkc0de/XortronCriminalComputing
  - model: TroyDoesAI/BlackSheep-24B
    parameters:
      density: 0.8
      weight: 0.8
merge_method: ties
base_model: darkc0de/XortronCriminalComputing
dtype: float16
```
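For intuition, the TIES method used above (trim each task vector to its largest-magnitude `density` fraction, elect a per-parameter sign, then average only the agreeing values) can be sketched on toy NumPy arrays. This is an illustrative approximation, not mergekit's actual implementation, and `ties_merge` is a hypothetical helper name:

```python
import numpy as np

def ties_merge(task_vectors, density):
    """Toy TIES sketch: trim, elect sign, disjoint mean (not mergekit's code)."""
    trimmed = []
    for tv in task_vectors:
        k = max(1, int(round(density * tv.size)))  # number of entries to keep
        thresh = np.partition(np.abs(tv), -k)[-k]  # k-th largest magnitude
        trimmed.append(np.where(np.abs(tv) >= thresh, tv, 0.0))
    stacked = np.stack(trimmed)
    elected = np.sign(stacked.sum(axis=0))         # sign with larger total magnitude
    agree = np.sign(stacked) == elected            # entries matching the elected sign
    counts = agree.sum(axis=0)
    return (stacked * agree).sum(axis=0) / np.maximum(counts, 1)

# Task vectors = fine-tuned weights minus base weights (toy values)
tv_a = np.array([1.0, -2.0, 0.1, 3.0])
tv_b = np.array([-1.5, 2.0, 0.2, 0.5])
merged = ties_merge([tv_a, tv_b], density=0.5)
# The merged delta is then scaled by `weight` and added back onto the base model.
```

Note how the second coordinate, where the two trimmed task vectors disagree in sign with equal magnitude, is zeroed out rather than averaged; this sign-conflict resolution is what distinguishes TIES from a plain linear merge.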