
An experimental distillation of BART on the German version of the Babelscape/rebel-dataset, built with GILDA. Currently, the model is unhinged, and using it is not for the faint of heart. Proceed at your own risk, and be explicitly warned: if you are not equipped to navigate the complexities of a German soul beset by the existential angst of Weltschmerz, do not complain when it explodes in your face. Mishandling this deeply troubled and profoundly philosophical mindset can have catastrophic consequences; tread with extreme caution, lest you fall prey to the model's erratic and potentially devastating behavior.
The model is optimized toward 1/4b QAT (quantization-aware training) but has not been trained yet.
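Once a trained checkpoint is published, it should load like any other BART seq2seq checkpoint. A minimal sketch, assuming the standard transformers classes apply (the base model is facebook/bart-large-cnn); the German input sentence is only an illustration, not the actual prompting format:

```python
# Minimal loading sketch, assuming the checkpoint follows the standard
# BART seq2seq layout inherited from facebook/bart-large-cnn.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

repo_id = "hideosnes/Bart-T2T-Distill_GildaBot"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)

# Illustrative German input; the expected input format depends on training.
text = "Goethe wurde 1749 in Frankfurt am Main geboren."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```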
Base model: facebook/bart-large-cnn