DavidAU/L3.1-MOE-2X8B-Deepseek-DeepHermes-e32-uncensored-abliterated-13.7B-gguf (Text Generation, 14B)
DavidAU/L3.1-MOE-2X8B-Deepseek-DeepHermes-e32-13.7B-gguf (Text Generation, 14B)
DavidAU/Qwen2.5-MOE-6x1.5B-DeepSeek-Reasoning-e32-8.71B-gguf (Text Generation, 9B)
DavidAU/How-To-Set-and-Manage-MOE-Mix-of-Experts-Model-Activation-of-Experts (Text Generation; a how-to on setting the number of active experts in these MoE models, see the sketch after this list)
DavidAU/Qwen2.5-MOE-2X1.5B-DeepSeek-Uncensored-Censored-4B-gguf (Text Generation, 4B)
DavidAU/Qwen2.5-MOE-2X7B-DeepSeek-Abliterated-Censored-19B-gguf (Text Generation, 19B)
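All of these repos ship GGUF quants of mixture-of-experts merges, and the how-to repo in the list covers changing how many experts fire per token. The sketch below is a rough illustration only: it assumes llama-cpp-python is installed, uses a hypothetical local quant filename, and guesses that the active-expert count lives under the GGUF metadata key `llama.expert_used_count` (the exact key depends on the model's architecture, so check the how-to repo and the model card before relying on it).

```python
# Minimal sketch: load a GGUF MoE quant and override the number of experts
# activated per token. Filename and metadata key are assumptions, not taken
# from the model cards above.
from llama_cpp import Llama

llm = Llama(
    model_path="L3.1-MOE-2X8B-Deepseek-DeepHermes-e32-13.7B-Q4_K_M.gguf",  # hypothetical local file
    n_ctx=8192,
    n_gpu_layers=-1,  # offload all layers if VRAM allows
    # Assumed GGUF key; raises the number of experts used per token to 2.
    kv_overrides={"llama.expert_used_count": 2},
)

out = llm("Explain mixture-of-experts routing in two sentences.", max_tokens=128)
print(out["choices"][0]["text"])
```

The same override can typically be applied from the llama.cpp command line via its key-value override option, which is how per-run expert tuning is usually done without re-quantizing the model.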