AlphaMoE Collection Foundational Mixture-of-Experts model architecture by aquif, offering state-of-the-art performance and economical training. • 1 item • Updated 4 days ago
aquif-3.6 Collection Based on aquif-3.5, aquif-3.6-8B brings more efficient and powerful automatic reasoning. • 1 item • Updated 6 days ago
aquif-moe Collection Experiments with Mixture-of-Experts architectures, the predecessor to aquif-3-moe. • 2 items • Updated 21 days ago
aquif-3 Collection The most popular series of models by aquif, which started in March 2025. • 5 items • Updated 21 days ago
aquif-3.5 Collection The latest series of models from aquif, released in late August 2025. • 5 items • Updated 21 days ago