Davidsamuel101's Collections

MOE's

updated Feb 6, 2024
Upvote: 1

  • BlackMamba: Mixture of Experts for State-Space Models

    Paper • arXiv:2402.01771 • Published Feb 1, 2024 • 26 upvotes

  • OpenMoE: An Early Effort on Open Mixture-of-Experts Language Models

    Paper • arXiv:2402.01739 • Published Jan 29, 2024 • 29 upvotes

  • DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models

    Paper • arXiv:2401.06066 • Published Jan 11, 2024 • 55 upvotes