Hanjiang1998's Collections: MOE

updated 5 days ago
  • MoHETS: Long-term Time Series Forecasting with Mixture-of-Heterogeneous-Experts

    Paper • 2601.21866 • Published Jan 29 • 1

  • Let Experts Feel Uncertainty: A Multi-Expert Label Distribution Approach to Probabilistic Time Series Forecasting

    Paper • 2602.04678 • Published Feb 4

  • A Hybrid Tensor-Expert-Data Parallelism Approach to Optimize Mixture-of-Experts Training

    Paper • 2303.06318 • Published Mar 11, 2023 • 1

  • MoEC: Mixture of Expert Clusters

    Paper • 2207.09094 • Published Jul 19, 2022 • 1

  • Seg-MoE: Multi-Resolution Segment-wise Mixture-of-Experts for Time Series Forecasting Transformers

    Paper • 2601.21641 • Published Jan 29 • 1

  • Small but Mighty: Enhancing Time Series Forecasting with Lightweight LLMs

    Paper • 2503.03594 • Published Mar 5, 2025