Mixture of Experts
Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts
Paper • 2409.16040 • Published Sep 24, 2024 • 16
Useful Models
alaa-lab/InstructCV • Image-to-Image • Updated Feb 19, 2024 • 4 • 10
Vision-CAIR/vicuna-7b • Text Generation • Updated May 22, 2023 • 1k • 24
shibing624/ziya-llama-13b-medical-merged • Text Generation • Updated Feb 19, 2024 • 7 • 26
chaoyi-wu/PMC_LLAMA_7B • Text Generation • Updated May 17, 2023 • 619 • 65