wangrongsheng/Aurora
Text Generation • Updated • 20
Aurora: Activating Chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning
Paper • 2312.14557 • Published
wangrongsheng/Aurora-Plus
Text Generation • Updated • 3
wangrongsheng/Aurora-dpo
Text Generation • Updated • 1
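The Aurora models above are listed under the Text Generation task. As a minimal sketch of how such a checkpoint is typically loaded (assuming the standard transformers causal-LM API; the dtype, device settings, prompt, and generation parameters below are illustrative assumptions, not instructions from the model card):

```python
# Minimal sketch: loading the Aurora checkpoint with the standard
# transformers causal-LM API. The model ID comes from the listing above;
# hardware settings and the prompt are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "wangrongsheng/Aurora"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # Mixtral-8x7B-sized weights; adjust to available hardware
    device_map="auto",
)

# Chinese chat prompt, reflecting the paper's stated focus
prompt = "你好，请介绍一下你自己。"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```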
wangrongsheng
AI & ML interests
None yet
Recent Activity
upvoted a paper 2 days ago: QFFT, Question-Free Fine-Tuning for Adaptive Reasoning
upvoted a paper 4 days ago: LlamaFactory: Unified Efficient Fine-Tuning of 100+ Language Models
liked a model 4 days ago: MiniMaxAI/MiniMax-M1-40k