
A MoE (mixture-of-experts) model built on top of Qwen1.5-7B-Chat, Qwen1.5-7B, and Crystalcareai/CrystalQwen-1.5-7B, then fine-tuned with QLoRA via MLX.

```bash
pip install mlx-moe
python -m mlx_moe.generate --model mzbac/qwen-1.5-2x3-sft-hf-4bit-mlx --prompt "how backpropagation works?" --eos-token "<|im_end|>"
```
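
The `<|im_end|>` token passed via `--eos-token` is ChatML's end-of-turn marker, which Qwen1.5 chat models are trained on. As an illustrative sketch (not part of this card), here is how a single-turn prompt would be laid out in ChatML if you were to format it by hand:

```python
# Illustrative sketch: ChatML layout for a single-turn Qwen1.5 chat prompt.
# <|im_start|>/<|im_end|> delimit turns; generation stops at <|im_end|>,
# which is why the CLI call above passes --eos-token "<|im_end|>".

def chatml_prompt(user_message: str) -> str:
    """Wrap a user message in ChatML and open the assistant turn."""
    return (
        "<|im_start|>user\n"
        f"{user_message}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

print(chatml_prompt("how backpropagation works?"))
```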
Model size: 3.09B parameters (Safetensors; tensor types F16, U32).