# Qwen Omni Hugging Face Inference Endpoint Handler
This directory contains a reusable custom handler for deploying Qwen 3 Omni models (via the Hugging Face Inference Endpoints service). The handler mirrors the multi-modal interaction blueprint from the official Qwen audio/visual dialogue cookbook and supports text, image, and audio turns in a single payload.
## Files

- `handler.py`: entry-point loaded by the Inference Endpoint runtime.
- `requirements.txt`: Python dependencies installed before the handler is imported.
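The actual implementation lives in `handler.py`; as a rough, non-authoritative sketch, a custom handler for Inference Endpoints exposes an `EndpointHandler` class with the interface shown below. The model-loading details and the use of a `MODEL_ID` environment variable here are assumptions for illustration; the real file is the source of truth.

```python
# Illustrative sketch of the custom-handler interface only; handler.py in this
# repository is authoritative for model loading and payload handling.
import os
from typing import Any, Dict


class EndpointHandler:
    """Shape of the class Hugging Face Inference Endpoints loads from handler.py."""

    def __init__(self, path: str = "") -> None:
        # Assumed: the real handler lets MODEL_ID override the checkpoint path.
        self.model_id = os.environ.get("MODEL_ID", path)
        # Loading of the Qwen Omni model and processor is elided in this sketch.

    def __call__(self, data: Dict[str, Any]) -> Dict[str, Any]:
        # `data` is the parsed JSON body of the request; Inference Endpoints
        # conventionally places the payload under the "inputs" key.
        inputs = data.get("inputs", data)
        # Multi-modal generation is elided; a JSON-serialisable dict is returned.
        return {"generated_text": "", "echo": inputs}
```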
## Usage

- Upload the contents of this directory (`handler.py`, `requirements.txt`) to a Hugging Face model repository that you control (defaults to `GrandMasterPomidor/qwen-omni-endpoint-handler` via the provided Makefile).
- Provision a custom Inference Endpoint that references that repository and the Qwen Omni model weights you wish to serve. Set environment variables such as `MODEL_ID` to point at your chosen checkpoint (e.g. `Qwen/Qwen2.5-Omni-Mini`).
- Send JSON payloads to the endpoint as documented in the header docstring of `handler.py`; a hedged request sketch follows this section.

Refer to the accompanying `Makefile` for convenience targets to package and push these assets.
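For illustration only, a request carrying a mixed text-and-image turn might look roughly like the snippet below. The payload field names (`messages`, `content`, `type`) are assumptions modelled on the Qwen chat format, and the URL and token are placeholders; the authoritative schema is the docstring at the top of `handler.py`.

```python
# Hypothetical payload shape; consult handler.py's docstring for the real schema.
import requests

ENDPOINT_URL = "https://<your-endpoint>.endpoints.huggingface.cloud"  # placeholder
HF_TOKEN = "hf_..."  # your Hugging Face access token

payload = {
    "inputs": {
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": "Describe this picture."},
                    {"type": "image", "image": "https://example.com/cat.png"},
                ],
            }
        ]
    }
}

resp = requests.post(
    ENDPOINT_URL,
    headers={"Authorization": f"Bearer {HF_TOKEN}"},
    json=payload,
    timeout=120,
)
print(resp.json())
```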