ZeroGPU AoTI (community)

Optimized demos for Wan 2.2 14B models, using FP8 quantization, AoT compilation, and community LoRAs for fast, high-quality inference on ZeroGPU.

AI & ML interests: AoT compilation, ZeroGPU inference optimization
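The description above names the core recipe: FP8 quantization plus ahead-of-time (AoT) compilation. The sketch below illustrates that recipe on a tiny stand-in module using torchao and PyTorch's AOTInductor packaging APIs. It is an assumption-laden illustration, not the Spaces' actual code: the stand-in module, shapes, and dtypes are invented, it needs a recent PyTorch (2.6+), torchao, and an FP8-capable CUDA GPU, and the real demos likely go through ZeroGPU-specific helpers in the `spaces` package rather than calling these APIs directly.

```python
# A minimal sketch of the FP8 + AoT recipe on a stand-in module (not the real
# Wan 2.2 transformer). Assumes PyTorch >= 2.6, torchao, and an FP8-capable
# CUDA GPU; all names and shapes here are illustrative.
import torch
import torch._inductor
from torch import nn
from torchao.quantization import quantize_, float8_dynamic_activation_float8_weight


class TinyBlock(nn.Module):
    """Stand-in for one feed-forward block of a diffusion transformer."""

    def __init__(self, dim: int = 2048):
        super().__init__()
        self.ff = nn.Sequential(nn.Linear(dim, dim * 4), nn.GELU(), nn.Linear(dim * 4, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.ff(x)


model = TinyBlock().to(device="cuda", dtype=torch.bfloat16).eval()

# 1) FP8 quantization: swap Linear layers to dynamic FP8 activation / FP8 weight kernels.
quantize_(model, float8_dynamic_activation_float8_weight())

# 2) AoT compilation: export once with example inputs and compile to a .pt2 package.
#    The package can be cached or uploaded so ZeroGPU workers skip JIT compilation.
example = torch.randn(1, 512, 2048, device="cuda", dtype=torch.bfloat16)
exported = torch.export.export(model, (example,))
package_path = torch._inductor.aoti_compile_and_package(exported)

# 3) Load the compiled artifact and run it in place of the eager module.
compiled = torch._inductor.aoti_load_package(package_path)
print(compiled(example).shape)
```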
Spaces (7)
FLUX.1 Kontext fp8 ⚡ (pinned · Running on Zero · MCP)
Kontext image editing on FLUX.1 [dev]

FLUX.1 Kontext ⚡ (pinned · Running on Zero · MCP · 1 like)
Kontext image editing on FLUX.1 [dev]

Wan2.2 14B Fast 🔥 (Running on Zero · MCP · 27 likes)
Generate a video from an image with a text prompt

FA3 Builder (Running on Zero)
Build and download a FlashAttention-3 wheel

Wan 2.2 14B Fast T2V 🔥 (Running on Zero · MCP · 2 likes)
Generate a video from a text prompt

Wan 2.2 14B Image 📸 (Running on Zero · MCP · 2 likes)
Generate images from a text prompt with Wan 2.2 T2V
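These Spaces can also be called programmatically from Python. The snippet below is a hedged sketch using gradio_client; the Space id, endpoint name, and parameter names are assumptions, so consult the target Space's "Use via API" panel for the actual signature.

```python
# Hypothetical client-side call to one of the video Spaces via gradio_client.
# The Space id, api_name, and parameter names are assumptions; the Space's
# "Use via API" page shows the real endpoint signature.
from gradio_client import Client, handle_file

client = Client("zerogpu-aoti/wan2-2-fp8-aoti")   # assumed Space id
result = client.predict(
    input_image=handle_file("cat.png"),           # assumed parameter name
    prompt="the cat slowly turns its head",       # assumed parameter name
    api_name="/generate_video",                   # assumed endpoint name
)
print(result)  # usually a local path to the generated video
```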
Models (0)
None public yet

Datasets (0)
None public yet