Work in progress

The model files are almost identical to our t5sdxl-v0-bf16 model. However, its model_config.json has been adjusted so that it works with new code that will be going into the "community" diffusers pipeline area.

Alternatively, there is now a demo.py script that can use the diffusers pipeline style relatively cleanly, AS-IS!

Precision

Note that the unet is, sadly, only bf16 at this time, since we only have 4090s.
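For context on what bf16-only weights imply: bfloat16 is float32 with the low 16 mantissa bits dropped, so it keeps fp32's full exponent range but only 7 explicit mantissa bits. A minimal stdlib-only sketch of that truncation (illustrative only, not part of this repo's code):

```python
import struct

def to_bf16(x: float) -> float:
    # bfloat16 = the top 16 bits of a float32: same 8-bit exponent
    # (so the same dynamic range as fp32), but only 7 explicit
    # mantissa bits. This sketch truncates rather than rounding.
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    return struct.unpack("<f", struct.pack("<I", bits & 0xFFFF0000))[0]

print(to_bf16(3.141592653589793))  # 3.140625: pi at bf16 precision
print(to_bf16(1.0))                # 1.0: exactly representable
```

The wide exponent is why bf16 training avoids the overflow issues fp16 can hit, at the cost of mantissa precision.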

Usage

You can use it with the sample code in demo.py.

Forward notes

Hmm... in retrospect, perhaps it would be better to use chatpig/t5-v1_1-xl-encoder-gguf instead of the xxl version.

That model is natively 2048-dim, so there would be no need for a projection layer.
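To illustrate the dimension mismatch: t5-v1_1-xxl emits 4096-dim hidden states, while SDXL's text conditioning is 2048-dim, so the xxl route needs a learned projection; t5-v1_1-xl's hidden size is already 2048. A minimal torch sketch (shapes taken from the published T5 v1.1 configs; the 77-token sequence length is just an example):

```python
import torch
import torch.nn as nn

# t5-v1_1-xxl hidden states are 4096-dim, so mapping them into SDXL's
# 2048-dim text-conditioning space needs an extra trained projection.
xxl_hidden = torch.randn(1, 77, 4096)   # (batch, tokens, d_model) from xxl
projection = nn.Linear(4096, 2048)      # the extra weights to train
projected = projection(xxl_hidden)
print(projected.shape)                  # torch.Size([1, 77, 2048])

# t5-v1_1-xl hidden states are already 2048-dim, matching SDXL directly,
# so the projection layer (and its training) could be dropped entirely.
xl_hidden = torch.randn(1, 77, 2048)
print(xl_hidden.shape[-1])              # 2048
```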
