gguf quantized and fp8 scaled versions of illustrious (test pack)
setup (in general)
- drag gguf file(s) to diffusion_models folder (./ComfyUI/models/diffusion_models)
- drag clip or encoder(s), i.e., illustrious_g_clip and illustrious_l_clip, to text_encoders folder (./ComfyUI/models/text_encoders)
- drag vae decoder(s), i.e., vae, to vae folder (./ComfyUI/models/vae)
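the placement steps above can be sketched as a tiny helper; the folder names follow a default comfyui install, and `destination_for` is a hypothetical name for illustration only

```python
import os

# default ComfyUI model root (assumed; adjust to your install)
COMFY_ROOT = "./ComfyUI/models"

def destination_for(filename: str) -> str:
    """Map a downloaded file to the folder named in the setup steps above."""
    name = filename.lower()
    if name.endswith(".gguf"):
        sub = "diffusion_models"   # quantized diffusion model(s)
    elif "clip" in name:
        sub = "text_encoders"      # e.g. illustrious_g_clip, illustrious_l_clip
    elif "vae" in name:
        sub = "vae"                # vae decoder(s)
    else:
        raise ValueError(f"no known destination for {filename}")
    return os.path.join(COMFY_ROOT, sub, filename)

for f in ("illustrious-q4_0.gguf", "illustrious_l_clip.safetensors", "vae.safetensors"):
    print(f, "->", destination_for(f))
```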
run it straight (no installation needed)
- get the comfy pack with the new gguf-node (pack)
- run the .bat file in the main directory
workflow
- drag any workflow json file into the active browser window; or
- drag any generated output file (e.g., picture, video, etc., which contains the workflow metadata) into the active browser window
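the workflow metadata mentioned above lives inside the generated png itself (comfyui stores it in a `tEXt` chunk, typically keyed `workflow`); below is a stdlib-only sketch that pulls it back out, demonstrated on a small synthetic png built the same way

```python
import json
import struct
import zlib

PNG_SIG = b"\x89PNG\r\n\x1a\n"

def png_text_chunks(data: bytes) -> dict:
    """Return {keyword: text} for every tEXt chunk in a PNG byte string."""
    assert data[:8] == PNG_SIG, "not a PNG file"
    out, pos = {}, 8
    while pos < len(data):
        length, ctype = struct.unpack(">I4s", data[pos:pos + 8])
        body = data[pos + 8:pos + 8 + length]
        if ctype == b"tEXt":
            key, _, text = body.partition(b"\x00")
            out[key.decode("latin-1")] = text.decode("latin-1")
        pos += 12 + length  # 4 length + 4 type + data + 4 CRC
    return out

def make_chunk(ctype: bytes, body: bytes) -> bytes:
    """Assemble one PNG chunk: length, type, data, CRC over type+data."""
    return (struct.pack(">I", len(body)) + ctype + body
            + struct.pack(">I", zlib.crc32(ctype + body)))

# synthetic PNG carrying a "workflow" tEXt chunk, the way ComfyUI
# embeds workflow metadata in its generated images
workflow = json.dumps({"nodes": []})
png = (PNG_SIG
       + make_chunk(b"tEXt", b"workflow\x00" + workflow.encode("latin-1"))
       + make_chunk(b"IEND", b""))

print(png_text_chunks(png)["workflow"])  # -> {"nodes": []}
```

this is handy if you want the workflow json out of an image without opening the browser at all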
review
- use tag/word(s) as input for more accurate results with these legacy models; not very convenient (compared to recent models) at the very beginning
- credits should be given to the contributors from the civitai platform
- fast-illustrious gguf was quantized from the fp8 scaled safetensors while illustrious gguf was quantized from the original bf16 (this is just an attempt to test a question: does the trimmed model, with 50% fewer tensors, really load faster? please test it yourself; btw, some models might have a unique structure/feature affecting loader performance, so it's never one size fits all)
- fp8 scaled files work fine with this model, including the vae and clips
- good to run on old machines, e.g., the 9xx series or earlier (legacy mode [--disable-cuda-malloc --lowvram] supported); compatible with the new gguf-node
- disclaimer: some models (original files) were provided by others, and we might not be able to identify the creator/contributor(s) behind them unless specified in the source; we would rather leave the credit blank than write anonymous/unnamed/unknown; if it is your work, do let us know and we will credit it properly; thanks for everything
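to test the load-speed question above yourself, a minimal timing wrapper is enough; `timed` is a hypothetical helper, and the workload below is a stand-in for whatever loader call your setup actually uses

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn and return (result, elapsed seconds) using a monotonic clock."""
    t0 = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - t0

# stand-in workload; swap in your own loader call, e.g. once for the
# trimmed gguf and once for the full one, and compare the two timings
result, seconds = timed(sum, range(1_000_000))
print(f"finished in {seconds:.3f}s")
```

run each variant a few times and compare medians, since disk caching makes the first load of any file slower than repeats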
reference
- wai creator
- comfyui comfyanonymous
- gguf-node (pypi|repo|pack)
model tree for calcuis/illustrious
- base model: KBlueLeaf/kohaku-xl-beta5