Not Working

#1 opened by rishabh1227

I'm getting an error about the weights in UnetLoaderGGUF:
'conv_in.weight'

Maybe my ComfyUI is not updated?

Works for me on a fresh manual install of ComfyUI.

With Q8_0:

$ python main.py
... skip a lot ...
got prompt
Using pytorch attention in VAE
Using pytorch attention in VAE
VAE load device: cuda:0, offload device: cpu, dtype: torch.bfloat16
CLIP/text encoder model load device: cuda:0, offload device: cpu, current: cpu, dtype: torch.float16
Requested to load CosmosTEModel_
loaded completely 22628.175 9278.853515625 True
gguf qtypes: BF16 (6), F32 (148), Q8_0 (576)
model weight dtype torch.bfloat16, manual cast: None
model_type FLOW_COSMOS
unet unexpected: ['pos_embedder.seq', 'pos_embedder.dim_spatial_range', 'pos_embedder.dim_temporal_range']
Requested to load CosmosPredict2
loaded completely 16788.0498588562 14702.583984375 True
100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 35/35 [02:27<00:00,  4.22s/it]
Requested to load WanVAE
loaded completely 285.7851753234863 242.02829551696777 True
Prompt executed in 163.22 seconds

File "C:\ComfyUI_windows_portable\ComfyUI\execution.py", line 349, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI_windows_portable\ComfyUI\execution.py", line 224, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI_windows_portable\ComfyUI\execution.py", line 196, in _map_node_over_list
process_inputs(input_dict, i)
File "C:\ComfyUI_windows_portable\ComfyUI\execution.py", line 185, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-GGUF\nodes.py", line 158, in load_unet
raise RuntimeError("ERROR: Could not detect model type of: {}".format(unet_path))
RuntimeError: ERROR: Could not detect model type of: C:\ComfyUI_windows_portable\ComfyUI\models\unet\cosmos-predict2-14b-text2image-Q8_0.gguf
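If you want to rule out a broken download, you can inspect what the .gguf file actually contains; as far as I understand, the loader's model-type detection works off the GGUF metadata and the tensor names. A minimal sketch, assuming the gguf Python package is installed in the same environment (ComfyUI-GGUF lists it as a requirement):

from gguf import GGUFReader

# Path taken from the error message above; adjust to your own file.
path = r"C:\ComfyUI_windows_portable\ComfyUI\models\unet\cosmos-predict2-14b-text2image-Q8_0.gguf"
reader = GGUFReader(path)

# Print the metadata keys (e.g. general.architecture) and the first few
# tensor names; if these look sane, the problem is on the ComfyUI side
# rather than in the file itself.
print("metadata keys:", list(reader.fields.keys()))
for tensor in reader.tensors[:10]:
    print(tensor.name, tensor.tensor_type.name, tensor.shape)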

The Q8 works now, but Q3 and Q4 are not working for me.

File "C:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-GGUF\nodes.py", line 158, in load_unet

At first glance, it looks like ComfyUI does not understand what the Cosmos-Predict2 model is.

You probably need a ComfyUI update; support was only merged three days ago, and some fixes were merged two days ago as well:

https://github.com/comfyanonymous/ComfyUI/pull/8517
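If you want to check without re-running a workflow, you can test whether your checkout already has the Cosmos-Predict2 model class; the "Requested to load CosmosPredict2" line in the working log above suggests it lives in comfy.model_base. A rough sketch under that assumption, run from inside the ComfyUI folder with ComfyUI's own Python environment:

import importlib

# Import ComfyUI's model definitions (requires running from the ComfyUI
# directory, or having it on PYTHONPATH).
model_base = importlib.import_module("comfy.model_base")

# False means the checkout predates Cosmos-Predict2 support and needs an
# update (git pull, or update\update_comfyui.bat for the portable build).
print("CosmosPredict2 support:", hasattr(model_base, "CosmosPredict2"))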

The Q8 works now, but Q3 and Q4 are not working for me.

Ok, good that it worked.
