runtime error
Exit code: 1. Reason: .39.attn2.add_v_proj.lora_B.causvid_lora.bias, blocks.39.attn2.to_out.0.lora_B.causvid_lora.bias, blocks.39.ffn.net.0.proj.lora_B.causvid_lora.bias, blocks.39.ffn.net.2.lora_B.causvid_lora.bias, proj_out.lora_B.causvid_lora.bias.

Traceback (most recent call last):
  File "/home/user/app/app.py", line 29, in <module>
    pipe.fuse_lora()
  File "/usr/local/lib/python3.10/site-packages/diffusers/loaders/lora_pipeline.py", line 5430, in fuse_lora
    super().fuse_lora(
  File "/usr/local/lib/python3.10/site-packages/diffusers/loaders/lora_base.py", line 608, in fuse_lora
    model.fuse_lora(lora_scale, safe_fusing=safe_fusing, adapter_names=adapter_names)
  File "/usr/local/lib/python3.10/site-packages/diffusers/loaders/peft.py", line 659, in fuse_lora
    self.apply(partial(self._fuse_lora_apply, adapter_names=adapter_names))
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1029, in apply
    module.apply(fn)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1029, in apply
    module.apply(fn)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1029, in apply
    module.apply(fn)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1030, in apply
    fn(self)
  File "/usr/local/lib/python3.10/site-packages/diffusers/loaders/peft.py", line 681, in _fuse_lora_apply
    module.merge(**merge_kwargs)
  File "/usr/local/lib/python3.10/site-packages/peft/tuners/lora/layer.py", line 676, in merge
    delta_weight = self.get_delta_weight(active_adapter)
  File "/usr/local/lib/python3.10/site-packages/peft/tuners/lora/layer.py", line 731, in get_delta_weight
    output_tensor = transpose(weight_B @ weight_A, self.fan_in_fan_out) * self.scaling[adapter]
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/torch/patching.py", line 220, in __torch_function__
    raise RuntimeError(
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 (ZeroGPU) and cpu!
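The telling frames are the module-level `pipe.fuse_lora()` call at line 29 of `app.py` and the `spaces/zero/torch/patching.py` frame at the bottom: on ZeroGPU the base model's weights have already been moved to `cuda:0`, while the freshly loaded CausVid LoRA matrices are still on CPU, so the `weight_B @ weight_A` merge mixes devices. A common workaround is to load and fuse the LoRA while everything is still on CPU and only move the pipeline to the GPU afterwards. Below is a minimal sketch of that ordering; the model ID, LoRA repository, adapter name, and `generate()` signature are assumptions for illustration, not taken from the failing app.

```python
# Sketch of the usual ordering fix on ZeroGPU (model/LoRA IDs are placeholders).
import spaces
import torch
from diffusers import DiffusionPipeline

# Load everything on CPU first so the base weights and the LoRA share a device.
pipe = DiffusionPipeline.from_pretrained(
    "Wan-AI/Wan2.1-T2V-14B-Diffusers",  # assumed base model
    torch_dtype=torch.bfloat16,
)
pipe.load_lora_weights(
    "some-user/causvid-lora",           # assumed LoRA repo
    adapter_name="causvid_lora",
)
pipe.fuse_lora()   # merge while both sets of weights are still on CPU
pipe.to("cuda")    # only now hand the fused model to the ZeroGPU device

@spaces.GPU
def generate(prompt: str):
    # On ZeroGPU the GPU is attached only inside this decorated call.
    result = pipe(prompt=prompt)
    return result.frames[0]  # output attribute depends on the pipeline type
```

Alternatively, keep `pipe.to("cuda")` where it is and move the `load_lora_weights()` / `fuse_lora()` calls before it; the point is simply that fusion must not happen while the adapter and the base weights live on different devices.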