Question about clamping the LoRA weights

#1
by woctordho - opened

Hi, just a quick check: Did you clamp the LoRA weights when extracting?

It seems that clamping is a common practice, but it can cause quality loss; see https://github.com/kijai/ComfyUI-FluxTrainer/issues/183

Yes, I extracted the LoRAs with the clamp activated, but I thought it was only used with the quantile type. In any case, I am re-extracting the LoRAs from the original model with this option disabled. Thanks for letting me know.
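For readers unfamiliar with the step being discussed: quantile clamping typically happens after the SVD step of LoRA extraction, where the factor matrices are clipped to a quantile of their absolute values. A minimal NumPy sketch of the idea follows; the function name, signature, and clamping placement are illustrative assumptions, not the exact code of any particular extractor.

```python
import numpy as np

def extract_lora(delta_w, rank, clamp_quantile=None):
    """Illustrative low-rank (SVD) extraction of a LoRA pair from a
    weight difference delta_w = W_finetuned - W_base.

    If clamp_quantile is set, both factor matrices are clipped to that
    quantile of their combined absolute values -- the step discussed
    above, which flattens the largest (often most important) entries.
    This is a sketch, not the exact logic of any specific tool.
    """
    U, S, Vh = np.linalg.svd(delta_w, full_matrices=False)
    U = U[:, :rank] * S[:rank]   # fold singular values into U
    Vh = Vh[:rank, :]
    if clamp_quantile is not None:
        hi = np.quantile(
            np.abs(np.concatenate([U.ravel(), Vh.ravel()])),
            clamp_quantile,
        )
        U = np.clip(U, -hi, hi)   # large entries are clamped here
        Vh = np.clip(Vh, -hi, hi)
    return U, Vh

rng = np.random.default_rng(0)
dw = rng.normal(size=(64, 64))
U, Vh = extract_lora(dw, rank=8)                        # no clamping
Uc, Vc = extract_lora(dw, rank=8, clamp_quantile=0.99)  # clamped
```

The clamped factors have a strictly smaller maximum magnitude, so any especially large entries of the low-rank approximation are lost, which is one plausible mechanism for the quality loss reported in the linked issue.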

This explains why I had so many difficulties extracting good LoRAs recently. Thank you!