Merged jukofyork/qwq-32b-writer-multiplicative-lora into Qwen/QwQ-32B using jukofyork/merge-lora.
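
For reference, below is a minimal sketch of what a multiplicative LoRA merge does to a single weight matrix, compared with the usual additive merge. The exact formulation used by jukofyork/merge-lora is an assumption here: "multiplicative" is taken to mean the low-rank update is composed with the base weight (W' = W + scale · B A W) rather than simply added to it (W' = W + scale · B A). The helper names and toy shapes are illustrative only.

```python
# Sketch only: the precise multiplicative formulation in jukofyork/merge-lora
# is assumed, not confirmed from its source.
import torch

def merge_lora_additive(W: torch.Tensor, A: torch.Tensor, B: torch.Tensor,
                        scale: float = 1.0) -> torch.Tensor:
    # Standard LoRA merge: add the low-rank delta to the frozen weight.
    return W + scale * (B @ A)

def merge_lora_multiplicative(W: torch.Tensor, A: torch.Tensor, B: torch.Tensor,
                              scale: float = 1.0) -> torch.Tensor:
    # Assumed multiplicative merge: the low-rank delta acts on the existing
    # weight, i.e. W' = (I + scale * B A) W, instead of being added beside it.
    return W + scale * (B @ A) @ W

if __name__ == "__main__":
    d, r = 64, 8                                   # toy hidden size and LoRA rank
    W = torch.randn(d, d, dtype=torch.bfloat16)    # frozen base weight
    A = torch.randn(r, d, dtype=torch.bfloat16)    # LoRA "down" projection
    B = torch.randn(d, r, dtype=torch.bfloat16)    # LoRA "up" projection
    W_merged = merge_lora_multiplicative(W, A, B, scale=1.0)
    print(W_merged.shape)                          # torch.Size([64, 64])
```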

Untested so far, but the merge appears to have completed successfully:

✓ Successfully merged and uploaded model!
Model URL: https://huggingface.co/jukofyork/qwq-32b-writer
Merge mode: Multiplicative
Scale factor: 1
Processed 14 shards
Merged 62 layers with LoRA weights
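
Since the merged model is untested, treat the following as a sketch rather than a verified recipe: it should load through the standard transformers API like any other QwQ-32B checkpoint (repo name taken from the merge log above; prompt and generation settings are illustrative).

```python
# Sketch of loading the merged model with Hugging Face transformers.
# Assumes hardware with enough memory for a ~33B-parameter BF16 model;
# adjust device_map or add quantization to fit smaller setups.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jukofyork/qwq-32b-writer"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # the uploaded weights are BF16
    device_map="auto",
)

messages = [{"role": "user",
             "content": "Write the opening paragraph of a gothic short story."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```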