Context increase

#1
by kanguru - opened

Love your work, David! You mention that the context is 32k but can be extended to 128k. How would I do that in LM Studio? Would I change both RoPE Base and RoPE Scaling, or just one of them?

Owner

Hey,

I don't know if you can do this (correctly) with RoPE in LM Studio.
I modified the source files, then quanted from those as per the directions at Qwen's repo, plus some minor adjustments.

That being said, you can use llama-server.exe with YaRN adjustments (a different form of RoPE scaling), so you may also be able to adjust context with RoPE in LM Studio:
- RoPE base context 32,768; scale .5 (to make it 65k)
- scale .25 => 128k
- scale .125 => 256k
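For reference, a minimal sketch of what a YaRN-scaled launch could look like with llama.cpp's llama-server. The flag names below are my assumptions from llama.cpp's options, and the model filename is a placeholder; check `llama-server --help` on your build before relying on them:

```shell
# Assumed llama.cpp flags (verify against your build):
#   --rope-scaling yarn    select YaRN rather than linear RoPE scaling
#   --rope-scale 4         4x the original window (32k -> 128k)
#   --yarn-orig-ctx 32768  the model's native training context
#   -c 131072              request the full 128k context window
llama-server -m ./your-qwen-quant.gguf \
  --rope-scaling yarn --rope-scale 4 --yarn-orig-ctx 32768 \
  -c 131072
```

For 65k or 256k, swap in `--rope-scale 2` with `-c 65536`, or `--rope-scale 8` with `-c 262144`.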

Pretty sure there will be quants of all sizes with different contexts, as Unsloth is making them and I plan on making them too.
