
Wrong sampler settings

#11 by Garf

https://lmstudio.ai/models/openai/gpt-oss-20b lists the following sampler settings:

fields:
      - key: llm.prediction.temperature
        value: 0.8
      - key: llm.prediction.topKSampling
        value: 40
      - key: llm.prediction.topPSampling
        value:
          checked: true
          value: 0.8
      - key: llm.prediction.repeatPenalty
        value:
          checked: true
          value: 1.1
      - key: llm.prediction.minPSampling
        value:
          checked: true
          value: 0.05

However, OpenAI recommends temperature=1.0 and top_p=1.0 for this model, and min_p shouldn't be set either.
