from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "cnmoro/gpt-oss-20b-tokenizer-optional-reasoning"
)

gen_input = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Hello!"}],
    tokenize=False,
    add_generation_prompt=True, # Added automatically when reasoning is disabled; otherwise the default behavior applies
    reasoning_effort="disabled" # New option: disables the reasoning channel entirely
)
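
For comparison, below is a minimal sketch of the same call with reasoning left enabled, and of tokenizing the rendered prompt for generation. It assumes the tokenizer otherwise follows the stock gpt-oss chat template, where reasoning_effort accepts "low", "medium", or "high":

# Default behavior: the reasoning channel stays enabled and
# reasoning_effort takes the usual "low" / "medium" / "high" levels.
default_input = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Hello!"}],
    tokenize=False,
    add_generation_prompt=True,
    reasoning_effort="high",
)

print(gen_input)      # prompt rendered without the reasoning channel
print(default_input)  # prompt rendered with reasoning at high effort

# The rendered string is tokenized as usual before being passed to a model
model_inputs = tokenizer(gen_input, return_tensors="pt")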
Base model: openai/gpt-oss-20b (this model is a finetune of it).