Context length and reasoning length?

#6
by KrishnaKaasyap - opened

(attached image: model card screenshot)

As shown in the model card image here, can the model think for 48k tokens? For reference, o3-mini-high can think for up to 100k tokens but usually stops at 30k-40k, and Sonnet 3.7 Thinking is mostly reported at around 16k thinking tokens (though it can think for up to 64k).

Also, please share details about the model's native tool-calling ability!

