4096 context length - is that correct?
#2 by smcleod - opened
Just checking: is that perhaps a typo for 40960 or 409600?
Hey @smcleod, an extended-context version is forthcoming. Also note that our models were trained with RoPE, so they can accept inputs beyond 4K tokens even in their current state.
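For anyone curious why RoPE itself imposes no hard cutoff, here is a minimal NumPy sketch of rotary embeddings (illustrative only, not the model's actual implementation): each position is just a multiplier on a fixed set of frequencies, so the rotation is equally well defined at position 10,000 as at position 10.

```python
import numpy as np

def apply_rope(x: np.ndarray, positions: np.ndarray, base: float = 10000.0) -> np.ndarray:
    """Apply rotary position embeddings to x of shape (seq_len, dim)."""
    seq_len, dim = x.shape
    assert dim % 2 == 0, "RoPE rotates feature pairs, so dim must be even"
    # Per-pair frequencies: base^(-2i/dim), as in the RoPE paper.
    inv_freq = base ** (-np.arange(0, dim, 2) / dim)    # (dim/2,)
    angles = positions[:, None] * inv_freq[None, :]     # (seq_len, dim/2)
    cos, sin = np.cos(angles), np.sin(angles)
    # Rotate each (even, odd) feature pair by its position-dependent angle.
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

# The rotation is defined for any integer position, so nothing in the
# math breaks past a 4096-token training window (quality at long range
# is a separate, empirical question):
x = np.random.default_rng(0).standard_normal((2, 8))
short = apply_rope(x, np.array([0, 1]))
long = apply_rope(x, np.array([4095, 10000]))  # positions far beyond 4K
```

Because each pair is a pure rotation, vector norms are preserved at every position; extrapolation quality beyond the training window depends on what the attention layers learned, not on the embedding formula.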
Hi, thanks again for the inquiry! We're currently working through old tickets, so we're closing this one for now. If you need a follow-up response, please re-open and we will get back to you!
baileyk changed discussion status to closed