Adding context length var to config.json; this is the same var used in Bamba v2 9B

#6
by anakin004 - opened

This is useful for running the model in llama.cpp, which otherwise has trouble identifying the model's context length.

ibm-ai-platform org

OK, as discussed with @rganti, the max_length in Bamba v1 should indeed be 4K, same as v2. So merging.
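
For reference, the merged change presumably amounts to a single field in config.json; a minimal sketch, assuming the field is named max_length (as stated above and in the Bamba v2 9B config) and the value is 4096 tokens. The surrounding fields are illustrative placeholders, not the full config:

```json
{
  "architectures": ["BambaForCausalLM"],
  "model_type": "bamba",
  "max_length": 4096
}
```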

mirinflim changed pull request status to merged
