mpt-7b-8k-instruct-sharded-bf16-2GB / generation_config.json
{
  "_from_model_config": true,
  "transformers_version": "4.32.0.dev0",
  "use_cache": false
}
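
These fields are read by the transformers generation machinery when text is generated from the model. A minimal sketch of how this file is typically consumed, assuming the repo id czurita/mpt-7b-8k-instruct-sharded-bf16-2GB implied by the page title:

# Sketch only: loads this generation_config.json from the Hub and inspects
# the values stored in the file above. The repo id is assumed from the title.
from transformers import GenerationConfig

gen_config = GenerationConfig.from_pretrained(
    "czurita/mpt-7b-8k-instruct-sharded-bf16-2GB"
)

# use_cache is explicitly set to False in this file (default is True).
print(gen_config.use_cache)
# transformers_version records the library version that wrote the file.
print(gen_config.transformers_version)

The "_from_model_config" flag marks that these defaults were copied from the model's config rather than set by hand, and "use_cache": false disables the key/value cache during generation for this checkpoint.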