This model is assumed to perform well, but it may require more testing and user feedback. Be aware that only models featured within the GPT4All GUI are curated and officially supported by Nomic. Use at your own risk.
About
Model converted and quantized by: 3Simplex.
GPT4All v3.1.1 required.
Prompt Template
```
<|start_header_id|>system<|end_header_id|>
{system_prompt}<|eot_id|>
<|start_header_id|>user<|end_header_id|>
{user_input}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
{assistant_response}
```
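GPT4All fills this template in automatically. If you drive the GGUF file yourself (for example through llama.cpp), you have to build the prompt string by hand. A minimal sketch in Python, assuming a single system/user turn and the exact placeholder layout shown above; the example strings are placeholders, not part of the model card:

```python
# Build a single-turn Llama 3.1 style prompt from the template above.
system_prompt = "You are a helpful assistant."
user_input = "Summarize the plot of Hamlet in two sentences."

prompt = (
    "<|start_header_id|>system<|end_header_id|>\n"
    f"{system_prompt}<|eot_id|>\n"
    "<|start_header_id|>user<|end_header_id|>\n"
    f"{user_input}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n"
)
print(prompt)
```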
128k Context Length
"llama.context_length": 131072
Model tree for GPT4All-Community/Meta-Llama-3.1-8B-Instruct-128k-GGUF
- Base model: meta-llama/Llama-3.1-8B
- Fine-tuned: meta-llama/Llama-3.1-8B-Instruct