This open-source model, Ministral-8B-Instruct-2410, was created by Mistral AI.
You can find the release blog post here.
The model is available on the Hugging Face Hub: https://huggingface.co/mistralai/Ministral-8B-Instruct-2410.
The model has 8B parameters and supports a context window of up to 128K tokens.
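Since the weights are hosted on the Hugging Face Hub, they can be loaded with the `transformers` library. Below is a minimal sketch assuming `transformers` and `torch` are installed and the machine has enough memory for an 8B-parameter model; the prompt, dtype, and generation settings are illustrative assumptions, not official recommendations.

```python
# Hub repository from the link above.
MODEL_ID = "mistralai/Ministral-8B-Instruct-2410"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Run a single chat turn against the instruct model (sketch)."""
    # Imports are kept inside the function so the sketch can be read
    # without the heavy dependencies installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # assumed dtype; fits the model in ~16 GB
        device_map="auto",
    )

    # The instruct variant expects chat-formatted input.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, keep only the newly generated reply.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Calling `generate("What is the capital of France?")` downloads the weights on first use, so the initial call is slow; subsequent calls reuse the local cache.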