# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details
### Merge Method
This model was merged using the passthrough merge method, which stacks the selected layer ranges from each source model on top of one another without interpolating weights.
### Models Merged
The following models were included in the merge:
- mistralai/Codestral-22B-v0.1
- nvidia/Mistral-NeMo-Minitron-8B-Instruct
- mistralai/Mistral-Large-Instruct-2407
- mistralai/Mistral-Nemo-Base-2407
- mistralai/Mathstral-7B-v0.1
### Configuration
The following YAML configuration was used to produce this model:
```yaml
dtype: float16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 8]
    model: mistralai/Mistral-Large-Instruct-2407
- sources:
  - layer_range: [4, 12]
    model: nvidia/Mistral-NeMo-Minitron-8B-Instruct
- sources:
  - layer_range: [8, 16]
    model: mistralai/Mistral-Nemo-Base-2407
- sources:
  - layer_range: [12, 20]
    model: mistralai/Codestral-22B-v0.1
- sources:
  - layer_range: [16, 24]
    model: mistralai/Mathstral-7B-v0.1
```
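
A merge like this can be reproduced with the `mergekit-yaml` CLI or from Python. Below is a minimal sketch using mergekit's `run_merge` entry point, assuming the configuration above is saved as `config.yml`; the output path is a placeholder, and the exact `MergeOptions` fields available may vary by mergekit version.

```python
# Sketch: run the passthrough merge config above with mergekit's Python API.
# Assumes the YAML shown in this card is saved as `config.yml`;
# `./merged` is a hypothetical output directory.
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML into mergekit's validated config object
with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./merged",  # hypothetical output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # write a tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```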
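Once merged, the output directory loads like any Hugging Face causal language model. A minimal usage sketch, assuming the merge was written to `./merged` as in the example above:

```python
# Sketch: load and sample from the merged model with transformers.
# `./merged` is the hypothetical output directory from the merge step.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "./merged"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the dtype in the merge config
    device_map="auto",
)

prompt = "Write a function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```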