Chat Template
Mistral Instruct
{{ if .System }}[SYSTEM_PROMPT]{{ .System }}[/SYSTEM_PROMPT]{{ end }}{{ if .Prompt }}[INST]{{ .Prompt }}[/INST]{{ end }}{{ .Response }}
ChatML
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
{{ .Response }}{{ if .Response }}<|im_end|>{{ end }}
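Rendered out, the ChatML template above is just a concatenation of role-tagged blocks, each emitted only when its variable is set. A minimal Python sketch of that rendering (the helper name is illustrative, not part of any library):

```python
def build_chatml_prompt(system, prompt, response=""):
    # Assemble a ChatML prompt; each role block is emitted only when its
    # variable is non-empty, mirroring the {{ if }} guards in the template.
    parts = []
    if system:
        parts.append(f"<|im_start|>system\n{system}<|im_end|>\n")
    if prompt:
        parts.append(f"<|im_start|>user\n{prompt}<|im_end|>\n")
    parts.append(f"<|im_start|>assistant\n{response}")
    if response:
        parts.append("<|im_end|>")
    return "".join(parts)
```

For generation the response argument is left empty, so the prompt ends at the open assistant turn and the model completes it.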
GGUF
Thanks to mradermacher for creating the GGUF versions of this model.
- Static quants - mradermacher/MistralCreative-24B-Instruct-GGUF
- Imatrix quants - mradermacher/MistralCreative-24B-Instruct-i1-GGUF
Merge
This is a merge of pre-trained language models created using mergekit.
Merge Details
Merge Method
This model was merged using the TIES merge method, with anthracite-core/Mistral-Small-3.1-24B-Instruct-2503-HF as the base.
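TIES merging resolves interference between fine-tuned checkpoints in three steps: trim each model's task vector (its delta from the base), elect a majority sign per parameter, and average only the contributions that agree with the elected sign. A minimal NumPy sketch of the idea, under the assumption of plain arrays per tensor (function name and shapes are illustrative, not mergekit's API):

```python
import numpy as np

def ties_merge(base, tuned, density=1.0):
    # Task vectors: each fine-tuned model's delta from the shared base.
    deltas = np.stack([t - base for t in tuned])
    if density < 1.0:
        # Trim: keep only the top-`density` fraction of entries by magnitude.
        k = max(1, int(density * deltas[0].size))
        for d in deltas:
            thresh = np.partition(np.abs(d).ravel(), -k)[-k]
            d[np.abs(d) < thresh] = 0.0
    # Elect a sign per parameter by total mass, then average only the
    # contributors whose sign agrees with the elected one.
    sign = np.sign(deltas.sum(axis=0))
    agree = np.sign(deltas) == sign
    counts = np.maximum(agree.sum(axis=0), 1)
    return base + (deltas * agree).sum(axis=0) / counts
```

Parameters where the contributors disagree in sign cancel in the election and contribute nothing, which is what keeps the merged model from averaging conflicting updates into noise.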
Models Merged
The following models were included in the merge:
- Sorawiz/MistralCreative-24B-Test-U
- ReadyArt/The-Omega-Directive-M-24B-v1.0
Configuration
The following YAML configuration was used to produce this model:
name: Sorawiz/MistralCreative-24B-Test-E
merge_method: dare_ties
base_model: Sorawiz/MistralCreative-24B-Chat
models:
  - model: Sorawiz/MistralCreative-24B-Chat
    parameters:
      weight: 0.20
  - model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
    parameters:
      weight: 0.20
  - model: ReadyArt/Forgotten-Transgression-24B-v4.1
    parameters:
      weight: 0.30
  - model: ReadyArt/Forgotten-Abomination-24B-v4.0
    parameters:
      weight: 0.30
parameters:
  density: 1
tokenizer:
  source: union
chat_template: auto
---
name: Sorawiz/MistralCreative-24B-Test-U
merge_method: dare_ties
base_model: Sorawiz/MistralCreative-24B-Test-E
models:
  - model: Sorawiz/MistralCreative-24B-Test-E
    parameters:
      weight: 0.3
  - model: ReadyArt/Gaslight-24B-v1.0
    parameters:
      weight: 0.5
  - model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
    parameters:
      weight: 0.2
parameters:
  density: 0.70
tokenizer:
  source: union
chat_template: auto
---
models:
  - model: anthracite-core/Mistral-Small-3.1-24B-Instruct-2503-HF
  - model: Sorawiz/MistralCreative-24B-Test-U
    parameters:
      density: 1.00
      weight: 1.00
  - model: ReadyArt/The-Omega-Directive-M-24B-v1.0
    parameters:
      density: 1.00
      weight: 1.00
merge_method: ties
base_model: anthracite-core/Mistral-Small-3.1-24B-Instruct-2503-HF
parameters:
  normalize: true
dtype: float32
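The dare_ties stages above extend TIES with DARE's drop-and-rescale step: `density` sets the fraction of each task vector's entries that survive a random drop, and the survivors are scaled by 1/density so the delta's expected value is unchanged. A minimal sketch of just that step (a hypothetical helper, not mergekit's implementation):

```python
import numpy as np

def dare_delta(delta, density, seed=0):
    # Randomly drop a (1 - density) fraction of the task vector's entries,
    # then rescale the survivors by 1/density so the expected delta is
    # preserved: E[out] = density * (delta / density) = delta.
    rng = np.random.default_rng(seed)
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)
```

So a density of 0.70, as in the Test-U stage, keeps roughly 70% of each delta's entries and amplifies them to compensate, which sparsifies the task vectors before the TIES sign election.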