# MN-12B-solracht-EXPERIMENTAL-011425

This is a merge of pre-trained language models created using mergekit.

## Merge Details

LOWER YOUR EXPECTATIONS.

This is an experimental release of MN-12B-Mag-Mell, intended to test the NuSLERP feature in mergekit. The expectation is that this model behaves exactly like Mag Mell R1.

Testing has shown that it doesn't produce literally identical outputs, despite being, in theory, a replication of legacy SLERP behavior with NuSLERP hyperparameters. After pondering while this was uploading, the most likely reason for the difference is that DARE pruned a different set of parameters on each run. To reiterate: the expectation is that this has **the exact same problems** that Mag Mell does. I'm posting this so that people can tell me whether or not this is the case.
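For intuition, here's a minimal sketch of the DARE drop-and-rescale step in plain PyTorch (an illustration, not mergekit's actual code): each task-vector entry survives with probability `density`, and survivors are rescaled by `1/density`. Because the mask is sampled at random, two merge runs keep different parameter subsets unless the RNG is seeded identically.

```python
import torch

def dare_prune(delta: torch.Tensor, density: float) -> torch.Tensor:
    """Illustrative DARE step (not mergekit's implementation): randomly drop
    (1 - density) of the task-vector entries, then rescale survivors by
    1/density so the merge is unbiased in expectation."""
    mask = torch.bernoulli(torch.full_like(delta, density))
    return delta * mask / density

delta = torch.randn(8)  # toy "task vector" (fine-tuned weights minus base weights)
print(dare_prune(delta, density=0.7))
print(dare_prune(delta, density=0.7))  # almost surely prunes a different subset
```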

### Merge Method

This model was merged using the DARE TIES merge method, with IntervitensInc/Mistral-Nemo-Base-2407-chatml as the base.
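As a rough sketch of what dare_ties does after the random pruning shown above: the TIES half elects a per-parameter sign across the weighted task vectors and combines only the entries that agree with it, before adding the result back to the base model. This is illustrative only; mergekit's exact weighting and normalization differ.

```python
import torch

def ties_combine(deltas: list[torch.Tensor], weights: list[float]) -> torch.Tensor:
    """Illustrative TIES-style sign election (not mergekit's exact math):
    pick the dominant sign per parameter, then average the agreeing entries."""
    stacked = torch.stack([w * d for w, d in zip(weights, deltas)])
    elected = torch.sign(stacked.sum(dim=0))   # weighted majority sign
    agree = torch.sign(stacked) == elected     # entries matching the elected sign
    return (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)
```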

### Models Merged

The following models were included in the merge:

* output/wind-r0
* output/water-r0
* output/earth-r0

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: output/earth-r0
    parameters:
      density: 0.7
      weight: 0.5
  - model: output/water-r0
    parameters:
      density: 0.9
      weight: 1
  - model: output/wind-r0
    parameters:
      density: 0.5
      weight: 0.7
merge_method: dare_ties
base_model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
tokenizer_source: base
```
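For anyone trying to reproduce the merge: assuming the config above is saved as `config.yml` (the filename is arbitrary), it can be run with mergekit's `mergekit-yaml` entry point, e.g. `mergekit-yaml config.yml ./output-model-directory`.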