# final_model

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with CultriX/SeQwence-14Bv1 as the base.
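
In outline, DARE thins each fine-tuned model's delta from the base (randomly dropping a fraction `1 - density` of entries and rescaling the survivors by `1/density`), and TIES then resolves per-parameter sign conflicts before the weighted deltas are added back onto the base. A minimal NumPy sketch of that idea — illustrative only, not mergekit's actual implementation; all function names here are mine:

```python
import numpy as np

def dare(delta, density, rng):
    """DARE step: keep each delta entry with probability `density`,
    rescaling survivors by 1/density so the expected value is preserved."""
    if density >= 1.0:
        return delta  # density 1.0 means nothing is dropped
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

def ties_merge(deltas, weights):
    """TIES step: elect a per-parameter majority sign from the weighted
    deltas, then sum only the contributions that agree with it."""
    stacked = np.stack([w * d for w, d in zip(weights, deltas)])
    elected = np.sign(stacked.sum(axis=0))
    agree = np.sign(stacked) == elected
    return np.where(agree, stacked, 0.0).sum(axis=0)

def dare_ties(base, models, weights, densities, seed=0):
    """Merge `models` onto `base` with per-model weights and densities."""
    rng = np.random.default_rng(seed)
    deltas = [dare(m - base, d, rng) for m, d in zip(models, densities)]
    return base + ties_merge(deltas, weights)
```

With `density: 1.0` (as in most slices of the config below) the DARE step is a no-op and the method reduces to a TIES-style weighted sign-consensus merge of the raw deltas.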

### Models Merged

The following models were included in the merge:

* CultriX/Qwestion-14B
* CultriX/SeQwence-14Bv2

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: CultriX/SeQwence-14Bv1
dtype: bfloat16
merge_method: dare_ties
parameters:
  int8_mask: 1.0
  normalize: 1.0
slices:
- sources:
  - layer_range: [0, 8]
    model: CultriX/SeQwence-14Bv1
    parameters:
      density: 1.0
      weight: 0.34927958017496047
  - layer_range: [0, 8]
    model: CultriX/Qwestion-14B
    parameters:
      density: 1.0
      weight: 0.4785529567298472
  - layer_range: [0, 8]
    model: CultriX/SeQwence-14Bv2
    parameters:
      density: 0.9095619834430182
      weight: 0.08292400341270245
- sources:
  - layer_range: [8, 16]
    model: CultriX/SeQwence-14Bv1
    parameters:
      density: 1.0
      weight: 0.31847489577754107
  - layer_range: [8, 16]
    model: CultriX/Qwestion-14B
    parameters:
      density: 1.0
      weight: 0.34008726542768253
  - layer_range: [8, 16]
    model: CultriX/SeQwence-14Bv2
    parameters:
      density: 1.0
      weight: -0.010187285487908426
- sources:
  - layer_range: [16, 24]
    model: CultriX/SeQwence-14Bv1
    parameters:
      density: 1.0
      weight: 0.1562216100470764
  - layer_range: [16, 24]
    model: CultriX/Qwestion-14B
    parameters:
      density: 1.0
      weight: 0.31090250951964327
  - layer_range: [16, 24]
    model: CultriX/SeQwence-14Bv2
    parameters:
      density: 0.8226944254037076
      weight: 0.4055505847346826
- sources:
  - layer_range: [24, 32]
    model: CultriX/SeQwence-14Bv1
    parameters:
      density: 1.0
      weight: 0.1478643123383346
  - layer_range: [24, 32]
    model: CultriX/Qwestion-14B
    parameters:
      density: 0.8233564236912981
      weight: 0.34508971280776113
  - layer_range: [24, 32]
    model: CultriX/SeQwence-14Bv2
    parameters:
      density: 1.0
      weight: 0.47963393901209633
- sources:
  - layer_range: [32, 40]
    model: CultriX/SeQwence-14Bv1
    parameters:
      density: 0.9078052860602195
      weight: 0.5051482718423455
  - layer_range: [32, 40]
    model: CultriX/Qwestion-14B
    parameters:
      density: 1.0
      weight: 0.21938011111527006
  - layer_range: [32, 40]
    model: CultriX/SeQwence-14Bv2
    parameters:
      density: 0.9287247232625168
      weight: 0.12414619742696054
- sources:
  - layer_range: [40, 48]
    model: CultriX/SeQwence-14Bv1
    parameters:
      density: 1.0
      weight: 0.1932759286778445
  - layer_range: [40, 48]
    model: CultriX/Qwestion-14B
    parameters:
      density: 0.9846832888894079
      weight: 0.572903756192807
  - layer_range: [40, 48]
    model: CultriX/SeQwence-14Bv2
    parameters:
      density: 1.0
      weight: 0.33759567132306584
```
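
Assuming the YAML above is saved locally (the filename and output path below are placeholders), the merge can be reproduced with mergekit's `mergekit-yaml` entry point:

```shell
pip install mergekit
mergekit-yaml merge-config.yml ./final_model --cuda
```

The `--cuda` flag is optional; without it the merge runs on CPU, which is slower but needs no GPU memory.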

## Open LLM Leaderboard Evaluation Results

Detailed results can be found here

| Metric              | Value |
|---------------------|------:|
| Avg.                | 34.41 |
| IFEval (0-Shot)     | 57.19 |
| BBH (3-Shot)        | 46.39 |
| MATH Lvl 5 (4-Shot) | 22.13 |
| GPQA (0-shot)       | 15.32 |
| MuSR (0-shot)       | 17.27 |
| MMLU-PRO (5-shot)   | 48.17 |
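
For reference, the reported Avg. appears to be the unweighted mean of the six benchmark scores:

```python
# Scores from the table above; "Avg." is (seemingly) their simple mean.
scores = [57.19, 46.39, 22.13, 15.32, 17.27, 48.17]
print(round(sum(scores) / len(scores), 2))  # 34.41
```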