# SOG_MSLERP_MULTI
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the DELLA merge method, with `D:\mergekit_My_YAMLS\70B_mSlOG_un` as the base model.
### Models Merged
The following models were included in the merge:
- `c:\LLM\INT\flam`
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: D:\mergekit\_My_YAMLS\70B_mSlOG_un
dtype: float32
merge_method: della
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 80]
        model: c:\LLM\INT\flam
        parameters:
          density: 0.2
          epsilon: 0.1
          weight: 0.2
      - layer_range: [0, 80]
        model: D:\mergekit\_My_YAMLS\70B_mSlOG_un
        parameters:
          density: 1.0
          epsilon: 0.0
          weight: 0.8
out_dtype: bfloat16
parameters:
  int8_mask: 0.0
  lambda: 1.0
  normalize: 0.0
tokenizer_source: D:\mergekit\_My_YAMLS\70B_mSlOG_un
```
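To give some intuition for the `density` and `epsilon` parameters above: DELLA prunes each model's delta (task vector) before merging, keeping roughly a `density` fraction of parameters, with `epsilon` controlling how strongly larger-magnitude deltas are favored, and rescaling survivors so the expected value is preserved. Below is a rough numpy sketch of that idea, not mergekit's actual implementation; the function name `della_prune` and the exact rank-to-probability mapping are illustrative assumptions.

```python
import numpy as np

def della_prune(delta, density=0.2, epsilon=0.1, seed=0):
    """Sketch of DELLA-style magnitude-adaptive pruning of a task vector.

    Keep probabilities are spread over [density - epsilon/2,
    density + epsilon/2], with larger-magnitude entries assigned
    higher keep probability; kept entries are rescaled by 1/p so the
    pruned delta matches the original in expectation.
    """
    rng = np.random.default_rng(seed)
    flat = delta.ravel()
    n = flat.size
    # Rank entries by magnitude: rank 0 = smallest |delta|.
    ranks = np.argsort(np.argsort(np.abs(flat)))
    lo, hi = density - epsilon / 2, density + epsilon / 2
    if n > 1:
        probs = lo + (hi - lo) * ranks / (n - 1)
    else:
        probs = np.full(n, density)
    probs = np.clip(probs, 1e-12, 1.0)
    mask = rng.random(n) < probs
    pruned = np.where(mask, flat / probs, 0.0)
    return pruned.reshape(delta.shape)

# With density=1.0, epsilon=0.0 (the base model's settings in the
# config above), every parameter is kept unchanged:
base_delta = np.array([0.5, -1.0, 2.0])
assert np.allclose(della_prune(base_delta, density=1.0, epsilon=0.0), base_delta)
```

In this sketch the merged weights would then be the base weights plus the `weight`-scaled sum of the pruned deltas (further scaled by `lambda`), which is why the config can leave the base model's own slice unpruned (`density: 1.0`, `epsilon: 0.0`) while aggressively pruning the donor model (`density: 0.2`).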