Fusion-7B-Quintessence / mergekit_config.yml
slices:
- sources:
  - model: NousResearch/Nous-Hermes-2-Mistral-7B-DPO
    layer_range: [0, 32]
- sources:
  - model: NousResearch/Genstruct-7B
    layer_range: [0, 32]
- sources:
  - model: Weyaxi/Einstein-v4-7B
    layer_range: [0, 32]
merge_method: passthrough
dtype: bfloat16
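For context: this config drives mergekit's passthrough method, which stacks the listed slices (here, layers [0, 32] of each of the three source models) in order into a single merged model saved in bfloat16. Below is a minimal sketch, not part of the original repo, of loading the merged checkpoint with transformers; the local output directory name and the prompt are illustrative, and it assumes the merge has already been built from this file (e.g. with mergekit's mergekit-yaml command).

# Minimal sketch; assumes the merge was produced beforehand, e.g.:
#   mergekit-yaml mergekit_config.yml ./Fusion-7B-Quintessence
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./Fusion-7B-Quintessence"  # illustrative local output directory

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the dtype declared in the config
    device_map="auto",           # requires the accelerate package
)

prompt = "Summarize what a passthrough layer merge does."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))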