L3-GothicMaid-8B / mergekit_config.yml
# TIES merge: both models are merged into the base, with task-vector
# sparsification (density) and per-model scaling (weight).
base_model: Sao10K/L3-8B-Stheno-v3.2
models:
  - model: FPHam/L3-8B-Everything-COT
    parameters:
      density: 0.75  # keep 75% of this model's task-vector parameters
      weight: 1.0
  - model: HumanLLMs/Human-Like-LLama3-8B-Instruct
    parameters:
      density: 0.5   # keep 50% of this model's task-vector parameters
      weight: 0.5
merge_method: ties
dtype: bfloat16
parameters:
  normalize: true  # rescale summed weights so they sum to 1
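
# A minimal usage sketch, assuming mergekit is installed and this file is
# saved locally as mergekit_config.yml; the output directory name is an
# arbitrary example.

```shell
# Install mergekit, then run the merge described by this config.
pip install mergekit
mergekit-yaml mergekit_config.yml ./L3-GothicMaid-8B --cuda
```

# The resulting ./L3-GothicMaid-8B directory contains the merged model
# weights and tokenizer, loadable with transformers' from_pretrained.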