Contexual-Llama-18B-RAG / mergekit_config.yml
slices:
  - sources:
      - model: TroyDoesAI/Contextual-Llama-13B-RAG
        layer_range: [0, 24]
  - sources:
      - model: cognitivecomputations/Samantha-1.11-13b
        layer_range: [18, 36]
  - sources:
      - model: TroyDoesAI/Contextual-Llama-13B-RAG
        layer_range: [24, 40]
merge_method: passthrough
dtype: float16
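
For context, this is a passthrough ("frankenmerge") config: mergekit stacks the listed layer ranges end to end rather than averaging weights. Treating each range as half-open, the merged model has 24 + 18 + 16 = 58 transformer layers versus the 40 layers of the 13B bases, which is where the ~18B parameter count in the repo name comes from. Below is a minimal sketch, assuming the config above is saved locally, of how such a file could be loaded and executed through mergekit's Python API; CONFIG_PATH, OUTPUT_PATH, and the option values are illustrative assumptions, not part of this repo.

import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_PATH = "mergekit_config.yml"          # assumed: the config above, saved locally
OUTPUT_PATH = "./Contextual-Llama-18B-RAG"   # hypothetical output directory

with open(CONFIG_PATH, "r", encoding="utf-8") as fp:
    raw = yaml.safe_load(fp)

# Passthrough merges concatenate slices, so total depth is the sum of the
# half-open layer ranges: (24 - 0) + (36 - 18) + (40 - 24) = 58 layers.
total_layers = sum(
    src["layer_range"][1] - src["layer_range"][0]
    for slc in raw["slices"]
    for src in slc["sources"]
)
print(f"merged model depth: {total_layers} layers")

merge_config = MergeConfiguration.model_validate(raw)
run_merge(
    merge_config,
    OUTPUT_PATH,
    options=MergeOptions(copy_tokenizer=True),  # other options left at defaults
)

The same merge can also be run from the command line with mergekit's mergekit-yaml entry point, pointing it at the config file and an output directory.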