---
base_model:
- flammenai/Llama3.1-Flammades-70B
- flammenai/Mahou-1.5-llama3.1-70B
- rinna/llama-3-youko-70b
- nbeerbower/Llama3-Sapientia-70B
library_name: transformers
tags:
- mergekit
- merge
license: llama3.3
---
![image/png](https://huggingface.co/nbeerbower/Llama3-Asobi-70B/resolve/main/asobi_cover.png?download=true)

# Llama3-Asobi-70B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [Llama3-Sapientia-70B](https://huggingface.co/nbeerbower/Llama3-Sapientia-70B) as the base.

### Models Merged

The following models were included in the merge:

* [flammenai/Llama3.1-Flammades-70B](https://huggingface.co/flammenai/Llama3.1-Flammades-70B)
* [flammenai/Mahou-1.5-llama3.1-70B](https://huggingface.co/flammenai/Mahou-1.5-llama3.1-70B)
* [rinna/llama-3-youko-70b](https://huggingface.co/rinna/llama-3-youko-70b)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: flammenai/Mahou-1.5-llama3.1-70B
    parameters:
      weight: 1
      density: 1
  - model: flammenai/Llama3.1-Flammades-70B
    parameters:
      weight: 1
      density: 1
  - model: rinna/llama-3-youko-70b
    parameters:
      weight: 1
      density: 1
merge_method: ties
base_model: /root/Llama3-Sapientia-70B
parameters:
  weight: 1
  density: 1
  normalize: true
  int8_mask: true
dtype: bfloat16
```
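### Usage

A minimal inference sketch: the merge loads like any Llama 3.1 70B checkpoint via `transformers`. The `device_map`, dtype, and generation settings below are illustrative assumptions, not part of the merge itself; a bfloat16 70B model needs roughly 140 GB of accelerator memory.

```python
# Sketch only: assumes a host with enough GPU memory for a 70B model in
# bfloat16; device_map="auto" shards the weights across available devices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nbeerbower/Llama3-Asobi-70B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Format a chat turn with the model's bundled chat template.
messages = [{"role": "user", "content": "Hello!"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```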