---
base_model:
- LeroyDyer/_Spydaz_Web_AI_AGI_R1_OmG_Coder
- LeroyDyer/_Spydaz_Web_AI_AGI_R1_Math_AdvancedStudent
- LeroyDyer/LCARS_TOP_SCORE
- LeroyDyer/_Spydaz_Web_AI_AGI_R1_Math_001
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [LeroyDyer/_Spydaz_Web_AI_AGI_R1_Math_AdvancedStudent](https://huggingface.co/LeroyDyer/_Spydaz_Web_AI_AGI_R1_Math_AdvancedStudent) as the base.

### Models Merged

The following models were included in the merge:

* [LeroyDyer/_Spydaz_Web_AI_AGI_R1_OmG_Coder](https://huggingface.co/LeroyDyer/_Spydaz_Web_AI_AGI_R1_OmG_Coder)
* [LeroyDyer/LCARS_TOP_SCORE](https://huggingface.co/LeroyDyer/LCARS_TOP_SCORE)
* [LeroyDyer/_Spydaz_Web_AI_AGI_R1_Math_001](https://huggingface.co/LeroyDyer/_Spydaz_Web_AI_AGI_R1_Math_001)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: LeroyDyer/LCARS_TOP_SCORE
    parameters:
      density: 0.128
      weight: [0.128, 0.128, 0.768, 0.256] # weight gradient
  - model: LeroyDyer/_Spydaz_Web_AI_AGI_R1_Math_001
    parameters:
      density: 0.256
      weight: [0.256, 0.128, 0.256, 0.768] # weight gradient
  - model: LeroyDyer/_Spydaz_Web_AI_AGI_R1_OmG_Coder
    parameters:
      density: 0.512
      weight: [0.256, 0.512, 0.768, 0.512] # weight gradient
  - model: LeroyDyer/_Spydaz_Web_AI_AGI_R1_Math_AdvancedStudent
    parameters:
      density: 0.768
      weight:
        - filter: mlp
          value: 0.768
        - value: 0.128
merge_method: ties
base_model: LeroyDyer/_Spydaz_Web_AI_AGI_R1_Math_AdvancedStudent
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
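The list-valued `weight` entries above are gradients: mergekit expands each list of anchor values into a per-layer weight by linear interpolation across the model's depth. As a rough illustration of that expansion (a minimal sketch of the assumed interpolation behavior, not mergekit's actual implementation), the helper below spreads a list of anchors evenly over `num_layers` layers:

```python
def interpolate_gradient(anchors, num_layers):
    """Linearly interpolate a list of anchor weights across num_layers layers.

    Sketches how a mergekit weight gradient such as
    [0.128, 0.128, 0.768, 0.256] is assumed to expand into
    one weight per transformer layer (hypothetical helper,
    not part of mergekit's API).
    """
    if num_layers == 1:
        return [anchors[0]]
    segments = len(anchors) - 1
    out = []
    for i in range(num_layers):
        # Map layer index into anchor space [0, segments].
        pos = i / (num_layers - 1) * segments
        lo = min(int(pos), segments - 1)
        frac = pos - lo
        out.append(anchors[lo] * (1 - frac) + anchors[lo + 1] * frac)
    return out


# For LCARS_TOP_SCORE's gradient, early layers get ~0.128,
# weights peak around three-quarters depth (~0.768), then taper to 0.256.
weights = interpolate_gradient([0.128, 0.128, 0.768, 0.256], 32)
```

So a four-element gradient does not assign one weight per model; it shapes how strongly each model contributes at different depths of the merged network.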