---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---

A human-like model: a good tradeoff between intelligence and human-like behavior.

This model was produced with mergekit using the `della_linear` merge method, with Mistral-Nemo-Base-2407 as the base model. The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Francois-PE-12B
    parameters:
      weight: 0.3
      density: 0.4
  - model: Nera_Noctis-12B
    parameters:
      weight: 0.3
      density: 0.4
  - model: BlueLight-12B
    parameters:
      weight: 0.5
      density: 0.8
base_model: Mistral-Nemo-Base-2407
parameters:
  lambda: 1.0
  epsilon: 0.1
  rescale: true
  normalize: false
  int8_mask: true
merge_method: della_linear
tokenizer:
  source: union
dtype: bfloat16
```
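For intuition, here is a toy NumPy sketch of the idea behind a `della_linear`-style merge: each model's delta from the base is pruned to its largest-magnitude entries (the `density` fraction), rescaled to compensate for the dropped mass, and then linearly combined by `weight`. This is only an illustration under simplifying assumptions (deterministic top-k pruning instead of DELLA's magnitude-based stochastic sampling, a single flat parameter array instead of per-tensor handling); it is not mergekit's implementation.

```python
import numpy as np

def prune_and_rescale(delta, density, rescale=True):
    """Keep the top-`density` fraction of entries by magnitude; zero the rest."""
    k = int(np.ceil(density * delta.size))
    top_idx = np.argsort(np.abs(delta).ravel())[-k:]
    mask = np.zeros(delta.size, dtype=bool)
    mask[top_idx] = True
    pruned = np.where(mask.reshape(delta.shape), delta, 0.0)
    if rescale:
        # Compensate for the pruned mass (cf. the `rescale: true` option).
        pruned = pruned / density
    return pruned

def della_linear_merge(base, models, weights, densities, normalize=False):
    """Toy linear merge of task deltas, loosely mirroring della_linear."""
    deltas = [prune_and_rescale(m - base, d) for m, d in zip(models, densities)]
    w = np.asarray(weights, dtype=float)
    if normalize:
        # `normalize: true` would make the weights sum to 1.
        w = w / w.sum()
    merged_delta = sum(wi * di for wi, di in zip(w, deltas))
    return base + merged_delta

# Tiny usage example with made-up 4-parameter "models":
base = np.zeros(4)
model_a = np.array([1.0, 0.1, 2.0, 0.0])
merged = della_linear_merge(base, [model_a], weights=[0.5], densities=[0.5])
```

In the configuration above, BlueLight-12B carries the highest weight (0.5) and density (0.8), so its deltas dominate the merged model, while the other two contribute more sparsely pruned adjustments.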