nbeerbower committed (verified)
Commit 14668f3 · 1 Parent(s): 5c7f0b0

Update README.md

Files changed (1): README.md (+4 −3)

README.md CHANGED
````diff
@@ -1,5 +1,6 @@
 ---
 base_model:
+- nbeerbower/EVA-Rombos1-Qwen2.5-32B
 - nbeerbower/Qwen2.5-32B-abliterated-LORA
 library_name: transformers
 tags:
@@ -14,7 +15,7 @@ This is a merge of pre-trained language models created using [mergekit](https://
 ## Merge Details
 ### Merge Method
 
-This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method using /home/nbeerbower/AI/nvidia/text-generation-webui/models/EVA-Rombos1-Qwen2.5-32B + [nbeerbower/Qwen2.5-32B-abliterated-LORA](https://huggingface.co/nbeerbower/Qwen2.5-32B-abliterated-LORA) as a base.
+This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method using [nbeerbower/EVA-Rombos1-Qwen2.5-32B](https://huggingface.co/nbeerbower/EVA-Rombos1-Qwen2.5-32B) + [nbeerbower/Qwen2.5-32B-abliterated-LORA](https://huggingface.co/nbeerbower/Qwen2.5-32B-abliterated-LORA) as a base.
 
 ### Models Merged
 
@@ -26,7 +27,7 @@ The following models were included in the merge:
 The following YAML configuration was used to produce this model:
 
 ```yaml
-base_model: /home/nbeerbower/AI/nvidia/text-generation-webui/models/EVA-Rombos1-Qwen2.5-32B+nbeerbower/Qwen2.5-32B-abliterated-LORA
+base_model: nbeerbower/EVA-Rombos1-Qwen2.5-32B+nbeerbower/Qwen2.5-32B-abliterated-LORA
 dtype: bfloat16
 merge_method: task_arithmetic
 parameters:
@@ -34,7 +35,7 @@ parameters:
 slices:
 - sources:
   - layer_range: [0, 64]
-    model: /home/nbeerbower/AI/nvidia/text-generation-webui/models/EVA-Rombos1-Qwen2.5-32B+nbeerbower/Qwen2.5-32B-abliterated-LORA
+    model: nbeerbower/EVA-Rombos1-Qwen2.5-32B+nbeerbower/Qwen2.5-32B-abliterated-LORA
     parameters:
       weight: 1.0
````
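For context on what the `merge_method: task_arithmetic` config above computes, here is a minimal sketch of the task arithmetic idea (merged = base + Σ wᵢ · (modelᵢ − base)) on toy per-parameter lists. The numbers and function name are illustrative only, not taken from mergekit's internals, which apply the same arithmetic tensor-by-tensor.

```python
def task_arithmetic(base, models, weights):
    """Merge parameters: base + sum_i weights[i] * (models[i] - base)."""
    merged = []
    for j, b in enumerate(base):
        # Each model contributes its "task vector" (model - base), scaled by its weight.
        delta = sum(w * (m[j] - b) for m, w in zip(models, weights))
        merged.append(b + delta)
    return merged

base = [0.5, -1.0, 2.0]    # hypothetical base-model parameters
tuned = [0.7, -1.2, 2.5]   # hypothetical fine-tuned parameters

merged = task_arithmetic(base, [tuned], [1.0])
```

With a single source model at `weight: 1.0`, as in this config, the merge reduces to that model's weights applied on top of the base.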