Casual-Autopsy committed on
Commit e7f216e
1 Parent(s): cf6ec58

Update README.md

Files changed (1)
README.md +2 -2
README.md CHANGED
@@ -14,13 +14,13 @@ base_model:
 
 This is a remerge of [bluuwhale's merger](https://huggingface.co/bluuwhale/L3-SAO-MIX-8B-V1) using the exact same yaml config, with the only difference being that the merge calculations are done in fp32 instead of bf16.
 
+I've done this since I'm planning to use this for another merger, but you can use it as is if you wish.
+
 ## Merge Details
 ### Merge Method
 
 This model was merged using the della merge method, with [Sao10K/L3-8B-Niitama-v1](https://huggingface.co/Sao10K/L3-8B-Niitama-v1) as the base.
 
-I've done this since I'm planning to use this for another merger, but you can use it as is if you wish.
-
 ### Models Merged
 
 The following models were included in the merge:
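For readers unfamiliar with mergekit configs, here is a minimal sketch of what a della merge computed in fp32 could look like. Only the merge method, the base model, and the dtype change come from the README text above; the model entries and parameter values are placeholders, not the yaml actually used for this remerge.

```yaml
# Hypothetical sketch only. This is not the yaml from bluuwhale/L3-SAO-MIX-8B-V1;
# it just illustrates the single change described in the diff: running the della
# merge calculations in float32 instead of bfloat16 via the `dtype` key.
models:
  - model: example-org/placeholder-model-a   # placeholder; see "Models Merged" for the real list
    parameters:
      weight: 0.5      # placeholder values, not taken from the original config
      density: 0.5
  - model: example-org/placeholder-model-b   # placeholder
    parameters:
      weight: 0.5
      density: 0.5
merge_method: della
base_model: Sao10K/L3-8B-Niitama-v1
dtype: float32         # the remerge's only change: fp32 instead of bf16
```

With an otherwise identical config, the only functional difference from a bf16 run is that float32 carries more mantissa precision through the intermediate merge arithmetic.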