Update README.md
README.md CHANGED
@@ -8,7 +8,7 @@ Open_Gpt4_v0.2
 
 ![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/642cc1c253e76b4c2286c58e/T7QKB0fKNHQvNqAjm8zrH.jpeg)
 
-This model is a TIES merger of Mixtral-8x7B-Instruct-v0.1 and bagel-8x7b-v0.2, with MixtralOrochi8x7B being the base model.
+This model is a TIES merger of Mixtral-8x7B-Instruct-v0.1 and bagel-dpo-8x7b-v0.2, with MixtralOrochi8x7B being the base model.
 
 
 I was very impressed with MixtralOrochi8x7B's performance and multifaceted use cases, as it is already a merger of many useful Mixtral models such as Mixtral Instruct,
@@ -24,7 +24,7 @@ Merged models:
 
 - https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1
 
-- https://huggingface.co/jondurbin/bagel-8x7b-v0.2
+- https://huggingface.co/jondurbin/bagel-dpo-8x7b-v0.2
 
 
 Instruct template: Alpaca
@@ -37,7 +37,7 @@ models:
     parameters:
       density: .5
       weight: .7
-  - model: bagel-8x7b-v0.2
+  - model: bagel-dpo-8x7b-v0.2
     parameters:
       density: .5
       weight: 1
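The last hunk shows only a fragment of the merge configuration. For context, a complete mergekit TIES config of this shape might look roughly like the sketch below. The `merge_method`, `base_model` path, and `dtype` lines are assumptions (they fall outside the diff), and which model each parameter block belongs to is inferred from the merged-models list:

```yaml
# Sketch of a mergekit TIES config consistent with the fragment above.
# merge_method, base_model repo path, and dtype are assumptions,
# not shown in the diff.
merge_method: ties
base_model: MixtralOrochi8x7B   # exact Hugging Face repo path assumed
models:
  - model: mistralai/Mixtral-8x7B-Instruct-v0.1
    parameters:
      density: .5    # fraction of each task vector's parameters kept
      weight: .7     # scaling of this model's contribution
  - model: jondurbin/bagel-dpo-8x7b-v0.2
    parameters:
      density: .5
      weight: 1
dtype: bfloat16
```

In TIES merging, `density` controls how much of each model's delta from the base is retained and `weight` scales its contribution; the base model supplies the remaining parameters.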
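The README names Alpaca as the instruct template without spelling it out. The standard Alpaca prompt format (no-input variant) is commonly written as:

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{instruction}

### Response:
```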