Marcjoni committed · Commit 7d52f84 · verified · 1 Parent(s): e555aac

Update README.md

Files changed (1):
1. README.md (+23 −4)

README.md CHANGED
@@ -10,11 +10,30 @@ tags:
 - yamatazen/EtherealAurora-12B-v2
 ---
 
+<img src="./AbyssSynth.png" alt="Model Image"/>
+
 # AbyssSynth-12B
 
+<b><i>She holds the weight of a galaxy, a fragile infinity encased within a sphere of shimmering light.
+<br> Darkness swirls beneath endless stars, vast and unfathomable, yet tempered by her calm grasp.
+<br> Not a force of chaos, but of silent order, the abyss contained, a cosmos in delicate balance, where creation and oblivion meet.</i></b>
+
+## 🔧 Recommended Sampling Settings
+```yaml
+Temperature: 0.75 to 1.25
+Min P: 0.035
+Context Length: Stable at 12k tokens, with possible support for extended contexts
+```
+## 💬 Prompt Format
+Supports ChatML-style messages. Example:
+```yaml
+<|im_start|>user
+Your question here.
+<|im_end|>
+<|im_start|>assistant
+```
+
 AbyssSynth-12B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
-* [yamatazen/LorablatedStock-12B](https://huggingface.co/yamatazen/LorablatedStock-12B)
-* [yamatazen/EtherealAurora-12B-v2](https://huggingface.co/yamatazen/EtherealAurora-12B-v2)
 
 ## 🧩 Configuration
 
@@ -75,7 +94,7 @@ from transformers import AutoTokenizer
 import transformers
 import torch
 
-model = "Marcjoni/AbyssSynth-12B"
+model = "Marcjoni/AbyssSynth-12B-12B"
 messages = [{"role": "user", "content": "What is a large language model?"}]
 
 tokenizer = AutoTokenizer.from_pretrained(model)
@@ -87,6 +106,6 @@ pipeline = transformers.pipeline(
     device_map="auto",
 )
 
-outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
+outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=1, top_k=0, top_p=1)
 print(outputs[0]["generated_text"])
 ```
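
The recommended sampling settings in the updated card translate directly into `transformers` generation arguments. A minimal sketch, assuming a recent `transformers` release with `min_p` support and taking the repo id from the card title (the usage snippet itself points at `Marcjoni/AbyssSynth-12B-12B`, so adjust to whichever repository is actually hosted):

```python
# A minimal sketch: applying the card's recommended sampling settings with transformers.
# Assumes a recent transformers release (min_p support) plus accelerate for device_map="auto".
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Marcjoni/AbyssSynth-12B"  # assumed from the card title; the usage snippet uses "Marcjoni/AbyssSynth-12B-12B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [{"role": "user", "content": "What is a large language model?"}]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Recommended settings from the card: temperature 0.75-1.25, Min P 0.035.
output_ids = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=1.0,  # anywhere in the 0.75-1.25 range
    min_p=0.035,
)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Sampling with `min_p` mirrors the card's updated usage snippet, which disables `top_k`/`top_p` filtering (`top_k=0`, `top_p=1`) in favour of temperature plus a minimum-probability cutoff.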
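
For the ChatML prompt format, the tokenizer's own chat template can reproduce the layout shown in the card, assuming the merged model inherits a ChatML template from its parents; a small sketch:

```python
# A minimal sketch: building a ChatML prompt string with the tokenizer's chat template.
# Assumes the tokenizer ships a ChatML template; otherwise the <|im_start|>/<|im_end|>
# tags shown in the card can be written by hand.
from transformers import AutoTokenizer

model_id = "Marcjoni/AbyssSynth-12B"  # adjust to the hosted repo id if it differs

tokenizer = AutoTokenizer.from_pretrained(model_id)
messages = [{"role": "user", "content": "Your question here."}]

# tokenize=False returns the formatted string instead of token ids;
# add_generation_prompt=True appends the opening <|im_start|>assistant tag.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
# Expected layout if the template is ChatML:
# <|im_start|>user
# Your question here.<|im_end|>
# <|im_start|>assistant
```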