H34v7 - Model Merges and Collections FP16/BF16
"temperature": 0.8,
"top_k": 40,
"top_p": 0.95,
"min_p": 0.05,
"repeat_last_n": 40,
"repeat_penalty": 1.2,
These quants are non-imatrix; I'll release the imatrix versions later.
GGUF 3-bit Q3_K_M, about 27 GB of VRAM/RAM:
ollama run hf.co/h34v7/DansXPantheon-RP-Engine-V1.2-24b-Small-Instruct-Ties-Merge-GGUF:Q3_K_M
GGUF 4-bit Q4_K_M, about 30 GB of VRAM/RAM:
ollama run hf.co/h34v7/DansXPantheon-RP-Engine-V1.2-24b-Small-Instruct-Ties-Merge-GGUF:Q4_K_M
GGUF 5-bit Q5_K_M, about 33 GB of VRAM/RAM:
ollama run hf.co/h34v7/DansXPantheon-RP-Engine-V1.2-24b-Small-Instruct-Ties-Merge-GGUF:Q5_K_M
The sample chat below uses the Q5_K_M GGUF with default Ollama settings. The FP16 version is available here.
User: hi
Assistant: Hello! It's nice to meet you. How are you doing today?
User: alright tell me some joke
Assistant: Sure, here's a lighthearted joke for you:
What do you call a fake noodle?
An impasta!
Hope that made you chuckle. If you'd like another one or need any other assistance, just let me know!
User: Give me 5 non-formal ways to say 'See you later' in French.
Assistant: Certainly! Here are five casual ways to say "see you later" in French:
1. À plus tard - This is a very common and informal way to bid farewell.
2. Salut à plus - A more relaxed version that combines "hello/goodbye" (salut) with "see you later."
3. Ciao à plus - Using the Italian "ciao" gives it an even more casual feel.
4. Bisous à plus tard - This one adds a friendly touch by including "kisses" before saying see you later.
5. À la prochaine - Literally means "until next time," but is often used casually as well.
Remember that French people generally use "tu" (informal) with friends and family, while "vous" is more formal for strangers or in professional settings. These phrases work best with the informal "tu."
This is a merge of pre-trained language models created using mergekit.
This model was merged using the TIES merge method using Gryphe/Pantheon-RP-1.8-24b-Small-3.1 as a base.
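For intuition, TIES operates on each model's delta from the base in three steps: trim away small-magnitude changes (controlled by `density`), elect a majority sign per parameter, then average only the deltas that agree with that sign. A toy NumPy sketch of the idea (illustrative only, not mergekit's actual implementation):

```python
import numpy as np

def ties_merge(base, finetuned, densities, weights):
    """Toy TIES: trim -> elect sign -> disjoint merge, on flat weight arrays."""
    deltas = []
    for ft, density, w in zip(finetuned, densities, weights):
        d = ft - base
        # Trim: keep only the top-`density` fraction of entries by magnitude.
        k = int(round(density * d.size))
        thresh = np.sort(np.abs(d).ravel())[-k] if k > 0 else np.inf
        d = np.where(np.abs(d) >= thresh, d, 0.0)
        deltas.append(w * d)
    stacked = np.stack(deltas)
    # Elect sign: majority sign by total mass per parameter.
    sign = np.sign(stacked.sum(axis=0))
    # Disjoint merge: average only the deltas that agree with the elected sign.
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    merged = np.where(agree.any(axis=0),
                      (stacked * agree).sum(axis=0) / np.maximum(agree.sum(axis=0), 1),
                      0.0)
    return base + merged
```

In the config below, `density: 0.7` means roughly the top 70% of Pantheon's delta entries by magnitude survive the trim step before sign election.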
The following models were included in the merge:
- PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
The following YAML configuration was used to produce this model:
models:
  - model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
    parameters:
      density: 0.7
      weight: 0.7
  - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
parameters:
  normalize: false
  int8_mask: true
dtype: float16
tokenizer:
  source: union
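To reproduce the merge, save the configuration above as `config.yaml` and feed it to mergekit's `mergekit-yaml` CLI. A sketch that writes the file programmatically (assumes `pip install mergekit pyyaml`; the output path is illustrative):

```python
import yaml  # pip install pyyaml

# The TIES configuration above as a Python dict, dumped to config.yaml.
# Reproduce the merge with: mergekit-yaml config.yaml ./merged-model
config = {
    "models": [
        {"model": "Gryphe/Pantheon-RP-1.8-24b-Small-3.1",
         "parameters": {"density": 0.7, "weight": 0.7}},
        {"model": "PocketDoc/Dans-PersonalityEngine-V1.2.0-24b",
         "parameters": {"density": 0.5, "weight": 0.5}},
    ],
    "merge_method": "ties",
    "base_model": "Gryphe/Pantheon-RP-1.8-24b-Small-3.1",
    "parameters": {"normalize": False, "int8_mask": True},
    "dtype": "float16",
    "tokenizer": {"source": "union"},
}
with open("config.yaml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)
```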
Motherboard: ASUS Maximus Z790
CPU: Intel i9-13900KS
RAM: 96 GB @ 6400 MHz
GPU: RTX 4090 OC 24 GB