DansXPantheon-RP-Engine-V1.2-24b-Small-Instruct-Ties-Merge

  • Fixed EOS Token
  • Better merge and less yapping.
  • Can use emoji somehow? Almost like Gemma-3; I don't know how or why, but it's just cute sometimes.

Settings

"temperature": 0.8,
"top_k": 40,
"top_p": 0.95,
"min_p": 0.05,
"repeat_last_n": 40,
"repeat_penalty": 1.2,

Run on Ollama

These quants are non-imatrix. I'll release the imatrix versions later.

GGUF 3-bit Q3_K_M, about 27 GB of VRAM/RAM:

ollama run hf.co/h34v7/DansXPantheon-RP-Engine-V1.2-24b-Small-Instruct-Ties-Merge-GGUF:Q3_K_M

GGUF 4-bit Q4_K_M, about 30 GB of VRAM/RAM:

ollama run hf.co/h34v7/DansXPantheon-RP-Engine-V1.2-24b-Small-Instruct-Ties-Merge-GGUF:Q4_K_M

GGUF 5-bit Q5_K_M, about 33 GB of VRAM/RAM:

ollama run hf.co/h34v7/DansXPantheon-RP-Engine-V1.2-24b-Small-Instruct-Ties-Merge-GGUF:Q5_K_M
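
If you're unsure which quant to pull, here is a trivial sketch that picks the largest one fitting your memory budget, using the rough GB figures quoted above (approximate totals, not exact file sizes):

# Approximate memory footprints (GB) quoted above for each quant.
QUANTS = [("Q3_K_M", 27), ("Q4_K_M", 30), ("Q5_K_M", 33)]

def pick_quant(budget_gb: float):
    """Return the largest quant whose quoted footprint fits the budget."""
    fitting = [name for name, gb in QUANTS if gb <= budget_gb]
    return fitting[-1] if fitting else None

print(pick_quant(24 + 96))  # e.g. 24 GB VRAM + 96 GB RAM -> "Q5_K_M"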

Output Sample

Generated with the GGUF Q5_K_M quant using default Ollama settings. FP-16 sample here.

User: hi
Assistant: Hello! It's nice to meet you. How are you doing today? 😊

User: alright tell me some joke
Assistant: Sure, here's a lighthearted joke for you:
What do you call a fake noodle?
An impasta! 🍝🤪
Hope that made you chuckle. If you'd like another one or need any other assistance, just let me know!

User: Give me 5 non-formal ways to say 'See you later' in French.
Assistant: Certainly! Here are five casual ways to say "see you later" in French:
1. À plus tard - This is a very common and informal way to bid farewell.
2. Salut à plus - A more relaxed version that combines "hello/goodbye" (salut) with "see you later."
3. Ciao à plus - Using the Italian "ciao" gives it an even more casual feel.
4. Bisous à plus tard - This one adds a friendly touch by including "kisses" before saying see you later.
5. À la prochaine - Literally means "until next time," but is often used casually as well.
Remember that French people generally use "tu" (informal) with friends and family, while "vous" is more formal for strangers or in professional settings. These phrases work best when using the informal "tu." 🇫🇷✨

Merge Details

This is a merge of pre-trained language models created using mergekit.

Merge Method

This model was merged using the TIES merge method, with Gryphe/Pantheon-RP-1.8-24b-Small-3.1 as the base.
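
For intuition, here is a toy sketch of the TIES steps (trim each task vector to its highest-magnitude entries per the density, elect a majority sign per parameter, then merge only the agreeing deltas). This is an illustration with plain torch tensors, not mergekit's actual implementation:

import torch

def ties_merge(base, deltas, densities, weights):
    # Toy TIES merge over per-model parameter deltas (finetuned - base).
    trimmed = []
    for delta, density, weight in zip(deltas, densities, weights):
        # Trim: zero out everything but the top-`density` fraction by magnitude.
        k = max(1, int(delta.numel() * density))
        threshold = delta.abs().flatten().kthvalue(delta.numel() - k + 1).values
        trimmed.append(weight * delta * (delta.abs() >= threshold))
    stacked = torch.stack(trimmed)
    # Elect: majority sign per parameter across the trimmed, weighted deltas.
    sign = torch.sign(stacked.sum(dim=0))
    # Merge: mean of the entries whose sign agrees with the elected sign.
    agree = (torch.sign(stacked) == sign) & (stacked != 0)
    merged = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)
    return base + merged

# Two toy task vectors using the density/weight pairs from the YAML below.
merged = ties_merge(torch.randn(1000),
                    [torch.randn(1000), torch.randn(1000)],
                    densities=[0.7, 0.5], weights=[0.7, 0.5])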

Models Merged

The following models were included in the merge:

  • PocketDoc/Dans-PersonalityEngine-V1.2.0-24b

Configuration

The following YAML configuration was used to produce this model:

models:
  - model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
    parameters:
      density: 0.7
      weight: 0.7
  - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
    parameters:
      density: 0.5
      weight: 0.5
      
merge_method: ties
base_model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
parameters:
  normalize: false
  int8_mask: true
dtype: float16
tokenizer:
  source: union
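
To reproduce the merge, save the YAML above to a config file and run it with mergekit's CLI (paths here are placeholders):

mergekit-yaml config.yml ./output-model-directory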

My System

Motherboard: Asus Maximus Z790
CPU: Intel i9-13900KS
RAM: 96 GB 6400 MHz
GPU: RTX 4090 OC 24 GB