Second iteration (the first was the hottest trash) of mass-injecting the good stuff into my spatial awareness / object orientation framework. VAR2 was trained on mixed data with no RP; VAR1 was trained exclusively on spatial/task data.


- Temp: 1
- Min P: 0.02
- Top nsigma: 1.73
- Rep Pen: 1.02
- DRY: 0.8, 1.75, 4, 4096 (a sample launch command with these settings is sketched below)
- Screenshots below are from the imatrix Q6 quant
- System prompt used: "You are a brilliant award-winning writer and storyteller, with a visceral and 'in your face' writing style"
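For reference, here is a rough mapping of those values onto llama.cpp's `llama-cli` sampler flags. This is a sketch, not the exact command used for the screenshots: the model filename is illustrative, the DRY numbers are assumed to be multiplier / base / allowed length / penalty range, and flag names (particularly `--top-nsigma` and the DRY options) vary between builds and frontends, so check `llama-cli --help` on yours.

```bash
# Assumed order of the DRY values: multiplier / base / allowed length / penalty range.
llama-cli -m Cream_top2-Q6_K.gguf \
  --temp 1.0 \
  --min-p 0.02 \
  --top-nsigma 1.73 \
  --repeat-penalty 1.02 \
  --dry-multiplier 0.8 --dry-base 1.75 \
  --dry-allowed-length 4 --dry-penalty-last-n 4096
```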

The model seems to desperately want to adhere to system prompts and cards/patterns; lengthy system prompts feel like they shackle the responses.
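If you serve the GGUF behind llama.cpp's `llama-server` (or any other OpenAI-compatible endpoint), a short system prompt like the one above can be sent per request. A minimal sketch, assuming the default port 8080 and the standard chat-completions payload; the filename and user message are placeholders, and the extra `min_p` field is assumed to be passed through as a llama.cpp extension to the OpenAI schema.

```bash
# Launch the server (filename illustrative).
llama-server -m Cream_top2-Q6_K.gguf --port 8080 &

# Keep the system prompt short; long ones tend to shackle the responses.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "system", "content": "You are a brilliant award-winning writer and storyteller, with a visceral, in-your-face writing style."},
      {"role": "user", "content": "Open the scene in a rain-soaked alley."}
    ],
    "temperature": 1.0,
    "min_p": 0.02
  }'
```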

```yaml
merge_method: breadcrumbs_ties

models:
  - model: Delta-Vector/Austral-70B-Winton
    parameters:
      gamma: 0.01
      density: 0.2
      weight: 0.13
  - model: Delta-Vector/Shimamura-70B
    parameters:
      gamma: 0.01
      density: 0.2
      weight: 0.13
  - model: Darkhn/L3.3-70B-Animus-V7.0
    parameters:
      gamma: 0.01
      density: 0.5
      weight: 0.13
  - model: TheDrummer/Anubis-70B-v1.1
    parameters:
      gamma: 0.02
      density: 0.3
      weight: 0.13
  - model: schonsense/Llama3_3_70B_VAR_r128
    parameters:
      gamma: 0
      density: 0.7
      weight: 0.13
  - model: SentientAGI/Dobby-Unhinged-Llama-3.3-70B
    parameters:
      gamma: 0.01
      density: 0.3
      weight: 0.13
  - model: Tarek07/Scripturient-V1.3-LLaMa-70B
    parameters:
      gamma: 0.01
      density: 0.3
      weight: 0.13
  - model: zerofata/L3.3-GeneticLemonade-Unleashed-v3-70B
    parameters:
      gamma: 0.02
      density: 0.2
      weight: 0.13
  - model: schonsense/ll3_3_70B_r128_VAR2

base_model: schonsense/ll3_3_70B_r128_VAR2
tokenizer_source: union
parameters:
  normalize: true
  int8_mask: true
  lambda: 0.95

dtype: float32
out_dtype: bfloat16
```
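To reproduce the merge, the recipe above can be fed straight to mergekit and the result converted to GGUF with llama.cpp's tooling. A sketch under the assumption that the recipe is saved as `config.yaml` and that a `Q6_K` target (matching the quant in the screenshots) is wanted; paths, output names, and optional flags are illustrative, not the exact commands used for this repo.

```bash
pip install mergekit

# Run the breadcrumbs_ties merge defined in config.yaml (the recipe on this card).
# --cuda uses the GPU for the tensor math; drop it for a CPU-only merge.
mergekit-yaml config.yaml ./Cream_top2 --cuda

# Convert the merged HF checkpoint to GGUF and quantize (scripts/binaries ship with llama.cpp).
python convert_hf_to_gguf.py ./Cream_top2 --outfile Cream_top2-f16.gguf --outtype f16
llama-quantize Cream_top2-f16.gguf Cream_top2-Q6_K.gguf Q6_K
```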

[Screenshots: sample outputs, generated with the imatrix Q6 quant and the settings above.]
