A KTO finetune on top of the Austral-24B base model. Still not recommended for use; use the -Winton variant instead!

WandB: https://wandb.ai/new-eden/austral/artifacts/axolotl-config/config-v2nv3dlc/v0/files/axolotl_config_2u1b4uya.yml

Datasets:

```yaml
datasets:
  - path: Delta-Vector/Tauri-IFeval-Dans-Tulu-KTO
    split: train
    type: chatml.argilla
  - path: NewEden/Helpsteer-3-edit-kto-v7
    split: train
    type: chatml.argilla
  - path: Delta-Vector/Tauri-Helpsteer-3-Preference-KTO
    split: train
    type: chatml.argilla
  - path: NewEden/Helpsteer-3-edit-kto-v7
    split: train
    type: chatml.argilla
  - path: Delta-Vector/Tauri-Opus-Accepted-GPT-Rejected-Opus-Writing-Prompts
    split: train
    type: chatml.argilla
  - path: NewEden/Opus-accepted-hermes-rejected-shuffled
    split: train
    type: chatml.argilla
  - path: NewEden/Purpura-Arkhaios-CC-KTO
    split: train
    type: chatml.argilla
  - path: Delta-Vector/Tauri-KTO-Instruct-Mix
    split: train
    type: chatml.argilla
```
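Unlike paired-preference methods such as DPO, KTO trains on unpaired examples that carry only a binary desirability label, so datasets with chosen/rejected pairs (like the Opus-accepted/GPT-rejected sets above) are typically flattened into separate accepted and rejected rows. A minimal sketch of that conversion, with field names (`prompt`, `chosen`, `rejected`, `completion`, `label`) assumed for illustration rather than taken from the actual dataset schemas:

```python
def pairs_to_kto(rows):
    """Flatten paired preference rows into unpaired KTO rows.

    Each input row holds a prompt plus a chosen and a rejected
    completion; each output row holds one completion and a binary
    label (True = desirable, False = undesirable). Field names are
    illustrative assumptions, not the datasets' real schema.
    """
    kto_rows = []
    for row in rows:
        kto_rows.append(
            {"prompt": row["prompt"], "completion": row["chosen"], "label": True}
        )
        kto_rows.append(
            {"prompt": row["prompt"], "completion": row["rejected"], "label": False}
        )
    return kto_rows


pairs = [{"prompt": "Write a haiku.", "chosen": "Leaves drift...", "rejected": "No."}]
print(pairs_to_kto(pairs))
```

Each paired row yields two KTO rows, so the flattened dataset is twice as long as the paired one.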
Model size: 23.6B params (Safetensors, BF16)

Model tree for Delta-Vector/Austral-SFT-KTO