---
base_model:
  - Delta-Vector/Shimamura-70B
  - Delta-Vector/Austral-70B-Winton
library_name: transformers
tags:
  - mergekit
  - merge
  - roleplay
  - creative_writing
  - llama
---

Plesio-70B


A 70B-parameter, Llama-3.3-based generalist for creative writing, fresh prose, co-writing, roleplay, and adventure.

A simple merge, yet sovl in its own way. This merge sits between Shimamura and Austral-Winton: I wanted to give Austral somewhat shorter prose, so FYI for all the 10,000+ token reply lovers.

Thanks Auri for testing!

Merged using the oh-so-great 0.2 SLERP merge weight, with Winton as the base.
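For illustration (this is not the exact mergekit implementation), SLERP at t = 0.2 interpolates along the arc between two parameter tensors while staying close to the base; a minimal NumPy sketch:

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between vectors a and b at weight t."""
    # Compute the angle between the two vectors on the unit sphere
    a_n = a / np.linalg.norm(a)
    b_n = b / np.linalg.norm(b)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to linear interpolation
        return (1 - t) * a + t * b
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * a + (np.sin(t * theta) / s) * b

# With t = 0.2, the result stays much closer to the first (base) tensor
base = np.array([1.0, 0.0])
other = np.array([0.0, 1.0])
out = slerp(0.2, base, other)
```

With t = 0.2 the base model (here, Winton) dominates, which is why the merge mostly keeps Austral's character while picking up some of Shimamura's prose.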

Support me on Ko-Fi: https://ko-fi.com/deltavector

Quantized Versions

Available Downloads

  • GGUF Format: for use with llama.cpp and forks (ty Auri and Bart)
  • EXL3 Format: for use with TabbyAPI (slower on Ampere)

Prompting

The model has been tuned with the Llama-3 Instruct format.
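For reference, a standard Llama-3 Instruct conversation turn is laid out like this (as defined by Meta's chat template):

```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

{system prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>

{user message}<|eot_id|><|start_header_id|>assistant<|end_header_id|>

```

Most frontends (SillyTavern, TabbyAPI, llama.cpp) ship this as a built-in "Llama 3" preset, so you usually just need to select it rather than type it out.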

See Merging Config
https://files.catbox.moe/yw81rn.yml
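The linked YAML is the authoritative config; as a rough reconstruction (hypothetical, based on the description above), a 0.2 SLERP merge with Winton as the base would look something like this in mergekit:

```yaml
# Hypothetical sketch -- see the linked catbox YAML for the real config
merge_method: slerp
base_model: Delta-Vector/Austral-70B-Winton
models:
  - model: Delta-Vector/Shimamura-70B
parameters:
  t: 0.2
dtype: bfloat16
```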

Credits

Thank you to Lucy Knada, Auri, Ateron, Alicat, Intervitens, Cgato, Kubernetes Bad and the rest of Anthracite.