---
license: mit
language:
  - en
tags:
  - merge
---

# Introduction

Yi-32b-x2-v2.0 is a Mixture-of-Experts (MoE) model created with mergekit and custom prompts. It is built from the following base models:

- Weyaxi/Bagel-Hermes-34B-Slerp
- one-man-army/UNA-34Beagles-32K-bf16-v1
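
A mergekit MoE merge of this kind is typically driven by a YAML config that lists each expert together with routing prompts. The sketch below only illustrates the general shape of such a config; the `base_model`, `gate_mode`, `dtype`, and prompts shown here are placeholders, not the actual settings used for this model.

```yaml
# Hypothetical mergekit-moe config sketch -- not the author's actual config.
base_model: one-man-army/UNA-34Beagles-32K-bf16-v1   # assumed donor for shared weights
gate_mode: hidden        # route tokens using hidden-state representations of the prompts
dtype: bfloat16
experts:
  - source_model: Weyaxi/Bagel-Hermes-34B-Slerp
    positive_prompts:
      - "placeholder prompt describing this expert's specialty"
  - source_model: one-man-army/UNA-34Beagles-32K-bf16-v1
    positive_prompts:
      - "placeholder prompt describing this expert's specialty"
```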

# Details

## Used Libraries

- mergekit
- transformers

## How to use

```python
# pip install transformers==4.35.2
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("sumo43/Yi-32b-x2-v2.0")
model = AutoModelForCausalLM.from_pretrained(
    "sumo43/Yi-32b-x2-v2.0",
    torch_dtype=torch.bfloat16,  # load in bf16 to reduce memory usage
)
```
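
Once loaded, text can be generated with the standard `generate` API. The sketch below is a minimal example; the prompt and sampling settings are illustrative assumptions, since this card does not document a specific chat template.

```python
# Minimal generation sketch (illustrative prompt and sampling settings).
prompt = "Explain what a Mixture-of-Experts model is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```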