RP-7b-instruct

🚨 This model is tuned for roleplay (RP), and its factual knowledge is likely unstable.

This is a merge of pre-trained language models created using mergekit.

Output example

[INST] <<SYS>>
あなたは日本語を話す優秀なアシスタントです。回答には必ず日本語で答えてください。また考える過程も出力してください。
<</SYS>>

まどマギで一番好きなキャラは? [/INST]まどマギで一番好きなキャラクターは、鹿目まどかです。彼女は物語の中心人物であり、優しくて芯の強い性格が魅力的です。また、彼女が魔法少女としての運命を受け入れ、他の魔法少女たちを救うために奔走する姿には感動します。彼女の最期の選択も、多くの人々の心に残る名シーンとなっています。

(English: "Who is your favorite character in Madoka Magica?" / "My favorite character in Madoka Magica is Madoka Kaname. She is the central figure of the story, and her kind yet strong-willed personality is appealing. I am also moved by how she accepts her fate as a magical girl and works tirelessly to save the other magical girls. Her final choice has also become a famous scene that lingers in many people's hearts.")

Test environment

This model was tested using text-generation-webui. For generation, I use the min_p preset, as well as the Null preset with temperature=0.3.
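For reference, these settings can be approximated outside the webui with a transformers GenerationConfig. The following is a minimal sketch: min_p sampling requires transformers >= 4.39, and the value 0.05 is an assumption (it is the webui min_p preset's default; the card only names the preset).

from transformers import GenerationConfig

# Approximation of the test settings above; "Null preset" is read here as
# applying no sampling constraints beyond the ones given explicitly.
gen_config = GenerationConfig(
    do_sample=True,
    temperature=0.3,
    min_p=0.05,  # assumed value of the text-generation-webui min_p preset
    max_new_tokens=256,
)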

Usage

This format must be adhered to strictly, as deviations may result in less optimal outputs from the model.

The template used to construct a prompt for the Instruct model is specified as follows:

<s>[INST] <<SYS>>\n{SYSTEM_PROMPT}\n<</SYS>>\n\n{USER_MESSAGE_1} [/INST] {BOT_MESSAGE_1}</s>[INST] {USER_MESSAGE_2} [/INST] 

Please be aware that <s> and </s> are special tokens used for the beginning of string (BOS) and end of string (EOS), respectively, while [INST] and [/INST] are considered regular strings.

For the "{SYSTEM_PROMPT}" part, We recommend using "ใ‚ใชใŸใฏๆ—ฅๆœฌ่ชžใ‚’่ฉฑใ™ๅ„ช็ง€ใชใ‚ขใ‚ทใ‚นใ‚ฟใƒณใƒˆใงใ™ใ€‚ๅ›ž็ญ”ใซใฏๅฟ…ใšๆ—ฅๆœฌ่ชžใง็ญ”ใˆใฆใใ ใ•ใ„ใ€‚ใพใŸ่€ƒใˆใ‚‹้Ž็จ‹ใ‚‚ๅ‡บๅŠ›ใ—ใฆใใ ใ•ใ„ใ€‚"

For the "{USER_MESSAGE_1}" part, We recommend using {instruction}\n{input}

In other words, we recommend the following:

<s>[INST] <<SYS>>\nあなたは日本語を話す優秀なアシスタントです。回答には必ず日本語で答えてください。また考える過程も出力してください。\n<</SYS>>\n\n{instruction1}\n{input1} [/INST] {BOT_MESSAGE_1}</s>[INST] {instruction2}\n{input2} [/INST] 
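As a concrete illustration, here is a minimal Python sketch that assembles this format by hand; the helper name build_prompt and its argument layout are ours, not part of the model's API. Because <s> and </s> are special tokens, the assembled string should be tokenized with add_special_tokens=False so the tokenizer does not prepend a second BOS:

def build_prompt(system_prompt, user_messages, bot_messages):
    # user_messages must have exactly one more entry than bot_messages;
    # the final user message is the turn the model should answer.
    text = f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user_messages[0]} [/INST] "
    for bot, user in zip(bot_messages, user_messages[1:]):
        text += f"{bot}</s>[INST] {user} [/INST] "
    return text

prompt = build_prompt(
    "あなたは日本語を話す優秀なアシスタントです。回答には必ず日本語で答えてください。また考える過程も出力してください。",
    ["まどマギで一番好きなキャラは?"],
    [],
)
# With the tokenizer loaded as in the next section:
# model_inputs = tokenizer(prompt, add_special_tokens=False, return_tensors="pt").input_ids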

Use the instruct model

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "nitky/RP-7b-instruct"
# device_map="auto" already places the model on the available device(s),
# so no additional model.to(device) call is needed afterwards.
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_name)

device = "cuda"

messages = [
    {"role": "system", "content": "あなたは日本語を話す優秀なアシスタントです。回答には必ず日本語で答えてください。また考える過程も出力してください。"},
    {"role": "user", "content": "まどマギで一番好きなキャラは?"}
]

# apply_chat_template builds the [INST]/<<SYS>> prompt described above
# and returns the token ids as a tensor.
encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")

model_inputs = encodeds.to(device)

generated_ids = model.generate(model_inputs, max_new_tokens=256, do_sample=True, temperature=0.3)
decoded = tokenizer.batch_decode(generated_ids)
print(decoded[0])
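Optionally, you can decode only the newly generated tokens, dropping the prompt and special tokens; this is a small convenience on top of the example above, not part of the original card:

# generated_ids still contains the prompt, so slice it off before decoding.
new_tokens = generated_ids[0, model_inputs.shape[-1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))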

Merge Details

Merge Method

This model was merged using the Model Stock merge method, with stabilityai/japanese-stablelm-base-gamma-7b as the base.

Models Merged

The following models were included in the merge:

Aratako/AntlerStar-RP
Aratako/ArrowPro-7B-RobinHood-toxic
Aratako/Ninja-v1-RP-expressive
DataPilot/ArrowPro-7B-KUJIRA
DataPilot/ArrowPro-7B-RobinHood
DataPilot/ArrowPro-7B-KillerWhale
Elizezen/Phos-7B-RP
ohwi/japanese-stablelm-instruct-gamma-7b-dpo-uf-v1
umiyuki/Umievo-itr012-Gleipnir-7B

Configuration

The following YAML configuration was used to produce this model:

merge_method: model_stock
base_model: stabilityai/japanese-stablelm-base-gamma-7b
models:
  - model: Aratako/AntlerStar-RP
  - model: Aratako/ArrowPro-7B-RobinHood-toxic
  - model: Aratako/Ninja-v1-RP-expressive
  - model: DataPilot/ArrowPro-7B-KUJIRA
  - model: DataPilot/ArrowPro-7B-RobinHood
  - model: DataPilot/ArrowPro-7B-KillerWhale
  - model: Elizezen/Phos-7B-RP
  - model: ohwi/japanese-stablelm-instruct-gamma-7b-dpo-uf-v1
  - model: umiyuki/Umievo-itr012-Gleipnir-7B
dtype: bfloat16
tokenizer_source: model:stabilityai/japanese-stablelm-base-gamma-7b
name: RP-7b-instruct
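To reproduce the merge, this configuration can be saved to a file (config.yml is a name chosen here for illustration) and passed to mergekit's mergekit-yaml entry point; a sketch, assuming a CUDA GPU is available:

mergekit-yaml config.yml ./RP-7b-instruct --cuda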