---
license: apache-2.0
datasets:
- Anthropic/hh-rlhf
language:
- en
pipeline_tag: text-generation
---


# GPT-2 Medium Fine-Tuned on Anthropic-hh Dataset

This repository houses a GPT-2 Medium model fine-tuned on the Anthropic-hh dataset. During fine-tuning, the human turns were masked so that the loss was computed exclusively on the Assistant's responses.
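The masking can be illustrated with a short sketch. This is not the exact training code, only the standard Hugging Face pattern for it: positions that should not contribute to the loss receive the label `-100`, which the built-in cross-entropy of `GPT2LMHeadModel` ignores. The example dialogue and the `"\n\nAssistant:"` split below are illustrative.

```python
import torch
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-medium")

# Illustrative dialogue in the Anthropic-hh turn format.
dialogue = "\n\nHuman: How do I boil an egg?\n\nAssistant: Place the egg in boiling water for about 7 minutes."

# Everything up to and including the Assistant marker is treated as context.
prompt, response = dialogue.split("\n\nAssistant:", maxsplit=1)
prompt = prompt + "\n\nAssistant:"

prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids
response_ids = tokenizer(response, return_tensors="pt").input_ids

input_ids = torch.cat([prompt_ids, response_ids], dim=-1)
labels = input_ids.clone()
labels[:, : prompt_ids.shape[-1]] = -100  # mask the human side: these positions contribute no loss

# `input_ids` and `labels` can then be passed to GPT2LMHeadModel, whose
# loss computation skips every position labeled -100.
```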

## Model Information

- **Base Model:** GPT-2 Medium
- **Training Data:** Anthropic-hh dataset
- **Fine-Tuning Approach:** Supervised fine-tuning with the loss computed only on the Assistant's responses.

## How to Use

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Load tokenizer and model
tokenizer = GPT2Tokenizer.from_pretrained("RaushanTurganbay/GPT2_instruct_tuned")
model = GPT2LMHeadModel.from_pretrained("RaushanTurganbay/GPT2_instruct_tuned")

# Generate responses
prompt = "Your input prompt here"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_length=150, num_return_sequences=1)  # max_length counts prompt + generated tokens

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
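
Because dialogues in the Anthropic-hh dataset use `"\n\nHuman: ...\n\nAssistant:"` turn markers, the model will likely respond best to prompts written in the same format. This is an assumption based on the dataset's formatting; verify against the training preprocessing:

```python
# Hypothetical prompt following the hh-rlhf turn format; adjust if the
# training script formatted dialogues differently.
prompt = "\n\nHuman: What are some tips for staying focused while studying?\n\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_length=150, num_return_sequences=1)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```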