# GPT-3 small
A pretrained GPT-3 small model, continuing the development of GPT-Neo with an architecture that purposefully mimics that of GPT-3. The model was trained on the CNN/Daily Mail news dataset for text generation.
## How to use the model
```python
from transformers import GPT2Tokenizer, GPTNeoForCausalLM

tokenizer = GPT2Tokenizer.from_pretrained('minhtoan/gpt3-small-finetune-cnndaily-news')
model = GPTNeoForCausalLM.from_pretrained('minhtoan/gpt3-small-finetune-cnndaily-news')

# Encode the prompt as input IDs for the model
text = "Ever noticed how plane seats appear to be getting smaller and smaller? "
input_ids = tokenizer.encode(text, return_tensors='pt')

# Sample a continuation of up to 150 tokens
max_length = 150
sample_outputs = model.generate(input_ids, do_sample=True, max_length=max_length, temperature=0.8)

for i, sample_output in enumerate(sample_outputs):
    print(">> Generated text {}\n\n{}".format(i + 1, tokenizer.decode(sample_output.tolist())))
    print('\n---')
```
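For quick experiments, the same checkpoint can also be wrapped in the Transformers `text-generation` pipeline, which resolves the tokenizer and model automatically from the checkpoint's config. A minimal sketch, assuming the checkpoint loads through the standard auto classes; the sampling parameters mirror the example above:

```python
from transformers import pipeline

# The pipeline loads both tokenizer and model from the hub checkpoint
generator = pipeline('text-generation', model='minhtoan/gpt3-small-finetune-cnndaily-news')

prompt = "Ever noticed how plane seats appear to be getting smaller and smaller? "
# do_sample/max_length/temperature are forwarded to model.generate()
outputs = generator(prompt, do_sample=True, max_length=150, temperature=0.8)
print(outputs[0]['generated_text'])
```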
## Author
Phan Minh Toan