RadonSAI-Small

Overview

RadonSAI-Small is a compact member of the Radon model family, built on the GPT2LMHeadModel architecture from Hugging Face Transformers.

Model Details

  • Source Model: gpt2
  • Architecture: GPT2LMHeadModel
  • Parameters: 123.6M
  • Model Type: gpt2
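
The 123.6M figure can be sanity-checked against the published configuration of the source `gpt2` model (vocab 50257, context 1024, hidden size 768, 12 layers); these config values are an assumption carried over from the source model, and the exact headline number depends on which tensors are counted (with tied input/output embeddings the standard GPT-2 small total is about 124.4M; excluding some embedding tensors yields a figure close to 123.6M). A minimal sketch:

```python
# Parameter count for a standard GPT-2 small configuration.
# Config values assumed from the published "gpt2" config; RadonSAI-Small
# may differ slightly in how its headline count is reported.
vocab_size, n_positions, n_embd, n_layer = 50257, 1024, 768, 12

token_emb = vocab_size * n_embd            # wte (tied with the LM head)
pos_emb = n_positions * n_embd             # wpe
per_layer = (
    2 * (2 * n_embd)                       # two LayerNorms (weight + bias)
    + n_embd * 3 * n_embd + 3 * n_embd     # fused QKV projection
    + n_embd * n_embd + n_embd             # attention output projection
    + n_embd * 4 * n_embd + 4 * n_embd     # MLP up-projection
    + 4 * n_embd * n_embd + n_embd         # MLP down-projection
)
final_ln = 2 * n_embd                      # final LayerNorm
total = token_emb + pos_emb + n_layer * per_layer + final_ln
print(f"{total:,} parameters (~{total / 1e6:.1f}M)")  # 124,439,808 (~124.4M)
```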

Usage

from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("MagistrTheOne/RadonSAI-Small")
model = AutoModelForCausalLM.from_pretrained("MagistrTheOne/RadonSAI-Small")

# Tokenize a prompt and generate up to 50 new tokens
prompt = "Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Model Information

  • Languages: English, Russian
  • License: Apache 2.0
  • Format: Safetensors
  • Library: Transformers

Citation

If you use this model, please cite the original source model and the Radon project.
