Serayuki-1B

Model Developer: Shoukaku07
Model Type: Causal Language Model

Example Usage

Using Hugging Face Transformers:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("SeraphyneLab/Serayuki-1B")
tokenizer = AutoTokenizer.from_pretrained("SeraphyneLab/Serayuki-1B")

# Model and inputs must live on the same device; use the GPU when available.
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

input_text = "Once upon a time"
inputs = tokenizer(input_text, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=128)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
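
By default, generate decodes greedily, which can produce repetitive text. Standard sampling arguments can be passed instead; a minimal sketch, with illustrative values that are not tuned for this model:

# Sample from the model's distribution rather than decoding greedily.
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,    # enable sampling
    temperature=0.8,   # illustrative value; scales the output distribution
    top_p=0.95,        # nucleus sampling: keep tokens covering 95% of the mass
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))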

License

This model is licensed under the MIT License.

Tokenizer Notice

This model was trained from scratch; however, it uses the tokenizer from Meta's Llama 3.2 3B Instruct model. The tokenizer is therefore subject to Meta's Llama 3.2 Community License; please review its terms before using this model or tokenizer in commercial applications.
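
Because the tokenizer files ship with this repository, the tokenizer can also be loaded on its own, for example to count tokens without pulling the full model weights into memory. A minimal sketch (the printed vocabulary size depends on the tokenizer files actually shipped):

from transformers import AutoTokenizer

# Loads only the tokenizer files, not the 1.49B-parameter weights.
tokenizer = AutoTokenizer.from_pretrained("SeraphyneLab/Serayuki-1B")
print(len(tokenizer))                           # vocabulary size inherited from Llama 3.2
print(tokenizer("Once upon a time").input_ids)  # token ids for a sample prompt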

Model Size: 1.49B parameters
Tensor Types: F32, FP16, U8 (Safetensors)