Neo-X1 Preview
Neo-X1-3B-Preview is a 3B-parameter model fine-tuned from Qwen2.5-3B-Instruct. It focuses on enhanced text generation, code understanding, and conversational ability.
Developed by: Open Neo Team
Our model was fine-tuned on a carefully curated combination of high-quality datasets:
Performance metrics on key benchmarks (to be updated):
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "odyssey-labs/Neo-X1-3B-Preview"

# Load the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Example usage: generate a completion for a coding prompt
input_text = "Write a function to calculate fibonacci numbers"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
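Because the base model is Qwen2.5-3B-Instruct, chat-style prompts are normally wrapped in Qwen's ChatML template before generation. The sketch below shows what that formatting looks like as plain strings, assuming the preview model inherits the base template unchanged; in real use, `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` builds this for you.

```python
def build_chatml_prompt(messages):
    """Format a list of {"role", "content"} dicts into a ChatML prompt string.

    This mirrors the Qwen ChatML layout (an assumption for this preview model);
    prefer tokenizer.apply_chat_template in production code.
    """
    prompt = ""
    for message in messages:
        # Each turn is delimited by <|im_start|> and <|im_end|> markers
        prompt += f"<|im_start|>{message['role']}\n{message['content']}<|im_end|>\n"
    # Open an assistant turn so the model generates the reply
    prompt += "<|im_start|>assistant\n"
    return prompt

messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a function to calculate fibonacci numbers"},
]
prompt = build_chatml_prompt(messages)
print(prompt)
```

Feeding `prompt` to `tokenizer(...)` and `model.generate(...)` as in the snippet above then produces a chat-formatted response rather than a raw continuation.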
We want to thank:
@misc{neo_x1_2025,
  author = {Open-Neo},
  title = {Neo-X1: Fully open-source model that provides performance with low-end compute},
  howpublished = {https://huggingface.co/open-neo},
  note = {Accessed: 2025-01-15},
  year = {2025}
}