---
license: apache-2.0
pipeline_tag: text-generation
tags:
- cortex.cpp
---

## Overview

**NovaSky Team** developed and released [Sky-T1-32B-Preview](https://huggingface.co/novasky-ai/Sky-T1-32B-Preview), a 32-billion-parameter reasoning model fine-tuned from Qwen2.5-32B-Instruct. The model targets advanced reasoning, coding, and mathematical tasks, achieving performance comparable to state-of-the-art models like o1-preview while remaining cost-efficient.

Sky-T1 was trained on 17K verified responses from Qwen/QwQ-32B-Preview, supplemented with science data from the STILL-2 dataset, giving it high-quality and diverse training sources. The model supports complex reasoning via long chain-of-thought generation and excels at both coding and mathematical challenges. Training used LLaMA-Factory with DeepSpeed ZeRO-3 Offload and completed in just 19 hours on 8 H100 GPUs, demonstrating efficient resource utilization. These capabilities make Sky-T1 well suited to programming, academic research, and reasoning-intensive applications.

## Variants

| No | Variant | Cortex CLI command |
| --- | --- | --- |
| 1 | [Sky-T1-32b](https://huggingface.co/cortexso/sky-t1/tree/32b) | `cortex run sky-t1:32b` |

## Use it with Jan (UI)

1. Install **Jan** using the [Quickstart](https://jan.ai/docs/quickstart)
2. Use in the Jan model Hub:
    ```bash
    cortexso/sky-t1
    ```

## Use it with Cortex (CLI)

1. Install **Cortex** using the [Quickstart](https://cortex.jan.ai/docs/quickstart)
2. Run the model with the command:
    ```bash
    cortex run sky-t1
    ```

## Credits

- **Author:** NovaSky Team
- **Converter:** [Homebrew](https://www.homebrew.ltd/)
- **Original License:** [Apache 2.0](https://choosealicense.com/licenses/apache-2.0/)
- **Paper:** [Sky-T1: Fully Open-Source Reasoning Model](https://novasky-ai.github.io/posts/sky-t1/)
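
## Query it over the local API (sketch)

Once the model is running via `cortex run sky-t1`, Cortex exposes an OpenAI-compatible HTTP API locally. The request below is a minimal sketch, assuming the server listens on its default port `39281` and that the model is registered under the `sky-t1:32b` tag; adjust the host, port, and model name to match your installation.

```bash
# Minimal sketch: send a chat completion request to the locally running
# Cortex server (assumed default address http://localhost:39281).
curl http://localhost:39281/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "sky-t1:32b",
    "messages": [
      {"role": "user", "content": "Prove that the sum of two even integers is even."}
    ]
  }'
```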