Jan-Nano-128k: Empowering deeper research through extended context understanding.

Authors: Alan Dao, Bach Vu Dinh, Thinh Le
Overview
Jan-Nano-128k represents a significant advancement in compact language models for research applications. Building upon the success of Jan-Nano, this enhanced version features a native 128k context window that enables deeper, more comprehensive research capabilities without the performance degradation typically associated with context extension methods.
Key Improvements:
- Research Deeper: Extended context allows for processing entire research papers, lengthy documents, and complex multi-turn conversations
- Native 128k Window: Built from the ground up to handle long contexts efficiently, maintaining performance across the full context range
- Enhanced Performance: Unlike traditional context extension methods, Jan-Nano-128k shows improved performance with longer contexts
This model maintains full compatibility with Model Context Protocol (MCP) servers while dramatically expanding the scope of research tasks it can handle in a single session.
Evaluation
Jan-Nano-128k has been rigorously evaluated on the SimpleQA benchmark using our MCP-based methodology, demonstrating superior performance compared to its predecessor, Jan-Nano.
Why Jan-Nano-128k?
Traditional approaches to extending context length, such as YaRN (Yet another RoPE extensioN), often result in performance degradation as context length increases. Jan-Nano-128k breaks this paradigm: because its 128k window is trained natively rather than retrofitted, performance holds up across the full context range and, as noted above, improves with longer contexts.
This fundamental difference makes Jan-Nano-128k ideal for research applications requiring deep document analysis, multi-document synthesis, and complex reasoning over large information sets.
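To make the contrast concrete, below is a minimal NumPy sketch of the position-interpolation idea that RoPE-scaling methods such as YaRN refine. It illustrates the general technique only; the context sizes, dimension, and helper name are illustrative assumptions, not Jan-Nano-128k's actual configuration.

```python
import numpy as np

def rope_angles(positions, dim=128, base=10000.0, scale=1.0):
    """Rotary-embedding angles for a batch of positions (illustrative only)."""
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    # scale < 1 compresses positions back into the range seen during training,
    # which is the core trick behind interpolation-based context extension.
    return np.outer(positions * scale, inv_freq)

native_ctx, extended_ctx = 32_768, 131_072  # assumed example sizes
positions = np.arange(extended_ctx)

# Naive extrapolation: positions beyond native_ctx produce angles the model
# never encountered during training, which tends to hurt quality.
extrapolated = rope_angles(positions)

# Interpolation-style extension (the family YaRN refines) rescales positions
# so the largest angle matches what the model saw at native_ctx.
interpolated = rope_angles(positions, scale=native_ctx / extended_ctx)

print(extrapolated[-1, 0], interpolated[-1, 0])
```

A natively trained 128k model does not rely on this rescaling, so its quality does not hinge on how well interpolated positions approximate those seen during training.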
Use it with Jan (UI)
- Install Jan by following the Quickstart guide
Original weights: https://huggingface.co/Menlo/Jan-nano-128k
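If you prefer to run the original (non-GGUF) weights outside Jan, a minimal Hugging Face transformers sketch might look like the following; the prompt, dtype, and device choices are illustrative assumptions, not official settings, and the sampling values follow the recommendations in the next section.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Menlo/Jan-nano-128k"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Build a chat-formatted prompt and generate a response.
messages = [{"role": "user", "content": "Summarize the main findings of the attached report."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    inputs, max_new_tokens=512, do_sample=True,
    temperature=0.7, top_p=0.8, top_k=20,
)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```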
Recommended Sampling Parameters
- Temperature: 0.7
- Top-p: 0.8
- Top-k: 20
- Min-p: 0.0
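For the GGUF builds in this repository, a minimal llama-cpp-python sketch that applies the parameters above could look like this; the quantization filename pattern is an assumption, so match it to the file you actually want to download.

```python
from llama_cpp import Llama

# Pull a quantized file from this repo; the filename glob is an assumed example.
llm = Llama.from_pretrained(
    repo_id="Menlo/Jan-nano-128k-gguf",
    filename="*Q4_K_M*.gguf",
    n_ctx=131072,  # the model's native 128k context window
)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Outline a literature-review plan for long-context LLMs."}],
    temperature=0.7,
    top_p=0.8,
    top_k=20,
    min_p=0.0,
)
print(response["choices"][0]["message"]["content"])
```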
Community & Support
- Discussions: HuggingFace Community
- Issues: GitHub Repository
- Documentation: Official Docs
Citation
@model{jan-nano-128k,
  title={Jan-Nano-128k: Deep Research with Extended Context},
  author={Dao, Alan and Dinh, Bach Vu and Le, Thinh},
  year={2024},
  url={https://huggingface.co/Menlo/Jan-nano-128k}
}
Available GGUF quantizations: 3-bit, 4-bit, 5-bit, 6-bit, 8-bit
Model tree for Menlo/Jan-nano-128k-gguf
- Base model: Qwen/Qwen3-4B-Base