Deepseek-r1-Distill-8b-Beijing-History

🤖 Model Overview
A lightweight model fine-tuned on Google Colab that specializes in knowledge analysis of Beijing's history and culture (e.g., the evolution of hutongs, imperial architecture, and folk customs). It is intended to serve as middleware within larger pipelines or as a domain-specific question-answering tool.

🚀 Features

  • Lightweight and efficient: 8B parameters with low inference resource requirements
  • Domain-adapted: optimized on Beijing historical texts, with attention to place names, event timelines, and other fine-grained details
  • Task-compatible: usable as a core component of a RAG system or as a preprocessing module for downstream tasks (see the RAG sketch after the Quick Start example)

💻 Quick Start

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("JIA244601/beijing112telling")
tokenizer = AutoTokenizer.from_pretrained("JIA244601/beijing112telling")

# Chinese prompt: "Analyze the functional evolution of Jingshan across the Ming and Qing dynasties"
input_text = "请分析景山在明清两代的功能演变"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
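
The features above describe the model as a possible RAG component; the snippet below is a minimal sketch of that usage. The retrieve_passages helper and the strings it returns are hypothetical placeholders standing in for a real retriever (vector store, search index, etc.); only the repository name and the generation calls mirror the Quick Start above.

from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("JIA244601/beijing112telling")
tokenizer = AutoTokenizer.from_pretrained("JIA244601/beijing112telling")

def retrieve_passages(question: str) -> list[str]:
    # Hypothetical retriever: swap in your own vector store or search backend.
    return ["(retrieved passage 1 about Jingshan)", "(retrieved passage 2)"]

question = "请分析景山在明清两代的功能演变"
context = "\n".join(retrieve_passages(question))
# Prepend the retrieved context so the model grounds its answer in it
prompt = f"参考资料:\n{context}\n\n问题:{question}\n回答:"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=300)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))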
📦 Model Format

  • Format: GGUF (8-bit)
  • Model size: 8.03B params
  • Architecture: llama
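
Because the repository is published in GGUF format, it can also be run without transformers, for example through llama-cpp-python. The sketch below is a hedged example: the exact GGUF filename in the repo is not listed here, so the "*.gguf" glob passed to Llama.from_pretrained is an assumption, as are the context size and generation parameters.

# Requires: pip install llama-cpp-python huggingface_hub
from llama_cpp import Llama

# Download the GGUF weights from the Hub and load them with llama.cpp.
# The filename glob is an assumption; adjust it to the actual file in the repo.
llm = Llama.from_pretrained(
    repo_id="JIA244601/beijing112telling",
    filename="*.gguf",
    n_ctx=4096,  # context window; tune to your hardware
)

output = llm("请分析景山在明清两代的功能演变", max_tokens=200)
print(output["choices"][0]["text"])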