lihaoxin2020/Qwen3-4B-Thinking-GSM8K-global_step_128

This repository was uploaded programmatically.

Usage

Load with transformers (or your own library) using from_pretrained():

from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("lihaoxin2020/Qwen3-4B-Thinking-GSM8K-global_step_128")
tokenizer = AutoTokenizer.from_pretrained("lihaoxin2020/Qwen3-4B-Thinking-GSM8K-global_step_128")
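
Since this checkpoint appears to be a Qwen3 causal language model fine-tuned on GSM8K, you will usually want AutoModelForCausalLM for text generation. The sketch below is illustrative rather than official: it assumes the repository ships a standard chat template and that torch (and optionally accelerate) are installed; the example question is made up.

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lihaoxin2020/Qwen3-4B-Thinking-GSM8K-global_step_128"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the accelerate package; drop it to load on CPU.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# Illustrative GSM8K-style prompt (not taken from the repository).
messages = [{"role": "user", "content": "A bakery sells 12 muffins per tray. How many muffins are on 7 trays?"}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))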

If you're using a custom library, ensure your loader expects the same files (config.json, weight files like pytorch_model.bin or model.safetensors, and any tokenizer assets) at the repository root.
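
If you are wiring up your own loader, the huggingface_hub client can list and download those files directly. A minimal sketch, assuming huggingface_hub is installed:

from huggingface_hub import list_repo_files, snapshot_download

repo_id = "lihaoxin2020/Qwen3-4B-Thinking-GSM8K-global_step_128"

# Inspect what is actually stored at the repository root.
print(list_repo_files(repo_id))  # e.g. config.json, weight files, tokenizer assets

# Fetch a local copy of the repository and point a custom loader at it.
local_dir = snapshot_download(repo_id)
print(local_dir)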

Model details

Format: Safetensors
Model size: 4.41B params
Tensor type: BF16