# checkpoint-5016

## Checkpoint Information

**Checkpoint Name**: `checkpoint-5016`

**Repository Name**: `aylinakkus/qwen_2_5_math_epoch_5016`

**Checkpoint Path**: `/home/mert/aylin/capability-erosion-sft/LLaMA-Factory/saves/qwen2.5-1.5b/full/sft/checkpoint-5016`

## Model Configuration

This checkpoint was extracted from a Qwen 2.5 1.5B model training run.

- **Base Model**: Qwen 2.5 1.5B
- **Training Framework**: LLaMA-Factory
- **Task**: Math fine-tuning

## Description

This repository contains the model state dict extracted from the training checkpoint.

### Files

- `model_state_dict.pt`: PyTorch state dictionary containing the model weights
- `README.md`: This file

## Usage

```python
import torch

# Load the model state dict onto CPU
state_dict = torch.load("model_state_dict.pt", map_location="cpu")

# Use with your model architecture
# model.load_state_dict(state_dict)
```

## Notes

- This checkpoint was automatically uploaded using the `upload_checkpoints.py` script
- Checkpoint extracted from: `checkpoint-5016`
- Original path: `/home/mert/aylin/capability-erosion-sft/LLaMA-Factory/saves/qwen2.5-1.5b/full/sft/checkpoint-5016`
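
## Loading with Transformers (sketch)

A minimal sketch of loading the extracted weights into the stock Qwen 2.5 1.5B architecture. This assumes the state dict keeps the Hugging Face key layout that LLaMA-Factory writes for full-parameter SFT, and it uses `Qwen/Qwen2.5-1.5B` as an assumed base model id; neither assumption is verified here, so check the reported missing/unexpected keys.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: this is the matching base architecture and tokenizer.
BASE_MODEL = "Qwen/Qwen2.5-1.5B"

# Instantiate the base architecture, then overwrite its weights
# with the extracted checkpoint.
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)
state_dict = torch.load("model_state_dict.pt", map_location="cpu")
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print(f"missing keys: {missing}")
print(f"unexpected keys: {unexpected}")

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)

# Quick smoke test on a math-style prompt.
inputs = tokenizer("What is 12 * 7?", return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If both key lists print empty, the state dict matches the base architecture exactly and `strict=True` can be used instead.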