---
library_name: peft
license: cc-by-4.0
base_model: pczarnik/herbert-base-ner
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
model-index:
- name: herbert-ner-lora-datetime
  results: []
---

# herbert-ner-lora-datetime

This model is a fine-tuned version of [pczarnik/herbert-base-ner](https://huggingface.co/pczarnik/herbert-base-ner) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0326
- Precision: 0.5051
- Recall: 0.4902
- F1: 0.4975

## Model description

More information needed

## Intended uses & limitations

More information needed. (A hedged usage sketch is provided at the end of this card.)

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5

A hedged reproduction sketch of this configuration appears below, after the framework versions.

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|
| No log        | 1.0   | 92   | 0.1005          | 0.2308    | 0.0882 | 0.1277 |
| No log        | 2.0   | 184  | 0.0583          | 0.4211    | 0.2745 | 0.3323 |
| No log        | 3.0   | 276  | 0.0422          | 0.4105    | 0.3824 | 0.3959 |
| No log        | 4.0   | 368  | 0.0346          | 0.4899    | 0.4755 | 0.4826 |
| No log        | 5.0   | 460  | 0.0326          | 0.5051    | 0.4902 | 0.4975 |

### Framework versions

- PEFT 0.12.0
- Transformers 4.50.3
- Pytorch 2.4.1
- Datasets 2.21.0
- Tokenizers 0.21.1
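### Reproduction sketch

The snippet below is a minimal sketch that mirrors the hyperparameters listed above in `transformers`/`peft` code. It is not the original training script: the LoRA settings (`r`, `lora_alpha`, `lora_dropout`) and the dataset are not documented in this card, so the values shown for them are placeholders.

```python
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForTokenClassification, TrainingArguments

# LoRA configuration for token classification.
# r / lora_alpha / lora_dropout are assumed defaults, not values from this card.
lora_config = LoraConfig(
    task_type=TaskType.TOKEN_CLS,
    r=8,
    lora_alpha=16,
    lora_dropout=0.1,
)

base = AutoModelForTokenClassification.from_pretrained("pczarnik/herbert-base-ner")
model = get_peft_model(base, lora_config)  # only the adapter weights are trainable

# These arguments match the "Training hyperparameters" section above.
training_args = TrainingArguments(
    output_dir="herbert-ner-lora-datetime",
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```

A `Trainer` built from these arguments would additionally need the (undocumented) tokenized dataset and a metric function producing the precision/recall/F1 figures reported above.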
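## Usage sketch

A minimal inference example, assuming the adapter is available under the hypothetical id `herbert-ner-lora-datetime` (replace it with the actual repository or local path). The Polish example sentence is illustrative only, matching the HerBERT base model's language.

```python
import torch
from peft import PeftModel
from transformers import AutoModelForTokenClassification, AutoTokenizer

base_id = "pczarnik/herbert-base-ner"
adapter_id = "herbert-ner-lora-datetime"  # hypothetical; replace with the real adapter path

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForTokenClassification.from_pretrained(base_id)
model = PeftModel.from_pretrained(base, adapter_id)  # load the LoRA adapter on top
model.eval()

# "The meeting will take place on March 5 at 2:00 pm."
inputs = tokenizer("Spotkanie odbędzie się 5 marca o 14:00.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map per-token predictions back to label names.
pred_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
print([(tok, base.config.id2label[i]) for tok, i in zip(tokens, pred_ids)])
```

Note that `base.config.id2label` reflects the base checkpoint's label set; if this fine-tune changed the label inventory (e.g., added date/time entity tags), the mapping should instead be read from the adapter checkpoint's config.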