---
tags:
- generated_from_trainer
model-index:
- name: full-lstm-0
  results: []
---
# full-lstm-0

This model was trained from scratch; the Trainer recorded neither a base checkpoint nor a dataset name for this card.
It achieves the following results on the evaluation set:

- Loss: 3.9694

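
Assuming the reported loss is the Trainer's usual mean token-level cross-entropy in nats for language modeling (an assumption, since the loss function is not recorded here), it corresponds to a perplexity of roughly 53. A minimal sketch of the conversion:

```python
import math

eval_loss = 3.9694  # final validation loss reported above
# Perplexity is exp(loss) when the loss is mean cross-entropy in nats
# (an assumption about how this value was computed).
perplexity = math.exp(eval_loss)
print(f"perplexity ~ {perplexity:.2f}")  # ~ 52.95
```
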
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 0
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 3052726

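
A minimal sketch of how these settings map onto `transformers.TrainingArguments` (4.33.x). The `output_dir` is a placeholder, and `train_batch_size` is assumed here to be the per-device batch size:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter list above; output_dir is a hypothetical path.
training_args = TrainingArguments(
    output_dir="full-lstm-0",        # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=32,  # assumes a single device
    per_device_eval_batch_size=32,
    seed=0,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    max_steps=3052726,               # "training_steps" above
)
```
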
### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-------:|:---------------:|
| 4.8015 | 0.03 | 76319 | 4.7666 |
| 4.5084 | 1.03 | 152638 | 4.4785 |
| 4.3627 | 0.03 | 228957 | 4.3422 |
| 4.2719 | 1.03 | 305276 | 4.2587 |
| 4.207 | 0.03 | 381595 | 4.2008 |
| 4.157 | 1.03 | 457914 | 4.1598 |
| 4.1211 | 0.03 | 534233 | 4.1288 |
| 4.0934 | 0.03 | 610552 | 4.1043 |
| 4.0625 | 1.03 | 686871 | 4.0851 |
| 4.0403 | 0.03 | 763190 | 4.0690 |
| 4.0192 | 0.03 | 839509 | 4.0562 |
| 4.0046 | 0.03 | 915828 | 4.0454 |
| 3.9833 | 0.03 | 992148 | 4.0356 |
| 3.9707 | 1.03 | 1068468 | 4.0273 |
| 3.9639 | 0.03 | 1144788 | 4.0195 |
| 3.9539 | 1.03 | 1221108 | 4.0139 |
| 3.9372 | 0.03 | 1297428 | 4.0088 |
| 3.9294 | 1.03 | 1373748 | 4.0042 |
| 3.9158 | 0.03 | 1450068 | 4.0000 |
| 3.9084 | 1.03 | 1526388 | 3.9959 |
| 3.9068 | 0.03 | 1602708 | 3.9934 |
| 3.9023 | 0.03 | 1679028 | 3.9910 |
| 3.9014 | 0.03 | 1755348 | 3.9886 |
| 3.8972 | 1.03 | 1831668 | 3.9863 |
| 3.8877 | 0.03 | 1907988 | 3.9832 |
| 3.8826 | 1.03 | 1984308 | 3.9816 |
| 3.8796 | 0.03 | 2060628 | 3.9805 |
| 3.8705 | 0.03 | 2136948 | 3.9788 |
| 3.8694 | 0.03 | 2213268 | 3.9773 |
| 3.863 | 1.03 | 2289588 | 3.9766 |
| 3.857 | 0.03 | 2365908 | 3.9751 |
| 3.8545 | 1.03 | 2442228 | 3.9737 |
| 3.8461 | 0.03 | 2518548 | 3.9731 |
| 3.8429 | 1.03 | 2594868 | 3.9722 |
| 3.8378 | 0.03 | 2671188 | 3.9716 |
| 3.8349 | 0.03 | 2747508 | 3.9707 |
| 3.8407 | 0.03 | 2823828 | 3.9704 |
| 3.8386 | 1.03 | 2900148 | 3.9700 |
| 3.8431 | 0.03 | 2976468 | 3.9696 |
| 3.8437 | 0.02 | 3052726 | 3.9694 |
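
The validation loss falls steeply over the first few hundred thousand steps and then flattens. A minimal matplotlib sketch plotting a subset of the (step, validation loss) pairs from the table above:

```python
import matplotlib.pyplot as plt

# A subset of (step, validation loss) pairs taken from the table above.
steps = [76319, 763190, 1526388, 2289588, 3052726]
val_loss = [4.7666, 4.0690, 3.9959, 3.9766, 3.9694]

plt.plot(steps, val_loss, marker="o")
plt.xlabel("Training step")
plt.ylabel("Validation loss")
plt.title("full-lstm-0 validation loss vs. step")
plt.show()
```
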
### Framework versions

- Transformers 4.33.3
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.3

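
To verify a local environment against these versions, a minimal check (assuming all four packages are installed):

```python
import datasets
import tokenizers
import torch
import transformers

# Expected per this card: 4.33.3 / 2.0.1 / 2.12.0 / 0.13.3
for module in (transformers, torch, datasets, tokenizers):
    print(module.__name__, module.__version__)
```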