# t5-small-finetuned-stock-news-3

This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.5388
- Rouge1: 46.5843
- Rouge2: 39.8053
- Rougel: 45.0579
- Rougelsum: 45.1153
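The ROUGE values above are on a 0–100 scale. As a reminder of what they measure, here is a minimal pure-Python sketch of ROUGE-1 F1 (unigram overlap); the card's scores were presumably computed with a proper ROUGE library, which also applies stemming and computes the longest-common-subsequence variants ROUGE-L and ROUGE-Lsum:

```python
from collections import Counter

def rouge1_f(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: harmonic mean of unigram precision and recall."""
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    overlap = sum((ref & cand).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# 3 of 3 candidate tokens match, 3 of 5 reference tokens are covered.
print(rouge1_f("shares of acme rose sharply", "acme shares rose"))  # → 0.75
```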
## Model description
More information needed
## Intended uses & limitations
More information needed
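Although the card leaves intended uses unspecified, the model name suggests summarization of stock-news text. A hedged usage sketch with the `transformers` pipeline — the task and generation settings here are assumptions, not documented behavior:

```python
from transformers import pipeline

# Assumption: the model summarizes stock/financial news; the task is
# inferred from the model name, not stated anywhere in this card.
summarizer = pipeline(
    "summarization",
    model="Kallia/t5-small-finetuned-stock-news-3",
)

article = (
    "Acme Corp shares jumped 8% on Tuesday after the company reported "
    "quarterly earnings well above analyst estimates."
)
print(summarizer(article, max_new_tokens=30)[0]["summary_text"])
```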
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 6
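The hyperparameters above can be reconstructed as a `Seq2SeqTrainingArguments` configuration. This is a hypothetical sketch: `output_dir`, the per-epoch evaluation strategy, and `predict_with_generate` are assumptions not stated in the card.

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the listed training configuration.
# output_dir, eval_strategy, and predict_with_generate are assumptions.
args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned-stock-news-3",
    learning_rate=4e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=6,
    eval_strategy="epoch",
    predict_with_generate=True,  # needed so eval decodes text for ROUGE
)
```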
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| 0.6486        | 1.0   | 2061  | 0.5804          | 46.121  | 39.3055 | 44.5881 | 44.6641   |
| 0.5558        | 2.0   | 4122  | 0.5550          | 46.3546 | 39.6639 | 44.9562 | 44.9928   |
| 0.518         | 3.0   | 6183  | 0.5472          | 46.6276 | 39.8816 | 45.1918 | 45.251    |
| 0.4974        | 4.0   | 8244  | 0.5384          | 46.6196 | 39.7899 | 45.0584 | 45.1347   |
| 0.4794        | 5.0   | 10305 | 0.5385          | 46.6275 | 39.7555 | 45.0597 | 45.1392   |
| 0.4714        | 6.0   | 12366 | 0.5388          | 46.5843 | 39.8053 | 45.0579 | 45.1153   |
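As a consistency check, the step counts in the table follow directly from the hyperparameters: 2061 optimizer steps per epoch over 6 epochs yields the final step count, and with a batch size of 4 this implies roughly 8.2k training examples (an upper bound, since the last batch may be partial):

```python
steps_per_epoch = 2061  # step count at epoch 1.0 in the table
batch_size = 4          # train_batch_size from the hyperparameters
epochs = 6              # num_epochs from the hyperparameters

total_steps = steps_per_epoch * epochs
print(total_steps)  # → 12366, matching the table's final row

approx_train_examples = steps_per_epoch * batch_size
print(approx_train_examples)  # → 8244 (upper bound on training-set size)
```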
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.1
- Tokenizers 0.21.1
## Base model

- google-t5/t5-small (fine-tuned as Kallia/t5-small-finetuned-stock-news-3)