# crossencoder-airline-refine-020
This model is a fine-tuned version of [cross-encoder/stsb-roberta-large](https://huggingface.co/cross-encoder/stsb-roberta-large) (the fine-tuning dataset is not documented). It achieves the following results on the evaluation set:
- Loss: 2.2676
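Since the base model exposes the standard sentence-transformers `CrossEncoder` interface, the fine-tuned checkpoint can presumably be loaded the same way. Below is a minimal usage sketch, assuming the checkpoint is published as `pjbhaumik/crossencoder-airline-refine-020`; the airline-themed sentence pairs are made up for illustration.

```python
# Minimal usage sketch, assuming the checkpoint follows the standard
# sentence-transformers CrossEncoder interface (the example pairs below
# are hypothetical).
from sentence_transformers import CrossEncoder

model = CrossEncoder("pjbhaumik/crossencoder-airline-refine-020")

# predict() scores each (sentence_a, sentence_b) pair; higher scores
# indicate stronger similarity/relevance under the fine-tuned objective.
scores = model.predict([
    ("My flight was delayed by two hours",
     "The departure was pushed back 120 minutes"),
    ("My flight was delayed by two hours",
     "The in-flight meal was excellent"),
])
print(scores)
```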
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 5e-08
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30
- mixed_precision_training: Native AMP
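
For reference, the hyperparameters above map onto Hugging Face `TrainingArguments` roughly as follows. This is a minimal sketch, assuming the model was trained with the `transformers` Trainer (which this card's format suggests); the `output_dir` value is a placeholder, not taken from the source.

```python
# Sketch of TrainingArguments matching the listed hyperparameters,
# assuming a standard transformers Trainer setup. output_dir is a
# placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="crossencoder-airline-refine-020",
    learning_rate=5e-8,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # matches the per-epoch validation losses below
)
```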
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 13.5066 | 1.0 | 157 | 12.4148 |
| 10.2651 | 2.0 | 314 | 10.5345 |
| 10.6519 | 3.0 | 471 | 9.2623 |
| 11.1633 | 4.0 | 628 | 8.3263 |
| 9.0311 | 5.0 | 785 | 7.5766 |
| 8.2421 | 6.0 | 942 | 6.9271 |
| 7.7625 | 7.0 | 1099 | 6.3252 |
| 7.1628 | 8.0 | 1256 | 5.7654 |
| 6.3684 | 9.0 | 1413 | 5.2496 |
| 6.1002 | 10.0 | 1570 | 4.7695 |
| 4.7647 | 11.0 | 1727 | 4.3481 |
| 3.9525 | 12.0 | 1884 | 3.9878 |
| 4.4235 | 13.0 | 2041 | 3.6682 |
| 5.2694 | 14.0 | 2198 | 3.3836 |
| 4.1843 | 15.0 | 2355 | 3.1348 |
| 3.5038 | 16.0 | 2512 | 2.9473 |
| 3.548 | 17.0 | 2669 | 2.7790 |
| 3.485 | 18.0 | 2826 | 2.6405 |
| 3.4203 | 19.0 | 2983 | 2.5244 |
| 4.2155 | 20.0 | 3140 | 2.4407 |
| 2.7964 | 21.0 | 3297 | 2.3585 |
| 2.8937 | 22.0 | 3454 | 2.3019 |
| 3.5284 | 23.0 | 3611 | 2.2585 |
| 2.6299 | 24.0 | 3768 | 2.2254 |
| 3.1668 | 25.0 | 3925 | 2.2049 |
| 2.7021 | 26.0 | 4082 | 2.1902 |
| 2.4671 | 27.0 | 4239 | 2.1857 |
| 3.0145 | 28.0 | 4396 | 2.1822 |
| 2.7435 | 29.0 | 4553 | 2.1828 |
| 2.7928 | 30.0 | 4710 | 2.1821 |
### Framework versions
- Transformers 4.38.1
- Pytorch 2.0.1
- Datasets 2.17.1
- Tokenizers 0.15.2
## Model tree for pjbhaumik/crossencoder-airline-refine-020

- Base model: [FacebookAI/roberta-large](https://huggingface.co/FacebookAI/roberta-large)
- Fine-tuned from: [cross-encoder/stsb-roberta-large](https://huggingface.co/cross-encoder/stsb-roberta-large)