# bert-for-patents-finetuned_r-as
This model is a fine-tuned version of anferico/bert-for-patents for a binary classification task: identifying patents related to the "Robotics and Autonomous Systems" (RA-SYS) domain. It was fine-tuned on patent titles and abstracts from a custom dataset of 46673 patent applications, of which 11387 were labeled "RA-SYS-related" and 35000 "Not RA-SYS-related".
It achieves the following results on the evaluation set:
- Loss: 0.0669
- Accuracy: 0.975
- AUC: 0.996
- F1: 0.947
- Precision: 0.96
- Recall: 0.935
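
For reference, here is a minimal inference sketch (an illustration, not code from the model author). It assumes the checkpoint loads with `AutoModelForSequenceClassification` and that label index 1 corresponds to "RA-SYS-related"; check `model.config.id2label` for the actual mapping.

```python
# Minimal inference sketch (illustrative, not the author's code). Assumes
# label index 1 means "RA-SYS-related"; check model.config.id2label.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "Fradalessandro/bert-for-patents-finetuned_r-as"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# The model was fine-tuned on titles and abstracts, so pass both together.
text = ("Autonomous navigation system. A mobile robot that plans "
        "collision-free paths from lidar point clouds.")
inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    probs = torch.softmax(model(**inputs).logits, dim=-1)
print(probs)  # e.g. tensor([[0.03, 0.97]]) -> likely RA-SYS-related
```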
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
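
These settings map onto Hugging Face `TrainingArguments` roughly as shown below. This is a hedged reproduction sketch, not the author's actual training code: the toy dataset, `output_dir`, and per-epoch `eval_strategy` are assumptions made to keep the snippet self-contained; only the hyperparameter values come from this card.

```python
# Hedged sketch of the training setup. Only the hyperparameter values are
# taken from the card; everything else (toy data, output_dir, eval_strategy)
# is an assumption for illustration.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("anferico/bert-for-patents")
model = AutoModelForSequenceClassification.from_pretrained(
    "anferico/bert-for-patents", num_labels=2)

# Toy stand-in for the real titles+abstracts dataset (assumption).
raw = Dataset.from_dict({
    "text": ["A lidar-guided autonomous mobile robot.",
             "A pharmaceutical composition for treating asthma."],
    "label": [1, 0],
})
ds = raw.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512))

args = TrainingArguments(
    output_dir="bert-for-patents-finetuned_r-as",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",          # AdamW, betas=(0.9, 0.999), eps=1e-8 (library defaults)
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,                    # Native AMP; requires a GPU
    eval_strategy="epoch",        # assumed, consistent with per-epoch results below
)

Trainer(model=model, args=args, processing_class=tokenizer,
        train_dataset=ds, eval_dataset=ds).train()
```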
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | AUC | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|---|
| 0.3652 | 1.0 | 1015 | 0.1716 | 0.93 | 0.988 | 0.838 | 0.971 | 0.738 |
| 0.1382 | 2.0 | 2030 | 0.1070 | 0.959 | 0.993 | 0.912 | 0.97 | 0.861 |
| 0.1048 | 3.0 | 3045 | 0.0853 | 0.968 | 0.994 | 0.934 | 0.96 | 0.909 |
| 0.0946 | 4.0 | 4060 | 0.0770 | 0.971 | 0.995 | 0.94 | 0.951 | 0.929 |
| 0.0899 | 5.0 | 5075 | 0.0731 | 0.973 | 0.995 | 0.944 | 0.952 | 0.936 |
| 0.0854 | 6.0 | 6090 | 0.0706 | 0.972 | 0.996 | 0.943 | 0.958 | 0.928 |
| 0.0836 | 7.0 | 7105 | 0.0693 | 0.974 | 0.996 | 0.945 | 0.962 | 0.929 |
| 0.0828 | 8.0 | 8120 | 0.0684 | 0.975 | 0.996 | 0.947 | 0.967 | 0.929 |
| 0.0821 | 9.0 | 9135 | 0.0677 | 0.975 | 0.996 | 0.948 | 0.967 | 0.93 |
| 0.0794 | 10.0 | 10150 | 0.0669 | 0.975 | 0.996 | 0.947 | 0.96 | 0.935 |
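
The reported metrics can be computed with scikit-learn along these lines (an assumed implementation; this card does not show the author's actual `compute_metrics`):

```python
# Hedged sketch of a compute_metrics function for the Trainer that yields
# the metrics in the table above; an assumption, not the author's code.
import numpy as np
from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                             recall_score, roc_auc_score)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    # Numerically stable softmax; column 1 is the "RA-SYS-related" score.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=-1, keepdims=True)
    preds = logits.argmax(axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "auc": roc_auc_score(labels, probs[:, 1]),
        "f1": f1_score(labels, preds),
        "precision": precision_score(labels, preds),
        "recall": recall_score(labels, preds),
    }
```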
### Framework versions
- Transformers 4.52.4
- PyTorch 2.6.0+cu124
- Datasets 3.6.0.dev0
- Tokenizers 0.21.1