---
library_name: transformers
license: apache-2.0
metrics:
- perplexity
base_model:
- facebook/esm1b_t33_650M_UR50S
---
## **Pretraining on Combined Phosphosite Data**
ESM-1b is trained from scratch with a masked language modeling (MLM) objective. The training data combines the phosphosite datasets used to train **isikz/esm1b_msa_mlm_pt_phosphosite** and **isikz/esm1b_mlm_pt_phosphosite**, for a total of 1,055,221 sequences.
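As a rough illustration of the MLM objective mentioned above, the sketch below applies BERT-style corruption to a protein sequence: a fraction of positions is selected, and each selected token is replaced by a mask token (80%), a random amino acid (10%), or left unchanged (10%). The masking ratio, the toy vocabulary, and the example sequence are assumptions for illustration, not taken from this model's training code (the real pipeline uses the ESM tokenizer and its special tokens).

```python
import random

# Canonical 20 amino acids as a toy vocabulary; the real ESM-1b tokenizer
# also includes special tokens such as <cls>, <eos>, and <mask>.
AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWY")
MASK_TOKEN = "<mask>"

def mask_sequence(tokens, mask_prob=0.15, rng=None):
    """BERT-style MLM corruption (assumed ratios: 15% selected;
    of those, 80% -> <mask>, 10% -> random amino acid, 10% -> unchanged).
    Returns (corrupted, labels): labels holds the original token at
    selected positions and None elsewhere, so the loss is computed
    only on corrupted positions."""
    rng = rng or random.Random(0)
    corrupted = list(tokens)
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok
            r = rng.random()
            if r < 0.8:
                corrupted[i] = MASK_TOKEN
            elif r < 0.9:
                corrupted[i] = rng.choice(AMINO_ACIDS)
            # else: token is kept as-is but still contributes to the loss
    return corrupted, labels

# Hypothetical example sequence, for illustration only.
seq = list("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
corrupted, labels = mask_sequence(seq, rng=random.Random(42))
```

The perplexity metric listed in the metadata is then the exponentiated cross-entropy of the model's predictions at these corrupted positions.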