---
library_name: transformers
license: apache-2.0
metrics:
  - perplexity
base_model:
  - facebook/esm1b_t33_650M_UR50S
---

# Pretraining on Combined Phosphosite Data

ESM-1b is trained from scratch with a masked language modeling (MLM) objective. The training data is the combination of the phosphosite datasets used to train isikz/esm1b_msa_mlm_pt_phosphosite and isikz/esm1b_mlm_pt_phosphosite, totaling 1,055,221 sequences.
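The model card reports perplexity as the evaluation metric. For an MLM-pretrained model, perplexity is simply the exponential of the mean per-token cross-entropy loss on masked positions; a minimal sketch (the `mlm_perplexity` helper and the example loss values are illustrative, not from the actual training run):

```python
import math

def mlm_perplexity(masked_token_losses):
    """Perplexity = exp(mean negative log-likelihood over masked tokens)."""
    return math.exp(sum(masked_token_losses) / len(masked_token_losses))

# Hypothetical per-masked-token losses (in nats) from an evaluation pass.
losses = [2.1, 1.8, 2.4, 2.0]
print(round(mlm_perplexity(losses), 3))
```

In practice these losses would come from the model's MLM head evaluated on held-out phosphosite sequences.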