A pre-trained BERT model for Uzbek (12 layers, cased), trained on a large news corpus (Daryo).
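As a minimal usage sketch: since the card does not give the model's Hub repo id, the `model_id` below is a placeholder, and the Uzbek example sentence is only illustrative. Assuming the model is published on the Hugging Face Hub in the standard `transformers` format, it can be loaded and queried with the `fill-mask` pipeline:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Placeholder repo id -- replace with this model's actual Hub id.
model_id = "<username>/uzbek-bert-base-cased"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill-mask example: predict the token hidden behind [MASK].
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Illustrative Uzbek input: "Tashkent is considered Uzbekistan's [MASK]."
for prediction in fill_mask("Toshkent O'zbekistonning [MASK] hisoblanadi."):
    print(prediction["token_str"], prediction["score"])
```

The `[MASK]` placeholder in the input is BERT's standard mask token; the pipeline returns the highest-scoring candidate tokens for that position.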