yarongef committed
Commit 026a2d8 · Parent: b3d512f

Update README.md

Files changed (1)
  1. README.md +5 -5
README.md CHANGED
@@ -19,11 +19,11 @@ Access to [git](https://github.com/yarongef/DistilProtBert)
 DistilProtBert was pretrained on millions of protein sequences.
 
 Differences between DistilProtBert model and ProtBert:
-1. Size of the model:
-- 230M parameters (420M parameters in ProtBert)
-- 15 hidden layers (30 hidden layers in ProtBert)
-2. Size of the pretraining dataset: ~43M proteins (ProtBert was pretrained on 216M proteins)
-3. Hardware used for pretraining: five v100 32GB Nvidia GPUs (ProtBert was pretrained on 512 16GB TPUs)
+
+| **Model**      | **Parameters** | **Hidden layers** | **Pretraining sequences** | **Pretraining hardware** |
+|:--------------:|:--------------:|:-----------------:|:-------------------------:|:------------------------:|
+| ProtBert       | 420M           | 30                | 216M                      | 512 16GB TPUs            |
+| DistilProtBert | 230M           | 15                | ~43M                      | 5 V100 32GB GPUs         |
 
 ## Intended uses & limitations
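For scale, the size difference reported in the comparison table works out to roughly a 45% parameter reduction. A minimal sketch, using the rounded figures from the table (not exact parameter counts):

```python
# Rounded model sizes from the DistilProtBert vs. ProtBert comparison table.
protbert_params = 420e6        # ProtBert: ~420M parameters, 30 hidden layers
distilprotbert_params = 230e6  # DistilProtBert: ~230M parameters, 15 hidden layers

# Fraction of parameters removed by distillation.
reduction = 1 - distilprotbert_params / protbert_params
print(f"Parameter reduction: {reduction:.1%}")  # → Parameter reduction: 45.2%

# Hidden layers are exactly halved (15 of 30).
print(f"Layer reduction: {1 - 15 / 30:.0%}")  # → Layer reduction: 50%
```

So the distilled model keeps about 55% of ProtBert's parameters while halving its depth.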