Update README.md
README.md CHANGED
@@ -3,7 +3,7 @@ license: apache-2.0
 ---
 <h2>GatorTron-Large overview </h2>
 
-Developed by a joint effort between the University of Florida and NVIDIA, GatorTron-
+Developed by a joint effort between the University of Florida and NVIDIA, GatorTron-Large is a clinical language model of 8.9 billion parameters, pre-trained using a BERT architecture implemented in the Megatron package (https://github.com/NVIDIA/Megatron-LM).
 
 GatorTron-Large is pre-trained using a dataset consisting of: