Commit 3bea431 · 1 parent: cf156a6
Update README.md
README.md CHANGED
```diff
@@ -13,7 +13,7 @@ tags:
 
 
 [DisorBERT](https://aclanthology.org/2023.acl-long.853/)
-is a double-domain adaptation of a language model. First, it is adapted to social media language, and then to the mental health domain. In both steps, it incorporates a lexical resource to guide the masking process of the language model, helping it pay more attention to words related to mental disorders.
+is a double-domain adaptation of a BERT language model. First, it is adapted to social media language, and then to the mental health domain. In both steps, it incorporates a lexical resource to guide the masking process of the language model, helping it pay more attention to words related to mental disorders.
 
 We follow the standard procedure for fine-tuning a masked language model from [Huggingface’s NLP Course](https://huggingface.co/learn/nlp-course/chapter7/3?fw=pt).
```
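The lexicon-guided masking idea described above can be sketched in plain Python. This is an illustrative assumption, not the authors' implementation: the function name, the `boost` factor, and the toy lexicon are all hypothetical, and a real MLM pipeline would mask token IDs inside a data collator rather than string tokens.

```python
import random

def lexicon_guided_masking(tokens, lexicon, mask_prob=0.15, boost=3.0, seed=0):
    """Pick positions to mask for MLM training, weighting lexicon words higher.

    Hypothetical sketch: words found in the mental-health lexicon are masked
    with `boost` times the base probability, nudging the model to pay more
    attention to disorder-related vocabulary during pre-training.
    """
    rng = random.Random(seed)
    out = []
    for tok in tokens:
        p = mask_prob * boost if tok.lower() in lexicon else mask_prob
        out.append("[MASK]" if rng.random() < min(p, 1.0) else tok)
    return out

# Toy usage: lexicon terms are far more likely to be replaced by [MASK].
lexicon = {"anxiety", "depression", "insomnia"}
tokens = "i have been dealing with anxiety and insomnia lately".split()
print(lexicon_guided_masking(tokens, lexicon))
```

In a Huggingface setup, the same weighting would typically be applied inside a custom `DataCollatorForLanguageModeling` subclass that adjusts the per-token masking probabilities before sampling.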