EzraAragon committed on
Commit 63cca37 · 1 Parent(s): fb6bcf3

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -11,12 +11,12 @@ tags:
 
 <img style="float: left;" src="https://cdn-uploads.huggingface.co/production/uploads/64b946226b5ee8c388730ec1/uXCiWXUGrzhh6SE7ymBy_.png" width="150"/>
 
- [DisorBERT](https://aclanthology.org/2023.acl-long.853/) We propose a double-domain adaptation of a language model. First, we adapted the model to social media language, and then, we adapted it to the mental health domain. In both steps, we incorporated a lexical resource to guide the masking process of the language model and, therefore, to help it in paying more attention to words related to mental disorders.
+ [DisorBERT](https://aclanthology.org/2023.acl-long.853/) is a double-domain adaptation of a language model. First, it is adapted to social media language, and then to the mental health domain. In both steps, it incorporates a lexical resource to guide the masking process of the language model and, therefore, to help it pay more attention to words related to mental disorders.
 
 We follow the standard fine-tuning of a masked language model from [Huggingface’s NLP Course](https://huggingface.co/learn/nlp-course/chapter7/3?fw=pt).
 
 We used the models provided by HuggingFace Transformers v4.24.0 and PyTorch v1.13.0.
- In particular, for training the model we used a batch size of 256, Adam optimizer, with a learning rate of 1e<sup>-5</sup>, and cross-entropy as a loss function. We trained the models for three epochs using a GPU NVIDIA Tesla V100 32GB SXM2.
+ In particular, for training the model we used a batch size of 256, the Adam optimizer with a learning rate of 1e<sup>-5</sup>, and cross-entropy as the loss function. We trained the model for three epochs using an NVIDIA Tesla V100 32GB SXM2 GPU.
 
 ## Usage
 # Use a pipeline as a high-level helper
 from transformers import pipeline
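The Usage lines in the hunk above are the standard `transformers` pipeline pattern for a masked language model. A minimal runnable sketch of how that snippet typically continues is shown below; the checkpoint id `EzraAragon/DisorBERT` is a placeholder inferred from the committer's namespace, not confirmed by this page.

```python
# Minimal fill-mask sketch for DisorBERT. The repo id below is a placeholder;
# substitute the actual checkpoint name published on the Hub.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="EzraAragon/DisorBERT")

# BERT-style checkpoints use the [MASK] token.
for prediction in fill_mask("I have been feeling very [MASK] lately."):
    print(f"{prediction['token_str']}: {prediction['score']:.3f}")
```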
@@ -42,7 +42,7 @@ For more details, refer to the paper [DisorBERT: A Double Domain Adaptation Mode
 Losada, David E. and
 Montes, Manuel",
 booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
- month = jul,
+ month = Jul,
 year = "2023",
 address = "Toronto, Canada",
 publisher = "Association for Computational Linguistics",
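For readers who want to reproduce the fine-tuning described in the first hunk (batch size 256, Adam at a learning rate of 1e-5, cross-entropy loss, three epochs), a rough sketch of the standard HF masked-LM recipe the README cites follows. This is an illustration under assumptions, not the authors' script: the base checkpoint and the toy corpus are placeholders, and the paper's lexicon-guided masking is replaced by the collator's default random masking.

```python
# Sketch of the standard masked-LM fine-tuning recipe from the HF NLP Course,
# with the hyperparameters reported in the README. Placeholders: the base
# checkpoint ("bert-base-uncased") and the two-line toy corpus. The paper's
# lexicon-guided masking is NOT reproduced here; DataCollatorForLanguageModeling
# applies plain random masking instead.
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Toy stand-in for the social-media / mental-health corpora.
texts = ["an example social media post", "another post mentioning anxiety"]
encodings = tokenizer(texts, truncation=True, padding=True)
train_dataset = [
    {"input_ids": ids, "attention_mask": mask}
    for ids, mask in zip(encodings["input_ids"], encodings["attention_mask"])
]

# Random masking; MLM labels and the cross-entropy loss are handled by the model.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="disorbert-mlm",
    per_device_train_batch_size=256,  # batch size from the README
    learning_rate=1e-5,               # Adam(W) learning rate from the README
    num_train_epochs=3,               # three epochs from the README
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=collator,
)
trainer.train()
```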