Commit · b8cc10f
Parent(s): 63cca37
Update README.md
README.md CHANGED
@@ -18,18 +18,18 @@ We follow the standard fine-tuning a masked language model of [Huggingface’s N
 We used the models provided by HuggingFace Transformers v4.24.0 and PyTorch v1.13.0.
 In particular, for training the model we used a batch size of 256, the Adam optimizer with a learning rate of 1e-5, and cross-entropy as the loss function. We trained the model for three epochs on an NVIDIA Tesla V100 32GB SXM2 GPU.
 
-
-
+# Usage
+## Use a pipeline as a high-level helper
+from transformers import pipeline
 pipe = pipeline("fill-mask", model="citiusLTL/DisorBERT")
 
-
-# Load model directly
+## Load model directly
 from transformers import AutoTokenizer, AutoModelForMaskedLM
 
 tokenizer = AutoTokenizer.from_pretrained("citiusLTL/DisorBERT")
 model = AutoModelForMaskedLM.from_pretrained("citiusLTL/DisorBERT")
 
-
+# Paper
 
 For more details, refer to the paper [DisorBERT: A Double Domain Adaptation Model for Detecting Signs of Mental Disorders in Social Media](https://aclanthology.org/2023.acl-long.853/).
 
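For readers who want to reproduce the kind of setup the README describes (batch size 256, Adam with a learning rate of 1e-5, cross-entropy loss, three epochs, Transformers v4.24.0 / PyTorch v1.13.0), the sketch below shows a typical masked-language-model fine-tuning loop with the Hugging Face `Trainer`. It is an illustration under stated assumptions, not the authors' script: the base checkpoint, the toy corpus, and the output path are placeholders, and `Trainer` uses AdamW rather than plain Adam by default.

```python
from datasets import Dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Placeholder base checkpoint; the actual base model and the
# domain-adaptation corpora are described in the paper, not in this commit.
base = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForMaskedLM.from_pretrained(base)

# Toy corpus purely so the sketch runs end to end.
corpus = Dataset.from_dict({"text": [
    "I have been feeling anxious lately.",
    "Sleep has been difficult this week.",
]})
tokenized = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

# The collator applies random masking; the masked-token objective is the
# model's built-in cross-entropy over the vocabulary.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="disorbert-mlm",        # placeholder output path
    per_device_train_batch_size=256,   # batch size reported in the README
    learning_rate=1e-5,                # learning rate reported in the README
    num_train_epochs=3,                # three epochs, as reported
    report_to="none",
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()
```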
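As a quick check of the fill-mask snippet added in this commit, the call below shows one way to use the pipeline. The masked sentence is illustrative only, and the mask token is taken from the pipeline's tokenizer rather than hard-coded.

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="citiusLTL/DisorBERT")

# Build a masked sentence using the model's own mask token.
sentence = f"I feel {pipe.tokenizer.mask_token} all the time."

# The fill-mask pipeline returns the top candidate tokens with scores.
for pred in pipe(sentence):
    print(f"{pred['token_str']:>12}  score={pred['score']:.3f}")
```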
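The direct-loading snippet can be exercised the same way by scoring the masked position manually; again, the input sentence is only an illustration and the top-5 cut-off is an arbitrary choice.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("citiusLTL/DisorBERT")
model = AutoModelForMaskedLM.from_pretrained("citiusLTL/DisorBERT")
model.eval()

# Illustrative input; the mask token comes from the tokenizer.
text = f"Lately I cannot {tokenizer.mask_token} at night."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and take the five highest-scoring tokens.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_pos].topk(5).indices[0].tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))
```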