MatteoFasulo committed
Commit ef74f66 · verified · Parent(s): 13eede4

Update README.md

Files changed (1): README.md (+32 −31)

README.md CHANGED
@@ -16,6 +16,8 @@ tags:
 model-index:
 - name: mdeberta-v3-base-subjectivity-german
   results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -58,32 +60,27 @@ The model was trained and evaluated on datasets provided for the **CLEF 2025 Che
 You can use this model directly with the Hugging Face `transformers` library for text classification:
 
 ```python
-from transformers import AutoTokenizer, AutoModelForSequenceClassification
-import torch
-
-model_name = "MatteoFasulo/mdeberta-v3-base-subjectivity-german"
-tokenizer = AutoTokenizer.from_pretrained(model_name)
-model = AutoModelForSequenceClassification.from_pretrained(model_name)
-
-# Example usage for an objective sentence:
-text_objective = "Der Bundeskanzler traf sich heute mit dem französischen Präsidenten."  # The Chancellor met the French President today.
-inputs_obj = tokenizer(text_objective, return_tensors="pt")
-
-with torch.no_grad():
-    logits_obj = model(**inputs_obj).logits
-
-predicted_class_id_obj = logits_obj.argmax().item()
-print(f"'{text_objective}' is classified as: {model.config.id2label[predicted_class_id_obj]}")
-
-# Example usage for a subjective sentence:
-text_subjective = "Ich denke, dass diese Entscheidung eine Katastrophe ist."  # I think that this decision is a disaster.
-inputs_subj = tokenizer(text_subjective, return_tensors="pt")
-
-with torch.no_grad():
-    logits_subj = model(**inputs_subj).logits
-
-predicted_class_id_subj = logits_subj.argmax().item()
-print(f"'{text_subjective}' is classified as: {model.config.id2label[predicted_class_id_subj]}")
 ```
 
 ## Training procedure
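
The removed snippet reads the predicted class straight from the logits with `argmax`. A minimal sketch of the same step in plain Python (synthetic logits standing in for `model(**inputs).logits`, so nothing needs to be downloaded), showing how a softmax additionally turns the two OBJ/SUBJ logits into probabilities:

```python
import math

# Synthetic logits for one sentence from a two-label (OBJ/SUBJ) head;
# plain Python stand-in, no model download required.
logits = [2.0, -1.0]

# Softmax maps raw logits to probabilities that sum to 1.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# argmax over the probabilities picks the same class as argmax over logits.
predicted_class_id = max(range(len(probs)), key=probs.__getitem__)
print(predicted_class_id)   # 0 (the higher logit wins)
print(round(probs[0], 2))   # 0.95
```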
@@ -119,13 +116,17 @@ The following hyperparameters were used during training:
 - Tokenizers 0.21.0
 
 ## Citation
 If you find our work helpful or inspiring, please feel free to cite it:
 ```bibtex
-@article{aiwizards2025checkthat,
-  title={AI Wizards at CheckThat! 2025: Enhancing Transformer-Based Embeddings with Sentiment for Subjectivity Detection in News Articles},
-  author={Antoun, Wissam and Kulumba, Francis and Touchent, Rian and de la Clergerie, Éric and Sagot, Benoît and Seddah, Djamé},
-  journal={arXiv preprint arXiv:2507.11764},
-  year={2025},
-  url={https://arxiv.org/abs/2507.11764}
 }
 ```
 
 model-index:
 - name: mdeberta-v3-base-subjectivity-german
   results: []
+datasets:
+- MatteoFasulo/clef2025_checkthat_task1_subjectivity
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 
 You can use this model directly with the Hugging Face `transformers` library for text classification:
 
 ```python
+from transformers import pipeline
+
+# Load the text classification pipeline
+classifier = pipeline(
+    "text-classification",
+    model="MatteoFasulo/mdeberta-v3-base-subjectivity-german",
+    tokenizer="microsoft/mdeberta-v3-base",
+)
+
+# Example usage for an objective sentence
+text1 = "Das Unternehmen meldete im letzten Quartal einen Gewinnanstieg von 10 %."  # The company reported a 10% rise in profit last quarter.
+result1 = classifier(text1)
+print(f"Text: '{text1}' Classification: {result1}")
+# Expected output: [{'label': 'OBJ', 'score': ...}]
+
+# Example usage for a subjective sentence
+text2 = "Dieses Produkt ist absolut erstaunlich und jeder sollte es ausprobieren!"  # This product is absolutely amazing and everyone should try it!
+result2 = classifier(text2)
+print(f"Text: '{text2}' Classification: {result2}")
+# Expected output: [{'label': 'SUBJ', 'score': ...}]
 ```
 
 ## Training procedure
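
The added pipeline call returns only the top label per input; when scores for both labels are needed (for thresholding, say), the standard `transformers` text-classification pipeline accepts `top_k=None` and then returns one `[{'label': ..., 'score': ...}]` list per input. A small offline sketch of post-processing that shape — the `results` values below are hypothetical stand-ins, not real model outputs:

```python
# Hypothetical outputs shaped like transformers' text-classification
# results with top_k=None: one list of {label, score} dicts per input.
results = [
    [{"label": "OBJ", "score": 0.91}, {"label": "SUBJ", "score": 0.09}],
    [{"label": "SUBJ", "score": 0.84}, {"label": "OBJ", "score": 0.16}],
]

def subjective_score(scores):
    """Return the SUBJ probability from a list of {label, score} dicts."""
    return next(s["score"] for s in scores if s["label"] == "SUBJ")

# Apply a simple 0.5 decision threshold on the SUBJ probability.
labels = ["SUBJ" if subjective_score(r) >= 0.5 else "OBJ" for r in results]
print(labels)  # ['OBJ', 'SUBJ']
```

Keeping the threshold explicit makes it easy to trade precision for recall on the SUBJ class instead of always taking the argmax.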
 
 - Tokenizers 0.21.0
 
 ## Citation
+
 If you find our work helpful or inspiring, please feel free to cite it:
+
 ```bibtex
+@misc{fasulo2025aiwizardscheckthat2025,
+  title={AI Wizards at CheckThat! 2025: Enhancing Transformer-Based Embeddings with Sentiment for Subjectivity Detection in News Articles},
+  author={Matteo Fasulo and Luca Babboni and Luca Tedeschini},
+  year={2025},
+  eprint={2507.11764},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL},
+  url={https://arxiv.org/abs/2507.11764},
 }
 ```