For now; I'll add more things later.

#1
by neildlf - opened
Files changed (1)
  1. README.md +60 -0
README.md ADDED
@@ -0,0 +1,60 @@
---
license: apache-2.0
datasets:
- HiTZ/GuideX_pre-training_data
- ACE05
- bc5cdr
- conll2003
- ncbi_disease
- conll2012_ontonotesv5
- rams
- tacred
- wnut_17
language:
- en
metrics:
- f1
base_model:
- meta-llama/Llama-3.1-8B
tags:
- code
- text-generation-inference
- Information Extraction
- IE
- Named Entity Recognition
- Event Extraction
- Relation Extraction
- LLaMA
---
# Model Card for GuideX-8B

<p align="center">
<br>
<img src="https://cdn-uploads.huggingface.co/production/uploads/64fee0a7a10455384ebba184/IEq02_qWzwxeHd6T-UOVG.png" style="height: 150px;">
<h2 align="center">Guided Synthetic Data Generation for Zero-Shot Information Extraction</h2>
<br>
</p>

**Llama-3.1-GuideX-8B** is an 8-billion-parameter language model fine-tuned for high-performance zero-shot Information Extraction (IE). The model is trained to follow detailed annotation guidelines provided as Python dataclasses, allowing it to adapt to new domains and schemas on the fly without requiring task-specific examples.

This model achieves state-of-the-art performance on zero-shot Named Entity Recognition (NER) by first training on `GuideX`, a large-scale synthetic dataset with executable guidelines, and then fine-tuning on a collection of gold-standard IE datasets.
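A guideline-as-dataclass schema of the kind described above can be sketched as follows. The class names, docstrings, and field names here are illustrative only; the exact guidelines used in training are defined in the GuideX datasets.

```python
from dataclasses import dataclass

# Illustrative annotation guidelines expressed as Python dataclasses.
# The docstring of each class is the guideline the model must follow;
# these particular classes are examples, not the training schema itself.

@dataclass
class Disease:
    """A disorder of structure or function in a person, such as
    'diabetes' or 'chronic kidney disease'."""
    span: str  # the exact text span mentioning the disease

@dataclass
class Chemical:
    """A chemical substance or drug, such as 'aspirin'."""
    span: str  # the exact text span mentioning the chemical

# Given such definitions plus an input sentence, the model is expected
# to generate a list of instantiated annotations, for example:
annotations = [Disease(span="diabetes"), Chemical(span="aspirin")]
```

Because the schema is just code, swapping in a new domain means writing new dataclasses rather than collecting labeled examples.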

- **Homepage:** [https://neilus03.github.io/guidex.com/](https://neilus03.github.io/guidex.com/)
- **Paper:** [GuideX: Guided Synthetic Data Generation for Zero-Shot Information Extraction](https://arxiv.org/abs/2506.00649)
- **Code & Data:** The code and data for reproducing the GuideX methodology are available on the project homepage.

## Model Description

- **Developed by:** Neil De La Fuente, Oscar Sainz, Iker García-Ferrero, Eneko Agirre
- **Institution:** HiTZ Basque Center for Language Technology - Ixa NLP Group, University of the Basque Country (UPV/EHU), Technical University of Munich (TUM)
- **Model type:** Decoder-only Transformer (Text Generation)
- **Language(s):** English
- **License:** Llama 3.1 Community License
- **Finetuned from model:** `meta-llama/Llama-3.1-8B`
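Since the model consumes guidelines as source code, a prompt can be assembled by concatenating the guideline definitions with the text to annotate. The template below is a minimal sketch for illustration; the exact prompt format used in training is documented on the project homepage, and the `build_prompt` helper is hypothetical.

```python
import inspect
from dataclasses import dataclass

@dataclass
class Disease:
    """A disorder of structure or function, such as 'diabetes'."""
    span: str  # the exact text span mentioning the disease

def build_prompt(guideline_classes, text):
    """Hypothetical helper: join the source code of the guideline
    dataclasses with the input text into a single code-style prompt.
    The template is an assumption, not the official format."""
    definitions = "\n".join(inspect.getsource(c) for c in guideline_classes)
    return f"{definitions}\n# This is the text to analyze\ntext = {text!r}\nresult ="

prompt = build_prompt([Disease], "Patients with diabetes were enrolled.")
```

The resulting string would then be passed to the model as a standard text-generation input, with the model expected to complete the `result =` line with instantiated annotations.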