Update README.md
README.md CHANGED
@@ -102,7 +102,7 @@ The model was fine-tuned using the following sources:
 1. Synthetic sentences generated using OpenAI's GPT4o model, based on historic textile glossaries compiled from digitised books (2,504 examples)
 2. A subset of the Pile-NER-type dataset (4,000 examples, to avoid overfitting)
 
-Dataset card: [max-long/textile_glossaries_and_pile_ner](https://huggingface.co/datasets/
+Dataset card: [max-long/textile_glossaries_and_pile_ner](https://huggingface.co/datasets/congruence-engine/textile_glossaries_and_pile_ner)
 
 For a full description of how the synthetic data was generated, you can consult [this notebook](https://github.com/congruence-engine/universal-ner-with-gliner/blob/main/code/gliner_synthetic_data.ipynb). A Colab version is available [here](https://colab.research.google.com/drive/1SBRU3RMiWcwAskJ18UD2MgwME8FALt8J?usp=sharing).
 
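For anyone checking the corrected link, the dataset should be loadable with the Hugging Face `datasets` library. The sketch below is illustrative only: the repo id is taken from the updated URL in the diff, and the split and field names are assumptions rather than details confirmed by the dataset card.

```python
# Minimal sketch (assumption: the repo id matches the corrected URL in this commit).
from datasets import load_dataset

# Pull the fine-tuning data referenced in the README's "Dataset card" link.
ds = load_dataset("congruence-engine/textile_glossaries_and_pile_ner")

# Inspect the available splits and one sample record; the actual field names
# depend on how the dataset is structured on the Hub.
print(ds)
first_split = list(ds.keys())[0]
print(ds[first_split][0])
```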