agentlans committed
Commit cb6fdf9 · verified · 1 Parent(s): 0f821c2

Update README.md

Files changed (1): README.md +8 -9
README.md CHANGED
@@ -34,6 +34,8 @@ The model takes a paragraph as input and generates a list of keywords or key phr
 **Limitations:**
 - The model may sometimes generate irrelevant keywords
 - Performance may vary depending on the length and complexity of the input text
+ - For best results, use long clean texts
+ - Length limit is 512 tokens due to Flan-T5 architecture
 - The model is trained on English text and may not perform well on other languages

 ## Training and Evaluation
@@ -63,13 +65,11 @@ print(keywords)

 Example input paragraph:

- ```
- In the heart of the bustling city, a hidden gem awaits discovery: a quaint little bookstore that seems to have escaped the relentless march of time. As you step inside, the scent of aged paper and rich coffee envelops you, creating an inviting atmosphere that beckons you to explore its shelves. Each corner is adorned with carefully curated collections, from classic literature to contemporary bestsellers, inviting readers of all tastes to lose themselves in the pages of a good book. The soft glow of warm lighting casts a cozy ambiance, while the gentle hum of conversation among fellow book lovers adds to the charm. This bookstore is not just a place to buy books; it's a sanctuary for those seeking solace, inspiration, and a sense of community in the fast-paced world outside.
- ```
+ ```In the heart of the bustling city, a hidden gem awaits discovery: a quaint little bookstore that seems to have escaped the relentless march of time. As you step inside, the scent of aged paper and rich coffee envelops you, creating an inviting atmosphere that beckons you to explore its shelves. Each corner is adorned with carefully curated collections, from classic literature to contemporary bestsellers, inviting readers of all tastes to lose themselves in the pages of a good book. The soft glow of warm lighting casts a cozy ambiance, while the gentle hum of conversation among fellow book lovers adds to the charm. This bookstore is not just a place to buy books; it's a sanctuary for those seeking solace, inspiration, and a sense of community in the fast-paced world outside.```

 Example output keywords:

- `['quaint little bookstore', 'old paper coffee scent', 'curated collections', 'lovely hum of conversation', 'spiritual community', 'spiritual bookstore']`
+ `['old paper coffee scent', 'cosy hum of conversation', 'quaint bookstore', 'community in the fast-paced world', 'solace inspiration', 'curated collections']`

 ## Limitations and Bias

@@ -79,7 +79,6 @@ This model has been trained on English Wikipedia paragraphs, which may introduce

 - **Training Data:** dataset of Wikipedia paragraphs and keywords
 - **Training Procedure:** Fine-tuning of google/flan-t5-small
- - **Hyperparameters:** Not specified

 ### Training hyperparameters

@@ -94,10 +93,10 @@ The following hyperparameters were used during training:

 ### Framework versions

- - Transformers 4.44.2
- - Pytorch 2.2.2+cu121
- - Datasets 2.18.0
- - Tokenizers 0.19.1
+ - Transformers 4.45.1
+ - Pytorch 2.4.1+cu121
+ - Datasets 3.0.1
+ - Tokenizers 0.20.0

 ## Ethical Considerations
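For reference, here is a minimal sketch of the usage the `print(keywords)` hunk context points at. It is an assumption, not the card's exact snippet: the model ID is a hypothetical placeholder for this fine-tuned google/flan-t5-small checkpoint, and the truncation at 512 tokens mirrors the Flan-T5 input limit documented in this commit.

```python
# Hedged sketch of the README's usage pattern; "agentlans/flan-t5-small-keywords"
# is a hypothetical model ID -- substitute the actual repository name.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "agentlans/flan-t5-small-keywords"  # assumption, not confirmed by the diff
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "In the heart of the bustling city, a hidden gem awaits discovery: ..."

# Flan-T5 accepts at most 512 input tokens (the limitation this commit
# documents), so truncate longer paragraphs.
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
output_ids = model.generate(**inputs, max_new_tokens=64)
keywords = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(keywords)  # decoded keyword string, formatted per the card's example output
```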