update arxiv link
README.md CHANGED
@@ -34,7 +34,7 @@ English | [简体中文]()
 
 <div align="center">
 
-[📖 Technical Report]()
+[📖 Technical Report](https://arxiv.org/abs/2505.05427)
 <!-- | [💻 Github Repo]() -->
 
 </div>
@@ -96,7 +96,14 @@ Thanks for their awesome work! Open-source contributions make Ultra-FineWeb poss
 If you find our work useful, please consider citing:
 
 ```bibtex
-
+@misc{wang2025ultrafineweb,
+      title={{Ultra-FineWeb}: Efficient Data Filtering and Verification for High-Quality LLM Training Data},
+      author={Yudong Wang and Zixuan Fu and Jie Cai and Peijun Tang and Hongya Lyu and Yewei Fang and Zhi Zheng and Jie Zhou and Guoyang Zeng and Chaojun Xiao and Xu Han and Zhiyuan Liu},
+      year={2025},
+      eprint={2505.05427},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL},
+}
 ```
 
 ## 🐳 License