Add paper abstract and link to Github repository (#2)
Add paper abstract and link to Github repository (76a3858f0542137c071c3b01add084725fc143de)
Co-authored-by: Niels Rogge <[email protected]>
README.md
CHANGED
@@ -1,17 +1,18 @@
 ---
-license: mit
 language:
 - en
+library_name: transformers
+license: mit
 pipeline_tag: image-to-image
 tags:
 - multimodal
-library_name: transformers
 ---
 
 ## 🔥🔥🔥 News!!
 * Apr 25, 2025: 🎉 We release the inference code and model weights of Step1X-Edit. [inference code](https://github.com/stepfun-ai/Step1X-Edit)
 * Apr 25, 2025: 🎉 We have made our technical report available as open source. [Read](https://arxiv.org/abs/2504.17761)
 
+
 <!-- ## Image Edit Demos -->
 
 <div align="center">