Add pipeline tag and library name; include sample usage from Github
#1
by nielsr (HF staff) - opened
README.md CHANGED
@@ -1,5 +1,7 @@
 ---
 license: mit
+library_name: transformers
+pipeline_tag: text-to-3d
 ---
 
 Uni-3DAR
@@ -9,6 +11,9 @@ Uni-3DAR
 Introduction
 ------------
 
+<p align="center"><img src="fig/overview.png" width=95%></p>
+<p align="center"><b>Schematic illustration of the Uni-3DAR framework</b></p>
+
 Uni-3DAR is an autoregressive model that unifies various 3D tasks. In particular, it offers the following improvements:
 
 1. **Unified Handling of Multiple 3D Data Types.**
@@ -23,11 +28,63 @@ Uni-3DAR is an autoregressive model that unifies various 3D tasks. In particular
 4. **High Accuracy.**
 Building on octree compression, Uni-3DAR further tokenizes fine-grained 3D patches to maintain structural details, achieving substantially better generation quality than previous diffusion-based models.
 
-Usage
------
-
+
+News
+----
+
+**2025-03-21:** We have released the core model along with the QM9 training and inference pipeline.
+
+
+Dependencies
+------------
+
+- [Uni-Core](https://github.com/dptech-corp/Uni-Core). For convenience, you can use our prebuilt Docker image:
+  `docker pull dptechnology/unicore:2407-pytorch2.4.0-cuda12.5-rdma`
+
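
As a quick sketch of how this environment might be used: the image tag and the Uni-Core dependency come from the Dependencies entry above, while the `docker run` flags, the `/workspace` mount, and the assumption that the Uni-3DAR scripts live in the current directory are illustrative.

```
# Pull the prebuilt Uni-Core image named in the Dependencies section.
docker pull dptechnology/unicore:2407-pytorch2.4.0-cuda12.5-rdma

# Start an interactive GPU container with the local checkout mounted at
# /workspace so the inference/training scripts can be run inside it.
docker run --gpus all -it --rm \
  -v "$PWD":/workspace -w /workspace \
  dptechnology/unicore:2407-pytorch2.4.0-cuda12.5-rdma bash
```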
+Reproducing Results on QM9
+--------------------------
+
+To reproduce results on the QM9 dataset using our pretrained model or train from scratch, please follow the instructions below.
+
+### Download Pretrained Model and Dataset
+
+Download the pretrained checkpoint (`qm9.pt`) and the dataset archive (`qm9_data.tar.gz`) from our [Hugging Face repository](https://huggingface.co/dptech/Uni-3DAR/tree/main).
+
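
One possible way to fetch both files non-interactively; the repository id `dptech/Uni-3DAR` is taken from the link above, and the `huggingface_hub` CLI is assumed to be installed.

```
# Download the pretrained checkpoint and the QM9 archive into the current directory.
pip install -U huggingface_hub
huggingface-cli download dptech/Uni-3DAR qm9.pt qm9_data.tar.gz --local-dir .
```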
+### Inference with Pretrained Model
+
+To generate QM9 molecules using the pretrained model:
+
+```
+bash inference_qm9.sh qm9.pt
+```
+
+### Train from Scratch
+
+To train the model from scratch:
+
+1. Extract the dataset:
+```
+tar -xzvf qm9_data.tar.gz
+```
+
+2. Run the training script with your desired data path and experiment name:
+
+```
+base_dir=/your_folder_to_save/ bash train_qm9.sh ./qm9_data/ name_of_your_exp
+```
 
 
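
Putting the steps above together, an end-to-end run could look like the sketch below; the `./outputs/` directory and the experiment name `qm9_scratch` are placeholders, and the remaining commands mirror the ones shown in the diff.

```
# Fetch the checkpoint and dataset, unpack, sample with the released
# checkpoint, and (optionally) train from scratch.
huggingface-cli download dptech/Uni-3DAR qm9.pt qm9_data.tar.gz --local-dir .
tar -xzvf qm9_data.tar.gz
bash inference_qm9.sh qm9.pt
base_dir=./outputs/ bash train_qm9.sh ./qm9_data/ qm9_scratch
```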
+Citation
+--------
 
+Please kindly cite our papers if you use the data/code/model.
 
+```
+@article{lu2025uni3dar,
+  author  = {Shuqi Lu and Haowei Lin and Lin Yao and Zhifeng Gao and Xiaohong Ji and Weinan E and Linfeng Zhang and Guolin Ke},
+  title   = {Uni-3DAR: Unified 3D Generation and Understanding via Autoregression on Compressed Spatial Tokens},
+  journal = {Arxiv},
+  year    = {2025},
+}
+```