YoonaAI committed on
Commit a80c3df · 1 Parent(s): fe07c08

Upload 9 files

lib/smplx/LICENSE.txt ADDED
@@ -0,0 +1,58 @@
+ License
+
+ Software Copyright License for non-commercial scientific research purposes
+ Please read carefully the following terms and conditions and any accompanying documentation before you download and/or use the SMPL-X/SMPLify-X model, data and software, (the "Model & Software"), including 3D meshes, blend weights, blend shapes, textures, software, scripts, and animations. By downloading and/or using the Model & Software (including downloading, cloning, installing, and any other use of this github repository), you acknowledge that you have read these terms and conditions, understand them, and agree to be bound by them. If you do not agree with these terms and conditions, you must not download and/or use the Model & Software. Any infringement of the terms of this agreement will automatically terminate your rights under this License
+
+ Ownership / Licensees
+ The Software and the associated materials has been developed at the
+
+ Max Planck Institute for Intelligent Systems (hereinafter "MPI").
+
+ Any copyright or patent right is owned by and proprietary material of the
+
+ Max-Planck-Gesellschaft zur Förderung der Wissenschaften e.V. (hereinafter “MPG”; MPI and MPG hereinafter collectively “Max-Planck”)
+
+ hereinafter the “Licensor”.
+
+ License Grant
+ Licensor grants you (Licensee) personally a single-user, non-exclusive, non-transferable, free of charge right:
+
+ To install the Model & Software on computers owned, leased or otherwise controlled by you and/or your organization;
+ To use the Model & Software for the sole purpose of performing non-commercial scientific research, non-commercial education, or non-commercial artistic projects;
+ Any other use, in particular any use for commercial purposes, is prohibited. This includes, without limitation, incorporation in a commercial product, use in a commercial service, or production of other artifacts for commercial purposes. The Model & Software may not be reproduced, modified and/or made available in any form to any third party without Max-Planck’s prior written permission.
+
+ The Model & Software may not be used for pornographic purposes or to generate pornographic material whether commercial or not. This license also prohibits the use of the Model & Software to train methods/algorithms/neural networks/etc. for commercial use of any kind. By downloading the Model & Software, you agree not to reverse engineer it.
+
+ No Distribution
+ The Model & Software and the license herein granted shall not be copied, shared, distributed, re-sold, offered for re-sale, transferred or sub-licensed in whole or in part except that you may make one copy for archive purposes only.
+
+ Disclaimer of Representations and Warranties
+ You expressly acknowledge and agree that the Model & Software results from basic research, is provided “AS IS”, may contain errors, and that any use of the Model & Software is at your sole risk. LICENSOR MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY KIND CONCERNING THE MODEL & SOFTWARE, NEITHER EXPRESS NOR IMPLIED, AND THE ABSENCE OF ANY LEGAL OR ACTUAL DEFECTS, WHETHER DISCOVERABLE OR NOT. Specifically, and not to limit the foregoing, licensor makes no representations or warranties (i) regarding the merchantability or fitness for a particular purpose of the Model & Software, (ii) that the use of the Model & Software will not infringe any patents, copyrights or other intellectual property rights of a third party, and (iii) that the use of the Model & Software will not cause any damage of any kind to you or a third party.
+
+ Limitation of Liability
+ Because this Model & Software License Agreement qualifies as a donation, according to Section 521 of the German Civil Code (Bürgerliches Gesetzbuch – BGB) Licensor as a donor is liable for intent and gross negligence only. If the Licensor fraudulently conceals a legal or material defect, they are obliged to compensate the Licensee for the resulting damage.
+ Licensor shall be liable for loss of data only up to the amount of typical recovery costs which would have arisen had proper and regular data backup measures been taken. For the avoidance of doubt Licensor shall be liable in accordance with the German Product Liability Act in the event of product liability. The foregoing applies also to Licensor’s legal representatives or assistants in performance. Any further liability shall be excluded.
+ Patent claims generated through the usage of the Model & Software cannot be directed towards the copyright holders.
+ The Model & Software is provided in the state of development the licensor defines. If modified or extended by Licensee, the Licensor makes no claims about the fitness of the Model & Software and is not responsible for any problems such modifications cause.
+
+ No Maintenance Services
+ You understand and agree that Licensor is under no obligation to provide either maintenance services, update services, notices of latent defects, or corrections of defects with regard to the Model & Software. Licensor nevertheless reserves the right to update, modify, or discontinue the Model & Software at any time.
+
+ Defects of the Model & Software must be notified in writing to the Licensor with a comprehensible description of the error symptoms. The notification of the defect should enable the reproduction of the error. The Licensee is encouraged to communicate any use, results, modification or publication.
+
+ Publications using the Model & Software
+ You acknowledge that the Model & Software is a valuable scientific resource and agree to appropriately reference the following paper in any publication making use of the Model & Software.
+
+ Citation:
+
+
+ @inproceedings{SMPL-X:2019,
+ title = {Expressive Body Capture: 3D Hands, Face, and Body from a Single Image},
+ author = {Pavlakos, Georgios and Choutas, Vasileios and Ghorbani, Nima and Bolkart, Timo and Osman, Ahmed A. A. and Tzionas, Dimitrios and Black, Michael J.},
+ booktitle = {Proceedings IEEE Conf. on Computer Vision and Pattern Recognition (CVPR)},
+ year = {2019}
+ }
+ Commercial licensing opportunities
+ For commercial uses of the Software, please send email to [email protected]
+
+ This Agreement shall be governed by the laws of the Federal Republic of Germany except for the UN Sales Convention.
lib/smplx/README.md ADDED
@@ -0,0 +1,207 @@
+ ## SMPL-X: A new joint 3D model of the human body, face and hands together
+
+ [[Paper Page](https://smpl-x.is.tue.mpg.de)] [[Paper](https://ps.is.tuebingen.mpg.de/uploads_file/attachment/attachment/497/SMPL-X.pdf)]
+ [[Supp. Mat.](https://ps.is.tuebingen.mpg.de/uploads_file/attachment/attachment/498/SMPL-X-supp.pdf)]
+
+ ![SMPL-X Examples](./images/teaser_fig.png)
+
+ ## Table of Contents
+ * [License](#license)
+ * [Description](#description)
+ * [News](#news)
+ * [Installation](#installation)
+ * [Downloading the model](#downloading-the-model)
+ * [Loading SMPL-X, SMPL+H and SMPL](#loading-smpl-x-smplh-and-smpl)
+ * [SMPL and SMPL+H setup](#smpl-and-smplh-setup)
+ * [Model loading](https://github.com/vchoutas/smplx#model-loading)
+ * [MANO and FLAME correspondences](#mano-and-flame-correspondences)
+ * [Example](#example)
+ * [Modifying the global pose of the model](#modifying-the-global-pose-of-the-model)
+ * [Citation](#citation)
+ * [Acknowledgments](#acknowledgments)
+ * [Contact](#contact)
+
+ ## License
+
+ Software Copyright License for **non-commercial scientific research purposes**.
+ Please read carefully the [terms and conditions](https://github.com/vchoutas/smplx/blob/master/LICENSE) and any accompanying documentation before you download and/or use the SMPL-X/SMPLify-X model, data and software, (the "Model & Software"), including 3D meshes, blend weights, blend shapes, textures, software, scripts, and animations. By downloading and/or using the Model & Software (including downloading, cloning, installing, and any other use of this GitHub repository), you acknowledge that you have read these terms and conditions, understand them, and agree to be bound by them. If you do not agree with these terms and conditions, you must not download and/or use the Model & Software. Any infringement of the terms of this agreement will automatically terminate your rights under this [License](./LICENSE).
+
+ ## Disclaimer
+
+ The original images used for figures 1 and 2 of the paper can be found in this link.
+ The images in the paper are used under license from gettyimages.com.
+ We have acquired the right to use them in the publication, but redistribution is not allowed.
+ Please follow the instructions on the given link to acquire the right of usage.
+ Our results are obtained on the 483 × 724 pixel resolution of the original images.
+
+ ## Description
+
+ *SMPL-X* (SMPL eXpressive) is a unified body model with shape parameters trained jointly for the
+ face, hands and body. *SMPL-X* uses standard vertex-based linear blend skinning with learned corrective blend
+ shapes, has N = 10,475 vertices and K = 54 joints,
+ which include joints for the neck, jaw, eyeballs and fingers.
+ SMPL-X is defined by a function M(θ, β, ψ), where θ are the pose parameters, β the shape parameters and
+ ψ the facial expression parameters.
+
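The dimensions above are easy to double-check. A minimal sketch of the parameter bookkeeping in plain Python (illustrative names only, not the smplx API; the full pose θ counts the K = 54 joints plus one global root rotation, three axis-angle values each):

```python
# SMPL-X parameter bookkeeping for M(theta, beta, psi).
NUM_JOINTS = 54       # K in the paper: body, neck, jaw, eyeballs, fingers
NUM_VERTICES = 10475  # N in the paper

def pose_dim(num_joints: int = NUM_JOINTS) -> int:
    """Length of the full pose vector theta: (K + 1) rotations x 3 axis-angle values."""
    return (num_joints + 1) * 3

# beta and psi are PCA coefficients; 10 of each is a common truncation
# (the released full shape and expression spaces are larger).
theta = [0.0] * pose_dim()
beta = [0.0] * 10
psi = [0.0] * 10
print(len(theta))  # 165
```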
+ ## News
+
+ - 3 November 2020: We release the code to transfer between the models in the
+   SMPL family. For more details on the code, go to this [readme
+   file](./transfer_model/README.md). A detailed explanation on how the mappings
+   were extracted can be found [here](./transfer_model/docs/transfer.md).
+ - 23 September 2020: A UV map is now available for SMPL-X, please check the
+   Downloads section of the website.
+ - 20 August 2020: The full shape and expression space of SMPL-X are now available.
+
+ ## Installation
+
+ To install the model, use either of the following options:
+ 1. To install from PyPI simply run:
+ ```Shell
+ pip install smplx[all]
+ ```
+ 2. Alternatively, clone this repository and install it using the *setup.py* script:
+ ```Shell
+ git clone https://github.com/vchoutas/smplx
+ python setup.py install
+ ```
+
+ ## Downloading the model
+
+ To download the *SMPL-X* model go to [this project website](https://smpl-x.is.tue.mpg.de) and register to get access to the downloads section.
+
+ To download the *SMPL+H* model go to [this project website](http://mano.is.tue.mpg.de) and register to get access to the downloads section.
+
+ To download the *SMPL* model go to [this](http://smpl.is.tue.mpg.de) (male and female models) and [this](http://smplify.is.tue.mpg.de) (gender neutral model) project website and register to get access to the downloads section.
+
+ ## Loading SMPL-X, SMPL+H and SMPL
+
+ ### SMPL and SMPL+H setup
+
+ The loader gives the option to use any of the SMPL-X, SMPL+H, SMPL, and MANO models. Depending on the model you want to use, please follow the respective download instructions. To switch between MANO, SMPL, SMPL+H and SMPL-X just change the *model_path* or *model_type* parameters. For more details please check the docs of the model classes.
+ Before using SMPL and SMPL+H you should follow the instructions in [tools/README.md](./tools/README.md) to remove the
+ Chumpy objects from both model pkls, as well as merge the MANO parameters with SMPL+H.
+
+ ### Model loading
+
+ You can either use the [create](https://github.com/vchoutas/smplx/blob/c63c02b478c5c6f696491ed9167e3af6b08d89b1/smplx/body_models.py#L54)
+ function from [body_models](./smplx/body_models.py) or directly call the constructor for the
+ [SMPL](https://github.com/vchoutas/smplx/blob/c63c02b478c5c6f696491ed9167e3af6b08d89b1/smplx/body_models.py#L106),
+ [SMPL+H](https://github.com/vchoutas/smplx/blob/c63c02b478c5c6f696491ed9167e3af6b08d89b1/smplx/body_models.py#L395) and
+ [SMPL-X](https://github.com/vchoutas/smplx/blob/c63c02b478c5c6f696491ed9167e3af6b08d89b1/smplx/body_models.py#L628) models. The path to the model can either be the path to the file with the parameters or a directory with the following structure:
+ ```bash
+ models
+ ├── smpl
+ │   ├── SMPL_FEMALE.pkl
+ │   ├── SMPL_MALE.pkl
+ │   └── SMPL_NEUTRAL.pkl
+ ├── smplh
+ │   ├── SMPLH_FEMALE.pkl
+ │   └── SMPLH_MALE.pkl
+ ├── mano
+ │   ├── MANO_RIGHT.pkl
+ │   └── MANO_LEFT.pkl
+ └── smplx
+     ├── SMPLX_FEMALE.npz
+     ├── SMPLX_FEMALE.pkl
+     ├── SMPLX_MALE.npz
+     ├── SMPLX_MALE.pkl
+     ├── SMPLX_NEUTRAL.npz
+     └── SMPLX_NEUTRAL.pkl
+ ```
+
+
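Given the layout above, the parameter file for a model type and gender can be resolved mechanically. A small sketch of that path logic (an illustration of the folder convention, not the actual smplx loader; preferring `.npz` for SMPL-X is an assumption, since both `.npz` and `.pkl` files are shipped):

```python
import os

# Sub-folder, filename prefix and extension per model type, following the
# directory tree shown above.
_LAYOUT = {
    'smpl': ('smpl', 'SMPL', 'pkl'),
    'smplh': ('smplh', 'SMPLH', 'pkl'),
    'smplx': ('smplx', 'SMPLX', 'npz'),  # assumption: prefer the .npz files
}

def model_file(models_dir: str, model_type: str, gender: str) -> str:
    """Build the expected parameter-file path, e.g. models/smplx/SMPLX_NEUTRAL.npz."""
    sub, prefix, ext = _LAYOUT[model_type]
    return os.path.join(models_dir, sub, f'{prefix}_{gender.upper()}.{ext}')

print(model_file('models', 'smplx', 'neutral'))
```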
+ ## MANO and FLAME correspondences
+
+ The vertex correspondences between SMPL-X and MANO/FLAME can be downloaded
+ from [the project website](https://smpl-x.is.tue.mpg.de). If you have extracted
+ the correspondence data in the folder *correspondences*, then use the following
+ scripts to visualize them:
+
+ 1. To view MANO correspondences run the following command:
+
+ ```
+ python examples/vis_mano_vertices.py --model-folder $SMPLX_FOLDER --corr-fname correspondences/MANO_SMPLX_vertex_ids.pkl
+ ```
+
+ 2. To view FLAME correspondences run the following command:
+
+ ```
+ python examples/vis_flame_vertices.py --model-folder $SMPLX_FOLDER --corr-fname correspondences/SMPL-X__FLAME_vertex_ids.npy
+ ```
+
+ ## Example
+
+ After installing the *smplx* package and downloading the model parameters you should be able to run the *demo.py*
+ script to visualize the results. For this step you have to install the [pyrender](https://pyrender.readthedocs.io/en/latest/index.html) and [trimesh](https://trimsh.org/) packages.
+
+ `python examples/demo.py --model-folder $SMPLX_FOLDER --plot-joints=True --gender="neutral"`
+
+ ![SMPL-X Examples](./images/example.png)
+
+ ## Modifying the global pose of the model
+
+ If you want to modify the global pose of the model, i.e. the root rotation and
+ translation, to a new coordinate system for example, you need to take into
+ account that the model rotation uses the pelvis as the center of rotation. A
+ more detailed description can be found in the following
+ [link](https://www.dropbox.com/scl/fi/zkatuv5shs8d4tlwr8ecc/Change-parameters-to-new-coordinate-system.paper?dl=0&rlkey=lotq1sh6wzkmyttisc05h0in0).
+ If something is not clear, please let me know so that I can update the
+ description.
+
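The pelvis-centred rotation mentioned above means a world-frame rotation R cannot simply be multiplied into the root orientation: the translation must be corrected too. A minimal numpy sketch, assuming a simplified forward model in which the root rotation acts about the pelvis (x = R_root (v − pelvis) + pelvis + transl); this mirrors the idea in the linked note but is not code from this repository:

```python
import numpy as np

def compose_global(R, global_orient, transl, pelvis):
    """Fold a world rotation R into pelvis-centred (global_orient, transl)."""
    new_orient = R @ global_orient                 # rotations compose directly
    new_transl = R @ (pelvis + transl) - pelvis    # correct for the pelvis offset
    return new_orient, new_transl

# Numerical check against the simplified forward model above.
rng = np.random.default_rng(0)
v, pelvis, transl = rng.normal(size=3), rng.normal(size=3), rng.normal(size=3)
R_root = np.eye(3)                                 # identity root rotation
R = np.array([[0., -1., 0.],                       # 90 degrees about z
              [1., 0., 0.],
              [0., 0., 1.]])

posed = R_root @ (v - pelvis) + pelvis + transl
new_R, new_t = compose_global(R, R_root, transl, pelvis)
assert np.allclose(R @ posed, new_R @ (v - pelvis) + pelvis + new_t)
```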
+ ## Citation
+
+ Depending on which model is loaded for your project, i.e. SMPL-X or SMPL+H or SMPL, please cite the most relevant work below, listed in the same order:
+
+ ```
+ @inproceedings{SMPL-X:2019,
+ title = {Expressive Body Capture: 3D Hands, Face, and Body from a Single Image},
+ author = {Pavlakos, Georgios and Choutas, Vasileios and Ghorbani, Nima and Bolkart, Timo and Osman, Ahmed A. A. and Tzionas, Dimitrios and Black, Michael J.},
+ booktitle = {Proceedings IEEE Conf. on Computer Vision and Pattern Recognition (CVPR)},
+ year = {2019}
+ }
+ ```
+
+ ```
+ @article{MANO:SIGGRAPHASIA:2017,
+ title = {Embodied Hands: Modeling and Capturing Hands and Bodies Together},
+ author = {Romero, Javier and Tzionas, Dimitrios and Black, Michael J.},
+ journal = {ACM Transactions on Graphics, (Proc. SIGGRAPH Asia)},
+ volume = {36},
+ number = {6},
+ series = {245:1--245:17},
+ month = nov,
+ year = {2017},
+ month_numeric = {11}
+ }
+ ```
+
+ ```
+ @article{SMPL:2015,
+ author = {Loper, Matthew and Mahmood, Naureen and Romero, Javier and Pons-Moll, Gerard and Black, Michael J.},
+ title = {{SMPL}: A Skinned Multi-Person Linear Model},
+ journal = {ACM Transactions on Graphics, (Proc. SIGGRAPH Asia)},
+ month = oct,
+ number = {6},
+ pages = {248:1--248:16},
+ publisher = {ACM},
+ volume = {34},
+ year = {2015}
+ }
+ ```
+
+ This repository was originally developed for SMPL-X / SMPLify-X (CVPR 2019); you might be interested in having a look: [https://smpl-x.is.tue.mpg.de](https://smpl-x.is.tue.mpg.de).
+
+ ## Acknowledgments
+
+ ### Facial Contour
+
+ Special thanks to [Soubhik Sanyal](https://github.com/soubhiksanyal) for sharing the Tensorflow code used for the facial
+ landmarks.
+
+ ## Contact
+ The code of this repository was implemented by [Vassilis Choutas]([email protected]).
+
+ For questions, please contact [[email protected]]([email protected]).
+
+ For commercial licensing (and all related questions for business applications), please contact [[email protected]]([email protected]).
lib/smplx/body_models.py ADDED
The diff for this file is too large to render. See raw diff
 
lib/smplx/gitignore.txt ADDED
@@ -0,0 +1,114 @@
+ #### joe made this: http://goel.io/joe
+
+ #####=== Python ===#####
+
+ # Byte-compiled / optimized / DLL files
+ __pycache__/
+ *.py[cod]
+ *$py.class
+
+ # C extensions
+ *.so
+
+ # Distribution / packaging
+ .Python
+ build/
+ develop-eggs/
+ dist/
+ downloads/
+ eggs/
+ .eggs/
+ lib/
+ lib64/
+ parts/
+ sdist/
+ var/
+ wheels/
+ *.egg-info/
+ .installed.cfg
+ *.egg
+ MANIFEST
+
+ # PyInstaller
+ # Usually these files are written by a python script from a template
+ # before PyInstaller builds the exe, so as to inject date/other infos into it.
+ *.manifest
+ *.spec
+
+ # Installer logs
+ pip-log.txt
+ pip-delete-this-directory.txt
+
+ # Unit test / coverage reports
+ htmlcov/
+ .tox/
+ .coverage
+ .coverage.*
+ .cache
+ nosetests.xml
+ coverage.xml
+ *.cover
+ .hypothesis/
+ .pytest_cache/
+
+ # Translations
+ *.mo
+ *.pot
+
+ # Django stuff:
+ *.log
+ local_settings.py
+ db.sqlite3
+
+ # Flask stuff:
+ instance/
+ .webassets-cache
+
+ # Scrapy stuff:
+ .scrapy
+
+ # Sphinx documentation
+ docs/_build/
+
+ # PyBuilder
+ target/
+
+ # Jupyter Notebook
+ .ipynb_checkpoints
+
+ # pyenv
+ .python-version
+
+ # celery beat schedule file
+ celerybeat-schedule
+
+ # SageMath parsed files
+ *.sage.py
+
+ # Environments
+ .env
+ .venv
+ env/
+ venv/
+ ENV/
+ env.bak/
+ venv.bak/
+
+ # Spyder project settings
+ .spyderproject
+ .spyproject
+
+ # Rope project settings
+ .ropeproject
+
+ # mkdocs documentation
+ /site
+
+ # mypy
+ .mypy_cache/
+ models/
+ output/
+ outputs/
+ transfer_data/
+ torch-trust-ncg/
+ build/
lib/smplx/joint_names.py ADDED
@@ -0,0 +1,163 @@
+ # -*- coding: utf-8 -*-
+
+ # Max-Planck-Gesellschaft zur Förderung der Wissenschaften e.V. (MPG) is
+ # holder of all proprietary rights on this computer program.
+ # You can only use this computer program if you have closed
+ # a license agreement with MPG or you get the right to use the computer
+ # program from someone who is authorized to grant you that right.
+ # Any use of the computer program without a valid license is prohibited and
+ # liable to prosecution.
+ #
+ # Copyright©2019 Max-Planck-Gesellschaft zur Förderung
+ # der Wissenschaften e.V. (MPG). acting on behalf of its Max Planck Institute
+ # for Intelligent Systems. All rights reserved.
+ #
+ # Contact: [email protected]
+
+ JOINT_NAMES = [
+     'pelvis',
+     'left_hip',
+     'right_hip',
+     'spine1',
+     'left_knee',
+     'right_knee',
+     'spine2',
+     'left_ankle',
+     'right_ankle',
+     'spine3',
+     'left_foot',
+     'right_foot',
+     'neck',
+     'left_collar',
+     'right_collar',
+     'head',
+     'left_shoulder',
+     'right_shoulder',
+     'left_elbow',
+     'right_elbow',
+     'left_wrist',
+     'right_wrist',
+     'jaw',
+     'left_eye_smplhf',
+     'right_eye_smplhf',
+     'left_index1',
+     'left_index2',
+     'left_index3',
+     'left_middle1',
+     'left_middle2',
+     'left_middle3',
+     'left_pinky1',
+     'left_pinky2',
+     'left_pinky3',
+     'left_ring1',
+     'left_ring2',
+     'left_ring3',
+     'left_thumb1',
+     'left_thumb2',
+     'left_thumb3',
+     'right_index1',
+     'right_index2',
+     'right_index3',
+     'right_middle1',
+     'right_middle2',
+     'right_middle3',
+     'right_pinky1',
+     'right_pinky2',
+     'right_pinky3',
+     'right_ring1',
+     'right_ring2',
+     'right_ring3',
+     'right_thumb1',
+     'right_thumb2',
+     'right_thumb3',
+     'nose',
+     'right_eye',
+     'left_eye',
+     'right_ear',
+     'left_ear',
+     'left_big_toe',
+     'left_small_toe',
+     'left_heel',
+     'right_big_toe',
+     'right_small_toe',
+     'right_heel',
+     'left_thumb',
+     'left_index',
+     'left_middle',
+     'left_ring',
+     'left_pinky',
+     'right_thumb',
+     'right_index',
+     'right_middle',
+     'right_ring',
+     'right_pinky',
+     'right_eye_brow1',
+     'right_eye_brow2',
+     'right_eye_brow3',
+     'right_eye_brow4',
+     'right_eye_brow5',
+     'left_eye_brow5',
+     'left_eye_brow4',
+     'left_eye_brow3',
+     'left_eye_brow2',
+     'left_eye_brow1',
+     'nose1',
+     'nose2',
+     'nose3',
+     'nose4',
+     'right_nose_2',
+     'right_nose_1',
+     'nose_middle',
+     'left_nose_1',
+     'left_nose_2',
+     'right_eye1',
+     'right_eye2',
+     'right_eye3',
+     'right_eye4',
+     'right_eye5',
+     'right_eye6',
+     'left_eye4',
+     'left_eye3',
+     'left_eye2',
+     'left_eye1',
+     'left_eye6',
+     'left_eye5',
+     'right_mouth_1',
+     'right_mouth_2',
+     'right_mouth_3',
+     'mouth_top',
+     'left_mouth_3',
+     'left_mouth_2',
+     'left_mouth_1',
+     'left_mouth_5',  # 59 in OpenPose output
+     'left_mouth_4',  # 58 in OpenPose output
+     'mouth_bottom',
+     'right_mouth_4',
+     'right_mouth_5',
+     'right_lip_1',
+     'right_lip_2',
+     'lip_top',
+     'left_lip_2',
+     'left_lip_1',
+     'left_lip_3',
+     'lip_bottom',
+     'right_lip_3',
+     # Face contour
+     'right_contour_1',
+     'right_contour_2',
+     'right_contour_3',
+     'right_contour_4',
+     'right_contour_5',
+     'right_contour_6',
+     'right_contour_7',
+     'right_contour_8',
+     'contour_middle',
+     'left_contour_8',
+     'left_contour_7',
+     'left_contour_6',
+     'left_contour_5',
+     'left_contour_4',
+     'left_contour_3',
+     'left_contour_2',
+     'left_contour_1',
+ ]
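Consumers of a list like this usually build a name-to-index lookup so joints can be addressed by name rather than position. A quick sketch (using only a short excerpt of the full JOINT_NAMES list above):

```python
# Excerpt of the list above; indices follow its ordering.
JOINT_NAMES = ['pelvis', 'left_hip', 'right_hip', 'spine1', 'left_knee']

# name -> index lookup, e.g. for slicing a joints array by name.
JOINT_IDS = {name: i for i, name in enumerate(JOINT_NAMES)}

print(JOINT_IDS['right_hip'])  # 2
```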
lib/smplx/lbs.py ADDED
@@ -0,0 +1,405 @@
1
+ # -*- coding: utf-8 -*-
2
+
3
+ # Max-Planck-Gesellschaft zur Förderung der Wissenschaften e.V. (MPG) is
4
+ # holder of all proprietary rights on this computer program.
5
+ # You can only use this computer program if you have closed
6
+ # a license agreement with MPG or you get the right to use the computer
7
+ # program from someone who is authorized to grant you that right.
8
+ # Any use of the computer program without a valid license is prohibited and
9
+ # liable to prosecution.
10
+ #
11
+ # Copyright©2019 Max-Planck-Gesellschaft zur Förderung
12
+ # der Wissenschaften e.V. (MPG). acting on behalf of its Max Planck Institute
13
+ # for Intelligent Systems. All rights reserved.
14
+ #
15
+ # Contact: [email protected]
16
+
17
+ from __future__ import absolute_import
18
+ from __future__ import print_function
19
+ from __future__ import division
20
+
21
+ from typing import Tuple, List, Optional
22
+ import numpy as np
23
+
24
+ import torch
25
+ import torch.nn.functional as F
26
+
27
+ from .utils import rot_mat_to_euler, Tensor
28
+
29
+
30
+ def find_dynamic_lmk_idx_and_bcoords(
31
+ vertices: Tensor,
32
+ pose: Tensor,
33
+ dynamic_lmk_faces_idx: Tensor,
34
+ dynamic_lmk_b_coords: Tensor,
35
+ neck_kin_chain: List[int],
36
+ pose2rot: bool = True,
37
+ ) -> Tuple[Tensor, Tensor]:
38
+ ''' Compute the faces, barycentric coordinates for the dynamic landmarks
39
+
40
+
41
+ To do so, we first compute the rotation of the neck around the y-axis
42
+ and then use a pre-computed look-up table to find the faces and the
43
+ barycentric coordinates that will be used.
44
+
45
+ Special thanks to Soubhik Sanyal ([email protected])
46
+ for providing the original TensorFlow implementation and for the LUT.
47
+
48
+ Parameters
49
+ ----------
50
+ vertices: torch.tensor BxVx3, dtype = torch.float32
51
+ The tensor of input vertices
52
+ pose: torch.tensor Bx(Jx3), dtype = torch.float32
53
+ The current pose of the body model
54
+ dynamic_lmk_faces_idx: torch.tensor L, dtype = torch.long
55
+ The look-up table from neck rotation to faces
56
+ dynamic_lmk_b_coords: torch.tensor Lx3, dtype = torch.float32
57
+ The look-up table from neck rotation to barycentric coordinates
58
+ neck_kin_chain: list
59
+ A python list that contains the indices of the joints that form the
60
+ kinematic chain of the neck.
61
+ dtype: torch.dtype, optional
62
+
63
+ Returns
64
+ -------
65
+ dyn_lmk_faces_idx: torch.tensor, dtype = torch.long
66
+ A tensor of size BxL that contains the indices of the faces that
67
+ will be used to compute the current dynamic landmarks.
68
+ dyn_lmk_b_coords: torch.tensor, dtype = torch.float32
69
+ A tensor of size BxL that contains the indices of the faces that
70
+ will be used to compute the current dynamic landmarks.
71
+ '''
72
+
73
+ dtype = vertices.dtype
74
+ batch_size = vertices.shape[0]
75
+
76
+ if pose2rot:
77
+ aa_pose = torch.index_select(pose.view(batch_size, -1, 3), 1,
78
+ neck_kin_chain)
79
+ rot_mats = batch_rodrigues(
80
+ aa_pose.view(-1, 3)).view(batch_size, -1, 3, 3)
81
+ else:
82
+ rot_mats = torch.index_select(
83
+ pose.view(batch_size, -1, 3, 3), 1, neck_kin_chain)
84
+
85
+ rel_rot_mat = torch.eye(
86
+ 3, device=vertices.device, dtype=dtype).unsqueeze_(dim=0).repeat(
87
+ batch_size, 1, 1)
88
+ for idx in range(len(neck_kin_chain)):
89
+ rel_rot_mat = torch.bmm(rot_mats[:, idx], rel_rot_mat)
90
+
91
+ y_rot_angle = torch.round(
92
+ torch.clamp(-rot_mat_to_euler(rel_rot_mat) * 180.0 / np.pi,
93
+ max=39)).to(dtype=torch.long)
94
+ neg_mask = y_rot_angle.lt(0).to(dtype=torch.long)
95
+ mask = y_rot_angle.lt(-39).to(dtype=torch.long)
96
+ neg_vals = mask * 78 + (1 - mask) * (39 - y_rot_angle)
97
+ y_rot_angle = (neg_mask * neg_vals +
98
+ (1 - neg_mask) * y_rot_angle)
99
+
100
+ dyn_lmk_faces_idx = torch.index_select(dynamic_lmk_faces_idx,
101
+ 0, y_rot_angle)
102
+ dyn_lmk_b_coords = torch.index_select(dynamic_lmk_b_coords,
103
+ 0, y_rot_angle)
104
+
105
+ return dyn_lmk_faces_idx, dyn_lmk_b_coords
106
+
107
+
108
+ def vertices2landmarks(
109
+ vertices: Tensor,
110
+ faces: Tensor,
111
+ lmk_faces_idx: Tensor,
112
+ lmk_bary_coords: Tensor
113
+ ) -> Tensor:
114
+ ''' Calculates landmarks by barycentric interpolation
115
+
116
+ Parameters
117
+ ----------
118
+ vertices: torch.tensor BxVx3, dtype = torch.float32
119
+ The tensor of input vertices
120
+ faces: torch.tensor Fx3, dtype = torch.long
121
+ The faces of the mesh
122
+ lmk_faces_idx: torch.tensor L, dtype = torch.long
123
+ The tensor with the indices of the faces used to calculate the
124
+ landmarks.
125
+ lmk_bary_coords: torch.tensor Lx3, dtype = torch.float32
126
+ The tensor of barycentric coordinates that are used to interpolate
127
+ the landmarks
128
+
129
+ Returns
130
+ -------
131
+ landmarks: torch.tensor BxLx3, dtype = torch.float32
132
+ The coordinates of the landmarks for each mesh in the batch
133
+ '''
134
+ # Extract the indices of the vertices for each face
135
+ # BxLx3
136
+ batch_size, num_verts = vertices.shape[:2]
137
+ device = vertices.device
138
+
139
+ lmk_faces = torch.index_select(faces, 0, lmk_faces_idx.view(-1)).view(
140
+ batch_size, -1, 3)
141
+
142
+ lmk_faces += torch.arange(
143
+ batch_size, dtype=torch.long, device=device).view(-1, 1, 1) * num_verts
144
+
145
+ lmk_vertices = vertices.view(-1, 3)[lmk_faces].view(
146
+ batch_size, -1, 3, 3)
147
+
148
+ landmarks = torch.einsum('blfi,blf->bli', [lmk_vertices, lmk_bary_coords])
149
+ return landmarks
150
+
151
+
152
+ def lbs(
+     betas: Tensor,
+     pose: Tensor,
+     v_template: Tensor,
+     shapedirs: Tensor,
+     posedirs: Tensor,
+     J_regressor: Tensor,
+     parents: Tensor,
+     lbs_weights: Tensor,
+     pose2rot: bool = True,
+     return_transformation: bool = False,
+ ) -> Tuple[Tensor, Tensor, Optional[Tensor], Optional[Tensor]]:
+     ''' Performs Linear Blend Skinning with the given shape and pose parameters
+
+     Parameters
+     ----------
+     betas : torch.tensor BxNB
+         The tensor of shape parameters
+     pose : torch.tensor Bx(J + 1) * 3
+         The pose parameters in axis-angle format
+     v_template : torch.tensor BxVx3
+         The template mesh that will be deformed
+     shapedirs : torch.tensor Vx3xNB
+         The tensor of PCA shape displacements
+     posedirs : torch.tensor Px(V * 3)
+         The pose blend shape basis
+     J_regressor : torch.tensor JxV
+         The regressor array that is used to calculate the joints from
+         the position of the vertices
+     parents : torch.tensor J
+         The array that describes the kinematic tree for the model
+     lbs_weights : torch.tensor N x V x (J + 1)
+         The linear blend skinning weights that represent how much the
+         rotation matrix of each part affects each vertex
+     pose2rot : bool, optional
+         Flag on whether to convert the input pose tensor to rotation
+         matrices. The default value is True. If False, then the pose tensor
+         should already contain rotation matrices and have a size of
+         Bx(J + 1)x9
+     return_transformation : bool, optional
+         If True, also return the per-joint and per-vertex rigid
+         transformations. The default value is False.
+
+     Returns
+     -------
+     verts : torch.tensor BxVx3
+         The vertices of the mesh after applying the shape and pose
+         displacements.
+     joints : torch.tensor BxJx3
+         The joints of the model
+     '''
+
+     batch_size = max(betas.shape[0], pose.shape[0])
+     device, dtype = betas.device, betas.dtype
+
+     # Add shape contribution
+     v_shaped = v_template + blend_shapes(betas, shapedirs)
+
+     # Get the joints
+     # NxJx3 array
+     J = vertices2joints(J_regressor, v_shaped)
+
+     # 3. Add pose blend shapes
+     # N x J x 3 x 3
+     ident = torch.eye(3, dtype=dtype, device=device)
+     if pose2rot:
+         rot_mats = batch_rodrigues(pose.view(-1, 3)).view(
+             [batch_size, -1, 3, 3])
+
+         pose_feature = (rot_mats[:, 1:, :, :] - ident).view([batch_size, -1])
+         # (N x P) x (P x (V * 3)) -> N x V x 3
+         pose_offsets = torch.matmul(
+             pose_feature, posedirs).view(batch_size, -1, 3)
+     else:
+         pose_feature = pose[:, 1:].view(batch_size, -1, 3, 3) - ident
+         rot_mats = pose.view(batch_size, -1, 3, 3)
+
+         pose_offsets = torch.matmul(pose_feature.view(batch_size, -1),
+                                     posedirs).view(batch_size, -1, 3)
+
+     v_posed = pose_offsets + v_shaped
+     # 4. Get the global joint location
+     J_transformed, A = batch_rigid_transform(rot_mats, J, parents, dtype=dtype)
+
+     # 5. Do skinning:
+     # W is N x V x (J + 1)
+     W = lbs_weights.unsqueeze(dim=0).expand([batch_size, -1, -1])
+     # (N x V x (J + 1)) x (N x (J + 1) x 16)
+     num_joints = J_regressor.shape[0]
+     T = torch.matmul(W, A.view(batch_size, num_joints, 16)) \
+         .view(batch_size, -1, 4, 4)
+
+     homogen_coord = torch.ones([batch_size, v_posed.shape[1], 1],
+                                dtype=dtype, device=device)
+     v_posed_homo = torch.cat([v_posed, homogen_coord], dim=2)
+     v_homo = torch.matmul(T, torch.unsqueeze(v_posed_homo, dim=-1))
+
+     verts = v_homo[:, :, :3, 0]
+
+     if return_transformation:
+         return verts, J_transformed, A, T
+
+     return verts, J_transformed
+
+
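The skinning step at the end of `lbs` blends per-joint 4x4 transforms with the skinning weights, then applies the blended transform to homogeneous vertex coordinates. A toy sketch with two joints and two vertices (illustrative data, not SMPL's):

```python
import torch

B, V, J = 1, 2, 2  # batch, vertices, joints

# Two per-joint transforms: identity, and a +1 translation along x.
A = torch.eye(4).repeat(B, J, 1, 1)
A[0, 1, 0, 3] = 1.0

# Vertex 0 follows joint 0 only; vertex 1 follows joint 1 only.
W = torch.tensor([[[1.0, 0.0], [0.0, 1.0]]])  # B x V x J

# Blend the flattened transforms, exactly as in lbs().
T = torch.matmul(W, A.view(B, J, 16)).view(B, V, 4, 4)

v_posed = torch.zeros(B, V, 3)
v_homo = torch.cat([v_posed, torch.ones(B, V, 1)], dim=2)
verts = torch.matmul(T, v_homo.unsqueeze(-1))[:, :, :3, 0]
# Vertex 0 stays at the origin; vertex 1 is translated to (1, 0, 0).
```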
+ def vertices2joints(J_regressor: Tensor, vertices: Tensor) -> Tensor:
+     ''' Calculates the 3D joint locations from the vertices
+
+     Parameters
+     ----------
+     J_regressor : torch.tensor JxV
+         The regressor array that is used to calculate the joints from the
+         position of the vertices
+     vertices : torch.tensor BxVx3
+         The tensor of mesh vertices
+
+     Returns
+     -------
+     torch.tensor BxJx3
+         The location of the joints
+     '''
+
+     return torch.einsum('bik,ji->bjk', [vertices, J_regressor])
+
+
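Each joint is a weighted combination of mesh vertices, with the weights given by a row of the regressor. A toy example with two vertices and a single joint regressed as their midpoint:

```python
import torch

# Two vertices; one joint regressed as their midpoint (toy weights).
vertices = torch.tensor([[[0.0, 0.0, 0.0],
                          [2.0, 0.0, 0.0]]])   # B x V x 3
J_regressor = torch.tensor([[0.5, 0.5]])       # J x V

joints = torch.einsum('bik,ji->bjk', [vertices, J_regressor])
# -> tensor([[[1., 0., 0.]]]): the joint sits halfway between the vertices.
```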
+ def blend_shapes(betas: Tensor, shape_disps: Tensor) -> Tensor:
+     ''' Calculates the per-vertex displacement due to the blend shapes
+
+     Parameters
+     ----------
+     betas : torch.tensor Bx(num_betas)
+         Blend shape coefficients
+     shape_disps : torch.tensor Vx3x(num_betas)
+         Blend shapes
+
+     Returns
+     -------
+     torch.tensor BxVx3
+         The per-vertex displacement due to shape deformation
+     '''
+
+     # Displacement[b, m, k] = sum_{l} betas[b, l] * shape_disps[m, k, l]
+     # i.e. Multiply each shape displacement by its corresponding beta and
+     # then sum them.
+     blend_shape = torch.einsum('bl,mkl->bmk', [betas, shape_disps])
+     return blend_shape
+
+
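The displacement is a linear combination of the blend shapes, weighted by the betas. A toy check with one vertex and two blend shapes:

```python
import torch

# One vertex, two blend shapes: one displaces along x, one along y.
shape_disps = torch.zeros(1, 3, 2)   # V x 3 x num_betas
shape_disps[0, 0, 0] = 1.0           # beta 0 moves the vertex along x
shape_disps[0, 1, 1] = 1.0           # beta 1 moves it along y

betas = torch.tensor([[2.0, -1.0]])  # B x num_betas

offsets = torch.einsum('bl,mkl->bmk', [betas, shape_disps])
# -> tensor([[[ 2., -1., 0.]]]): the weighted sum of the two shapes.
```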
+ def batch_rodrigues(
+     rot_vecs: Tensor,
+     epsilon: float = 1e-8,
+ ) -> Tensor:
+     ''' Calculates the rotation matrices for a batch of rotation vectors
+
+     Parameters
+     ----------
+     rot_vecs : torch.tensor Nx3
+         array of N axis-angle vectors
+
+     Returns
+     -------
+     R : torch.tensor Nx3x3
+         The rotation matrices for the given axis-angle parameters
+     '''
+
+     batch_size = rot_vecs.shape[0]
+     device, dtype = rot_vecs.device, rot_vecs.dtype
+
+     # epsilon guards against division by zero for near-zero rotations
+     angle = torch.norm(rot_vecs + epsilon, dim=1, keepdim=True)
+     rot_dir = rot_vecs / angle
+
+     cos = torch.unsqueeze(torch.cos(angle), dim=1)
+     sin = torch.unsqueeze(torch.sin(angle), dim=1)
+
+     # Bx1 arrays
+     rx, ry, rz = torch.split(rot_dir, 1, dim=1)
+     zeros = torch.zeros((batch_size, 1), dtype=dtype, device=device)
+     # Cross-product (skew-symmetric) matrix of the rotation axis
+     K = torch.cat([zeros, -rz, ry, rz, zeros, -rx, -ry, rx, zeros], dim=1) \
+         .view((batch_size, 3, 3))
+
+     ident = torch.eye(3, dtype=dtype, device=device).unsqueeze(dim=0)
+     # Rodrigues' formula: R = I + sin(angle) K + (1 - cos(angle)) K^2
+     rot_mat = ident + sin * K + (1 - cos) * torch.bmm(K, K)
+     return rot_mat
+
+
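A quick sanity check of Rodrigues' formula, written out inline with the same terms as the function above: a rotation of pi/2 about the z-axis should map the x-axis onto the y-axis.

```python
import math
import torch

# Skew-symmetric (cross-product) matrix of the z-axis.
K = torch.tensor([[[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 0.0]]])

theta = math.pi / 2
# Rodrigues' formula, as in batch_rodrigues: R = I + sin K + (1 - cos) K^2
R = torch.eye(3) + math.sin(theta) * K + (1 - math.cos(theta)) * torch.bmm(K, K)

x = torch.tensor([1.0, 0.0, 0.0])
y = R[0] @ x
# The x-axis maps to the y-axis: y is approximately (0, 1, 0).
```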
+ def transform_mat(R: Tensor, t: Tensor) -> Tensor:
+     ''' Creates a batch of transformation matrices
+
+     Args:
+         - R: Bx3x3 array of a batch of rotation matrices
+         - t: Bx3x1 array of a batch of translation vectors
+     Returns:
+         - T: Bx4x4 transformation matrices
+     '''
+     # No padding left or right, only add an extra row
+     return torch.cat([F.pad(R, [0, 0, 0, 1]),
+                       F.pad(t, [0, 0, 0, 1], value=1)], dim=2)
+
+
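`F.pad` with the list `[0, 0, 0, 1]` pads the last dimension by (0, 0) and the second-to-last by (0 above, 1 below), so `R` gains a zero bottom row and `t` gains a 1 below it; concatenating along the column dimension yields the homogeneous matrix. A toy check:

```python
import torch
import torch.nn.functional as F

R = torch.eye(3).unsqueeze(0)              # B x 3 x 3
t = torch.tensor([[[1.0], [2.0], [3.0]]])  # B x 3 x 1

# Pad R with a zero row below, pad t with a 1 below, then stack columns.
T = torch.cat([F.pad(R, [0, 0, 0, 1]),
               F.pad(t, [0, 0, 0, 1], value=1)], dim=2)
# T[0] is [[1,0,0,1], [0,1,0,2], [0,0,1,3], [0,0,0,1]].
```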
+ def batch_rigid_transform(
+     rot_mats: Tensor,
+     joints: Tensor,
+     parents: Tensor,
+     dtype=torch.float32
+ ) -> Tuple[Tensor, Tensor]:
+     """
+     Applies a batch of rigid transformations to the joints
+
+     Parameters
+     ----------
+     rot_mats : torch.tensor BxNx3x3
+         Tensor of rotation matrices
+     joints : torch.tensor BxNx3
+         Locations of joints
+     parents : torch.tensor N
+         The kinematic tree of each object
+     dtype : torch.dtype, optional
+         The data type of the created tensors, the default is torch.float32
+
+     Returns
+     -------
+     posed_joints : torch.tensor BxNx3
+         The locations of the joints after applying the pose rotations
+     rel_transforms : torch.tensor BxNx4x4
+         The relative (with respect to the root joint) rigid transformations
+         for all the joints
+     """
+
+     joints = torch.unsqueeze(joints, dim=-1)
+
+     rel_joints = joints.clone()
+     rel_joints[:, 1:] -= joints[:, parents[1:]]
+
+     transforms_mat = transform_mat(
+         rot_mats.reshape(-1, 3, 3),
+         rel_joints.reshape(-1, 3, 1)).reshape(-1, joints.shape[1], 4, 4)
+
+     transform_chain = [transforms_mat[:, 0]]
+     for i in range(1, parents.shape[0]):
+         # Subtract the joint location at the rest pose
+         # No need for rotation, since it's identity when at rest
+         curr_res = torch.matmul(transform_chain[parents[i]],
+                                 transforms_mat[:, i])
+         transform_chain.append(curr_res)
+
+     transforms = torch.stack(transform_chain, dim=1)
+
+     # The last column of the transformations contains the posed joints
+     posed_joints = transforms[:, :, :3, 3]
+
+     joints_homogen = F.pad(joints, [0, 0, 0, 1])
+
+     rel_transforms = transforms - F.pad(
+         torch.matmul(transforms, joints_homogen), [3, 0, 0, 0, 0, 0, 0, 0])
+
+     return posed_joints, rel_transforms
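The chain composition above is ordinary forward kinematics: each joint's global transform is its parent's global transform times its local one. A toy two-joint sketch (hand-built 4x4 matrices, not the batched code path): rotating the root by pi/2 about z should carry a child resting at (1, 0, 0) to (0, 1, 0).

```python
import torch

def make_T(R, t):
    # Assemble a single homogeneous 4x4 transform from R and t.
    T = torch.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Rotation of pi/2 about the z-axis (exact entries).
Rz = torch.tensor([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])

# Local transforms: root rotation, then the child's rest offset.
T_root = make_T(Rz, torch.zeros(3))
T_child = make_T(torch.eye(3), torch.tensor([1.0, 0.0, 0.0]))

# Compose along the kinematic chain, as batch_rigid_transform does.
T_global = T_root @ T_child
posed_child = T_global[:3, 3]
# -> tensor([0., 1., 0.])
```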
lib/smplx/utils.py ADDED
@@ -0,0 +1,127 @@
+ # -*- coding: utf-8 -*-
+
+ # Max-Planck-Gesellschaft zur Förderung der Wissenschaften e.V. (MPG) is
+ # holder of all proprietary rights on this computer program.
+ # You can only use this computer program if you have closed
+ # a license agreement with MPG or you get the right to use the computer
+ # program from someone who is authorized to grant you that right.
+ # Any use of the computer program without a valid license is prohibited and
+ # liable to prosecution.
+ #
+ # Copyright©2019 Max-Planck-Gesellschaft zur Förderung
+ # der Wissenschaften e.V. (MPG). acting on behalf of its Max Planck Institute
+ # for Intelligent Systems. All rights reserved.
+ #
+ # Contact: [email protected]
+
+ from typing import NewType, Union, Optional
+ from dataclasses import dataclass, asdict, fields
+ import numpy as np
+ import torch
+
+ Tensor = NewType('Tensor', torch.Tensor)
+ Array = NewType('Array', np.ndarray)
+
+
+ @dataclass
+ class ModelOutput:
+     vertices: Optional[Tensor] = None
+     joints: Optional[Tensor] = None
+     full_pose: Optional[Tensor] = None
+     global_orient: Optional[Tensor] = None
+     transl: Optional[Tensor] = None
+
+     def __getitem__(self, key):
+         return getattr(self, key)
+
+     def get(self, key, default=None):
+         return getattr(self, key, default)
+
+     def __iter__(self):
+         return self.keys()
+
+     def keys(self):
+         keys = [t.name for t in fields(self)]
+         return iter(keys)
+
+     def values(self):
+         values = [getattr(self, t.name) for t in fields(self)]
+         return iter(values)
+
+     def items(self):
+         data = [(t.name, getattr(self, t.name)) for t in fields(self)]
+         return iter(data)
+
+
+ @dataclass
+ class SMPLOutput(ModelOutput):
+     betas: Optional[Tensor] = None
+     body_pose: Optional[Tensor] = None
+
+
+ @dataclass
+ class SMPLHOutput(SMPLOutput):
+     left_hand_pose: Optional[Tensor] = None
+     right_hand_pose: Optional[Tensor] = None
+     transl: Optional[Tensor] = None
+
+
+ @dataclass
+ class SMPLXOutput(SMPLHOutput):
+     expression: Optional[Tensor] = None
+     jaw_pose: Optional[Tensor] = None
+     joint_transformation: Optional[Tensor] = None
+     vertex_transformation: Optional[Tensor] = None
+
+
+ @dataclass
+ class MANOOutput(ModelOutput):
+     betas: Optional[Tensor] = None
+     hand_pose: Optional[Tensor] = None
+
+
+ @dataclass
+ class FLAMEOutput(ModelOutput):
+     betas: Optional[Tensor] = None
+     expression: Optional[Tensor] = None
+     jaw_pose: Optional[Tensor] = None
+     neck_pose: Optional[Tensor] = None
+
+
+ def find_joint_kin_chain(joint_id, kinematic_tree):
+     kin_chain = []
+     curr_idx = joint_id
+     while curr_idx != -1:
+         kin_chain.append(curr_idx)
+         curr_idx = kinematic_tree[curr_idx]
+     return kin_chain
+
+
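`find_joint_kin_chain` walks from a joint up to the root by following parent indices (the root's parent is -1). A minimal standalone walk with toy data, mirroring the function body:

```python
# kinematic_tree[i] is the parent of joint i; the root's parent is -1.
kinematic_tree = [-1, 0, 1, 2]  # a simple 4-joint chain (toy data)

joint_id = 3
kin_chain = []
curr_idx = joint_id
while curr_idx != -1:
    kin_chain.append(curr_idx)
    curr_idx = kinematic_tree[curr_idx]
# kin_chain == [3, 2, 1, 0]: the path from the joint up to the root.
```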
+ def to_tensor(
+     array: Union[Array, Tensor], dtype=torch.float32
+ ) -> Tensor:
+     if torch.is_tensor(array):
+         return array
+     else:
+         return torch.tensor(array, dtype=dtype)
+
+
+ class Struct(object):
+     def __init__(self, **kwargs):
+         for key, val in kwargs.items():
+             setattr(self, key, val)
+
+
+ def to_np(array, dtype=np.float32):
+     if 'scipy.sparse' in str(type(array)):
+         array = array.todense()
+     return np.array(array, dtype=dtype)
+
+
+ def rot_mat_to_euler(rot_mats):
+     # Extracts the Euler angle about the y-axis from rotation matrices.
+     # Careful with extreme cases of Euler angles like [0.0, pi, 0.0]
+     sy = torch.sqrt(rot_mats[:, 0, 0] * rot_mats[:, 0, 0] +
+                     rot_mats[:, 1, 0] * rot_mats[:, 1, 0])
+     return torch.atan2(-rot_mats[:, 2, 0], sy)
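For a pure rotation about the y-axis, the formula in `rot_mat_to_euler` recovers the angle exactly: `-R[2, 0]` is sin(theta) and `sy` is cos(theta). A toy check with theta = pi/6:

```python
import math
import torch

theta = math.pi / 6
# Rotation matrix for theta about the y-axis (batch of one).
R = torch.tensor([[[math.cos(theta), 0.0, math.sin(theta)],
                   [0.0, 1.0, 0.0],
                   [-math.sin(theta), 0.0, math.cos(theta)]]])

# Same formula as rot_mat_to_euler.
sy = torch.sqrt(R[:, 0, 0] ** 2 + R[:, 1, 0] ** 2)
angle = torch.atan2(-R[:, 2, 0], sy)
# angle recovers theta = pi / 6.
```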
lib/smplx/vertex_ids.py ADDED
@@ -0,0 +1,77 @@
+ # -*- coding: utf-8 -*-
+
+ # Max-Planck-Gesellschaft zur Förderung der Wissenschaften e.V. (MPG) is
+ # holder of all proprietary rights on this computer program.
+ # You can only use this computer program if you have closed
+ # a license agreement with MPG or you get the right to use the computer
+ # program from someone who is authorized to grant you that right.
+ # Any use of the computer program without a valid license is prohibited and
+ # liable to prosecution.
+ #
+ # Copyright©2019 Max-Planck-Gesellschaft zur Förderung
+ # der Wissenschaften e.V. (MPG). acting on behalf of its Max Planck Institute
+ # for Intelligent Systems. All rights reserved.
+ #
+ # Contact: [email protected]
+
+ from __future__ import print_function
+ from __future__ import absolute_import
+ from __future__ import division
+
+ # Joint name to vertex mapping. SMPL/SMPL-H/SMPL-X vertices that correspond to
+ # MSCOCO and OpenPose joints
+ vertex_ids = {
+     'smplh': {
+         'nose': 332,
+         'reye': 6260,
+         'leye': 2800,
+         'rear': 4071,
+         'lear': 583,
+         'rthumb': 6191,
+         'rindex': 5782,
+         'rmiddle': 5905,
+         'rring': 6016,
+         'rpinky': 6133,
+         'lthumb': 2746,
+         'lindex': 2319,
+         'lmiddle': 2445,
+         'lring': 2556,
+         'lpinky': 2673,
+         'LBigToe': 3216,
+         'LSmallToe': 3226,
+         'LHeel': 3387,
+         'RBigToe': 6617,
+         'RSmallToe': 6624,
+         'RHeel': 6787
+     },
+     'smplx': {
+         'nose': 9120,
+         'reye': 9929,
+         'leye': 9448,
+         'rear': 616,
+         'lear': 6,
+         'rthumb': 8079,
+         'rindex': 7669,
+         'rmiddle': 7794,
+         'rring': 7905,
+         'rpinky': 8022,
+         'lthumb': 5361,
+         'lindex': 4933,
+         'lmiddle': 5058,
+         'lring': 5169,
+         'lpinky': 5286,
+         'LBigToe': 5770,
+         'LSmallToe': 5780,
+         'LHeel': 8846,
+         'RBigToe': 8463,
+         'RSmallToe': 8474,
+         'RHeel': 8635
+     },
+     'mano': {
+         'thumb': 744,
+         'index': 320,
+         'middle': 443,
+         'ring': 554,
+         'pinky': 671,
+     }
+ }
lib/smplx/vertex_joint_selector.py ADDED
@@ -0,0 +1,77 @@
+ # -*- coding: utf-8 -*-
+
+ # Max-Planck-Gesellschaft zur Förderung der Wissenschaften e.V. (MPG) is
+ # holder of all proprietary rights on this computer program.
+ # You can only use this computer program if you have closed
+ # a license agreement with MPG or you get the right to use the computer
+ # program from someone who is authorized to grant you that right.
+ # Any use of the computer program without a valid license is prohibited and
+ # liable to prosecution.
+ #
+ # Copyright©2019 Max-Planck-Gesellschaft zur Förderung
+ # der Wissenschaften e.V. (MPG). acting on behalf of its Max Planck Institute
+ # for Intelligent Systems. All rights reserved.
+ #
+ # Contact: [email protected]
+
+ from __future__ import absolute_import
+ from __future__ import print_function
+ from __future__ import division
+
+ import numpy as np
+
+ import torch
+ import torch.nn as nn
+
+ from .utils import to_tensor
+
+
+ class VertexJointSelector(nn.Module):
+
+     def __init__(self, vertex_ids=None,
+                  use_hands=True,
+                  use_feet_keypoints=True, **kwargs):
+         super(VertexJointSelector, self).__init__()
+
+         # Start from an empty int64 index array so all concatenations
+         # keep an integer dtype.
+         extra_joints_idxs = np.array([], dtype=np.int64)
+
+         face_keyp_idxs = np.array([
+             vertex_ids['nose'],
+             vertex_ids['reye'],
+             vertex_ids['leye'],
+             vertex_ids['rear'],
+             vertex_ids['lear']], dtype=np.int64)
+
+         extra_joints_idxs = np.concatenate([extra_joints_idxs,
+                                             face_keyp_idxs])
+
+         if use_feet_keypoints:
+             feet_keyp_idxs = np.array([vertex_ids['LBigToe'],
+                                        vertex_ids['LSmallToe'],
+                                        vertex_ids['LHeel'],
+                                        vertex_ids['RBigToe'],
+                                        vertex_ids['RSmallToe'],
+                                        vertex_ids['RHeel']], dtype=np.int64)
+
+             extra_joints_idxs = np.concatenate(
+                 [extra_joints_idxs, feet_keyp_idxs])
+
+         if use_hands:
+             self.tip_names = ['thumb', 'index', 'middle', 'ring', 'pinky']
+
+             tips_idxs = []
+             for hand_id in ['l', 'r']:
+                 for tip_name in self.tip_names:
+                     tips_idxs.append(vertex_ids[hand_id + tip_name])
+
+             extra_joints_idxs = np.concatenate(
+                 [extra_joints_idxs, tips_idxs])
+
+         self.register_buffer('extra_joints_idxs',
+                              to_tensor(extra_joints_idxs, dtype=torch.long))
+
+     def forward(self, vertices, joints):
+         extra_joints = torch.index_select(vertices, 1, self.extra_joints_idxs)
+         joints = torch.cat([joints, extra_joints], dim=1)
+
+         return joints
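The forward pass simply gathers the selected mesh vertices and appends them to the regressed joints. A toy illustration of that gather-and-concat step with made-up shapes:

```python
import torch

vertices = torch.arange(12.0).view(1, 4, 3)   # B x V x 3, four vertices
joints = torch.zeros(1, 1, 3)                 # one regressed joint
extra_joints_idxs = torch.tensor([0, 2])      # pick vertices 0 and 2

# Same ops as VertexJointSelector.forward.
extra_joints = torch.index_select(vertices, 1, extra_joints_idxs)
out = torch.cat([joints, extra_joints], dim=1)
# out has shape (1, 3, 3): the joint plus the two selected vertices.
```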