A dataset of 400 real, high-resolution human scans of 200 subjects (100 male and 100 female, each in two poses), each with a high-quality texture map, together with the corresponding low-resolution meshes and automatically computed ground-truth correspondences. See the following table.
The dataset is available for use by external parties. Because of agreements signed by the volunteer models, a license agreement must be requested and signed by the recipient and by the director of the research administration office of your institution before the data can be provided. To request the data, please contact us at Shapify3D (at) uni.lu or use the following contact form.
(1) Once the license agreement is signed, we will grant access to download the data.
(2) If this data is used, in whole or in part, the following paper must be cited:
3DBodyTex: Textured 3D Body Dataset. In 2018 Sixth International Conference on 3D Vision (3DV 2018), 2018.
- BODYFITR: Robust Automatic 3D Human Body Fitting. Saint, Alexandre Fabian A.; Shabayek, Abd El Rahman; Cherenkova, Kseniya; Gusev, Gleb; Aouada, Djamila; Ottersten, Björn. In Proceedings of the 2019 IEEE International Conference on Image Processing (ICIP), September 2019.
- 3DBodyTex: Textured 3D Body Dataset. In 2018 Sixth International Conference on 3D Vision (3DV 2018), 2018.
- Towards Automatic Human Body Model Fitting to a 3D Scan. In D'Apuzzo, Nicola (Ed.), Proceedings of 3DBODY.TECH 2017 — 8th International Conference and Exhibition on 3D Body Scanning and Processing Technologies, Montreal, QC, Canada, 11-12 October 2017.
- Deformation Transfer of 3D Human Shapes and Poses on Manifolds. In IEEE International Conference on Image Processing (ICIP), Beijing, 17-20 September 2017.
This work was funded by the National Research Fund (FNR), Luxembourg, under AFR PPP reference 11806282, and by Artec Europe SARL. The authors are grateful to the volunteers who took part in the scanning, to the whole Computer Vision Lab at SnT for collecting the data, and to the contributors of the open-source libraries used in this work.