Recovery of Partial Textured Scans

The task of this challenge is to accurately reconstruct a full 3D textured mesh from a partial 3D scan. It comprises two tracks:

  • Track 1: Recovering textured human body scans from partial acquisitions. This track uses the 3DBodyTex.v2 dataset, which contains 2500 textured 3D scans. It is an extended version of the original 3DBodyTex.v1 dataset, first published at the 2018 International Conference on 3D Vision (3DV 2018).
  • Track 2: Recovering textured object scans from partial acquisitions. This track uses the 3DObjTex.v1 dataset, a subset of the ViewShape online repository of 3D scans. It contains more than 2000 generic objects with varying levels of complexity in texture and geometry.

The training data consists of pairs (X, Y) of partial and complete scans, respectively. The goal is to recover Y from X. As part of the challenge, we share routines to generate partial scans from the given complete scans Y. However, participants are free to devise their own way of generating partial data, either as an augmentation or as a complete alternative (an illustrative sketch is given after the list below).

  • Any custom procedure should be reported, with a description and implementation, as part of the deliverables.
  • A quality check is performed on the generated partial scans to guarantee a reasonable level of defects.
  • The partial scans are generated synthetically.
  • For privacy reasons, all meshes are anonymized by blurring the shape and texture of the face, similarly to the 3DBodyTex data.
  • During evaluation, the face and hands are ignored because their shape in raw scans is less reliable.
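For illustration only, the sketch below shows one simple way partial scans could be generated synthetically: cutting spherical "holes" out of a complete mesh Y to obtain a partial mesh X. This is not the official SHARP routine; it assumes the `numpy` and `trimesh` libraries, and the file names, hole count, and radius are placeholder values.

```python
# Illustrative sketch (not the official SHARP routine): cut spherical holes
# out of a complete mesh Y to produce a synthetic partial mesh X.
import numpy as np
import trimesh

def cut_holes(mesh, n_holes=5, radius=0.05, seed=0):
    """Remove every face whose centroid falls inside randomly placed spheres."""
    rng = np.random.default_rng(seed)
    centroids = mesh.triangles_center                      # (F, 3) face centroids
    keep = np.ones(len(mesh.faces), dtype=bool)
    # Place hole centres on existing face centroids so the holes lie on the surface.
    centres = centroids[rng.choice(len(centroids), size=n_holes, replace=False)]
    for c in centres:
        keep &= np.linalg.norm(centroids - c, axis=1) > radius
    # Keep only the surviving faces as a single mesh.
    return mesh.submesh([np.flatnonzero(keep)], append=True)

if __name__ == "__main__":
    # Placeholder file names; trimesh.load may return a Scene for multi-part files.
    complete = trimesh.load("complete_scan.obj")            # Y: complete scan
    partial = cut_holes(complete, n_holes=8, radius=0.04)   # X: synthetic partial scan
    partial.export("partial_scan.obj")
```

More realistic partiality (for example, keeping only the faces visible from simulated scanner viewpoints) can be implemented along the same lines; for reproducibility, the routines provided in the SHARP repository should be preferred.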

New: In addition to the partial data generation routines provided in the previous editions (SHARP 2020 and SHARP 2021), more realistic routines have been put in place for this edition. These routines and further documentation can be found in the GitLab repository of SHARP 2022. Samples of partial scans from Track 1 and Track 2 are shown below.

Track 1: samples of partial body scans

Track 2: samples of partial object scans