Motivation

The 1st SHApe Recovery from Partial textured 3D scans (SHARP) Workshop and Challenge will be held in conjunction with ECCV on the afternoon of August 23, 2020, in Glasgow, Scotland.

Online participation is now possible; physical presence in Glasgow is not required.

Capturing complete 3D scans is challenging because of multiple factors, including varying levels of detail (e.g., finer anatomical parts and clothing wrinkles in human scans), occlusion, non-rigidity, movement, and optical properties (e.g., fabric, material, hair, reflections, texture patterns).

Photogrammetric capture systems can acquire 3D scan sequences at high frame rates but miss details, suffer from occlusion because of their fixed setup, and are bulky. Hand-held scanners can adapt to the target, resolve finer details, and handle more complex geometries and occlusions, but require longer acquisition times. In practice, these trade-offs result in partial data in the form of scans with inaccuracies, uncertainties and/or missing parts.

The goal of the proposed workshop and challenges is to promote the development of methods that recover a complete 3D scan from a partial acquisition. The acquisition time of hand-held scanners could be reduced if partial information of an object could be acquired and the missing areas completed as a post-processing step.

The quality of photogrammetric scans could be enhanced by completing the missing parts and correcting the areas of lower reliability. Advances in such methods would also impact applications more generally at both the industrial and consumer levels. They would allow, for example, capturing more complex objects (e.g., complex clothing) and achieving higher-quality acquisitions on lower-end devices. Practical applications impacted by the proposed challenges include virtual reality, medical treatment, fitness monitoring and fashion. These challenges provide test beds for developing novel methods for 3D scan completion.

Sponsor

Artec 3D

Challenges

We propose two challenges. The task in both is to reconstruct a full 3D textured mesh from a partial 3D scan (a minimal input/output sketch follows the challenge descriptions below). The first challenge targets human bodies; the second targets a variety of generic objects. The challenges introduce two new datasets:

1) Challenge 1: Recovery of Human Body Scans.

This challenge focuses on textured 3D human body scans. It introduces a new dataset, 3DBodyTex.v2, an extension of the 3DBodyTex dataset. It contains 3300 textured 3D body scans in tight-fitting or loose clothing, with a large variation in poses and subjects.

2) Challenge 2: Recovery of Generic Object Scans.

This challenge focuses on textured 3D scans of generic objects. It introduces a new dataset, 3DObjectTex.v1, a subset of the ViewShape repository containing 2000 textured 3D scans of very diverse objects.
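As an illustration only, the sketch below shows what a completion method consumes and produces. It assumes the scans are distributed as textured meshes loadable with the trimesh Python library; the file names, the file format and the complete_scan placeholder are hypothetical and not specified by the challenge.

    import trimesh

    def complete_scan(partial: trimesh.Trimesh) -> trimesh.Trimesh:
        """Hypothetical placeholder for a participant's method: takes a partial
        textured scan and should return a completed, fully textured mesh."""
        # A real method would infer the missing geometry and texture here;
        # this stub simply returns its input unchanged.
        return partial

    # Hypothetical file names; the actual data format is defined by the organizers.
    partial = trimesh.load("partial_scan.obj", force="mesh")
    completed = complete_scan(partial)
    completed.export("completed_scan.obj")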

IMPORTANT

A total of €6,000 will be awarded in cash prizes to the winners.

Call for Online Participation (New)

To give more researchers the opportunity to participate in the SHARP 2020 challenges, online participation is now open. All tracks are available, with the same registration deadline. Instead of submitting an accompanying paper, participants are required to submit their code. Submitting an accompanying paper, to be included in the proceedings of ECCV, is still highly encouraged.

Call for Papers

Participants in one or more tracks of the SHARP Challenge are expected to document their results by submitting:

– long papers (max 8 pages, excluding references): for completely original work that is not previously published, accepted for publication, or under review in any peer-reviewed venue, including journals, conferences and workshops.

– short papers (max 4 pages, including references): for content shared with a paper accepted at ECCV (or any other conference, workshop or journal).

All accepted papers will be included in the ECCV 2020 conference proceedings. Papers will be peer-reviewed and must comply with the ECCV 2020 proceedings style and format.

Important Dates

 

20th Feb 2020 – Opening of registration of participants
20th Feb 2020 – Webpage online
21st Mar 2020 – Announcement of online participation
6th Apr 2020 – Deadline for registration of participants (extended)
8th Apr 2020 – Deadline to return the signed Data License Agreement
10th Apr 2020 – Release of the training dataset
3rd July 2020 – Deadline for submission of results
8th July 2020 – Deadline for submission of accompanying papers
14th July 2020 – Notification of acceptance of accompanying papers
16th July 2020 – Camera-ready submission of accepted papers

Organizers

Djamila Aouada

SnT, University of Luxembourg

djamila.aouada@uni.lu

Kseniya Cherenkova

Artec3D, SnT

kcherenkova@artec-group.com

Alexandre Saint

SnT, University of Luxembourg

alexandre.saint@uni.lu

David Fofi

University of Burgundy

david.fofi@u-bourgogne.fr

Gleb Gusev

Artec3D

gleb@artec-group.com

Bjorn Ottersten

SnT, University of Luxembourg

bjorn.ottersten@uni.lu
