MoCap-guided Data Augmentation for 3D Pose Estimation in the Wild
This paper addresses the problem of 3D human pose estimation in the wild. A significant challenge is the lack of training data, i.e., 2D images of humans annotated with 3D poses. Such data is necessary to train state-of-the-art CNN architectures. Here, we propose a solution to generate a large set of photorealistic synthetic images of humans with 3D pose annotations. We introduce an image-based synthesis engine that artificially augments a dataset of real images with 2D human pose annotations using 3D Motion Capture (MoCap) data.
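At a high level, the synthesis engine described above needs at least two ingredients: a way to project a MoCap 3D pose into the image plane, and a way to retrieve real images whose 2D pose annotations match that projection. The following is a minimal, hypothetical sketch of those two steps only; the orthographic projection and the mean per-joint distance used here are simplifying assumptions for illustration, not the paper's actual pipeline.

```python
# Hedged sketch (not the authors' code): project a 3D MoCap pose to 2D and
# retrieve the closest-matching 2D-annotated real image.

def project_orthographic(joints3d):
    """Orthographic projection: drop depth, [(x, y, z), ...] -> [(x, y), ...]."""
    return [(x, y) for x, y, _ in joints3d]

def pose_distance(p, q):
    """Mean per-joint Euclidean distance between two 2D poses."""
    dists = [((px - qx) ** 2 + (py - qy) ** 2) ** 0.5
             for (px, py), (qx, qy) in zip(p, q)]
    return sum(dists) / len(dists)

def best_match(projected, annotated_poses):
    """Index of the annotated 2D pose closest to the projected MoCap pose."""
    return min(range(len(annotated_poses)),
               key=lambda i: pose_distance(projected, annotated_poses[i]))

# Toy example: two "real images" annotated with 2-joint 2D poses.
annotated = [[(0.0, 0.0), (1.0, 1.0)],
             [(5.0, 5.0), (6.0, 6.0)]]
proj = project_orthographic([(0.1, 0.1, 2.0), (1.0, 1.1, 2.0)])
print(best_match(proj, annotated))  # -> 0 (first annotation is closest)
```

In the paper this retrieval step would operate per body part over a large 2D-annotated dataset; the sketch collapses it to whole-pose nearest-neighbour matching to keep the idea visible.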
Reviews: MoCap-guided Data Augmentation for 3D Pose Estimation in the Wild
The authors compare their method to a very recent state-of-the-art method using the same evaluation protocol. They present a detailed evaluation of how synthetically generated data fares against real data, and of how combining the two improves performance. Additionally, the authors demonstrate the advantage of formulating 3D pose estimation as a classification problem over direct regression to joint coordinates. This is facilitated by a large number of training samples, which they show is crucial for the classification CNN: in the small-data regime, full-body regression actually outperforms 3D pose classification. However, the authors only evaluate their method using the evaluation protocol described in [17, 40].
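The classification formulation contrasted above amounts to quantizing the continuous pose space into a fixed set of anchor poses (e.g. obtained by clustering MoCap data) and predicting an anchor index, rather than regressing joint coordinates directly. A minimal, hypothetical sketch of that label-construction step, with made-up anchors purely for illustration:

```python
# Hedged sketch (not the authors' code): converting a continuous 3D pose into
# a class label by nearest-anchor assignment. In practice the anchors would be
# cluster centres learned from MoCap data; here they are arbitrary toy values.

def pose_to_class(pose, anchors):
    """Return the index of the anchor pose closest to `pose` (L2 distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(range(len(anchors)), key=lambda k: dist(pose, anchors[k]))

# Toy anchors in a flattened joint-coordinate space.
anchors = [
    [0.0, 0.0, 0.0],
    [1.0, 1.0, 1.0],
    [2.0, 0.0, 2.0],
]
print(pose_to_class([0.9, 1.1, 1.0], anchors))  # -> 1 (closest to the second anchor)
```

With labels built this way, a CNN can be trained with a standard softmax/cross-entropy objective over the K anchors, which is where the large synthetic training set pays off: finer pose quantization needs many examples per class.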
Gregory Rogez, Cordelia Schmid