Publication

Real-time 360 Body Scanning System for Virtual Reality Research Applications

Abstract

We present a low-cost solution for the online, realistic representation of users using an array of depth cameras. The system is composed of a cluster of 10 Microsoft Kinect 2 cameras, each associated with a compact NUC PC that streams live depth and color images to a master PC, which reconstructs the point cloud of the scene in real time and can, in particular, display the body of users standing in the capture area. A custom geometric calibration procedure allows accurate reconstruction from the different 3D data streams. Despite the inherent limitations of depth cameras, in particular sensor noise, the system provides a convincing representation of the user's body, is not limited by changes in clothing (even during immersion), and can capture complex poses and even interactions between two persons or with physical objects. The advantage of using depth cameras over conventional cameras is that little processing is required for the dynamic reconstruction of unknown shapes, thus enabling truly interactive applications. The resulting live 3D model can be inserted into any virtual environment (e.g., via a Unity 3D software integration plugin) and can be subjected to all the usual 3D manipulations and transformations.
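To illustrate how the per-camera streams might be fused on the master PC, the sketch below back-projects each depth image using its camera intrinsics and maps the resulting points into a common world frame using the extrinsic transforms produced by a geometric calibration step. This is a minimal Python/NumPy sketch under assumed conventions (metric depth images, 4x4 extrinsic matrices); the function names and parameters are hypothetical and are not taken from the authors' implementation.

    # Hypothetical sketch: fusing depth frames from several calibrated cameras
    # into one point cloud. Intrinsics (fx, fy, cx, cy) and 4x4 extrinsics are
    # assumed to come from a calibration step; values are placeholders.
    import numpy as np

    def depth_to_points(depth_m, fx, fy, cx, cy):
        # Back-project a depth image (in metres) into camera-space 3D points.
        h, w = depth_m.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth_m
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
        return pts[pts[:, 2] > 0]          # drop invalid (zero-depth) pixels

    def merge_cameras(depth_frames, intrinsics, extrinsics):
        # Transform each camera's points into the shared world frame and merge.
        clouds = []
        for depth, K, T in zip(depth_frames, intrinsics, extrinsics):
            pts = depth_to_points(depth, *K)            # camera space
            pts_h = np.c_[pts, np.ones(len(pts))]       # homogeneous coords
            clouds.append((pts_h @ T.T)[:, :3])         # world space
        return np.vstack(clouds)

In a live system this fusion would presumably run per frame (and likely on the GPU) to keep interactive rates; the sketch only shows the geometry involved in merging calibrated streams.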
