Article

AppFusion: Interactive Appearance Acquisition Using a Kinect Sensor

Journal

COMPUTER GRAPHICS FORUM
Volume 34, Issue 6, Pages 289-298

Publisher

WILEY
DOI: 10.1111/cgf.12600

Keywords

appearance modelling

Funding

  1. NSF China [61272305, 61303135]
  2. National Program for Special Support of Eminent Professionals of China
  3. Fundamental Research Funds for the Central Universities [2013QNA5011]


We present an interactive material acquisition system for average users to capture the spatially varying appearance of daily objects. While an object is being scanned, our system estimates its appearance on the fly and provides quick visual feedback. We build the system entirely on low-end, off-the-shelf components: a Kinect sensor, a mirror ball and printed markers. We exploit the Kinect infra-red emitter/receiver, originally designed for depth computation, as an active hand-held reflectometer to segment the object into clusters of similar specular materials and simultaneously estimate the roughness parameters of their BRDFs. Next, the diffuse albedo and specular intensity of the spatially varying materials are rapidly computed in an inverse rendering framework, using data from the Kinect RGB camera. We demonstrate captured results for a range of materials and physically validate our system.
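A minimal sketch of the inverse-rendering step described above (not the authors' exact formulation): assuming a simplified Blinn-Phong-style reflectance model and a single known point light, fixing the roughness recovered from the infra-red pass makes each RGB observation linear in the unknown diffuse albedo k_d and specular intensity k_s, so both can be estimated per material cluster by linear least squares. The function name, model, and parameterisation below are illustrative assumptions.

# Sketch only: with the cluster roughness alpha fixed by the IR step,
# a Blinn-Phong-style model makes each observation linear in (k_d, k_s),
# so both are recovered per colour channel by linear least squares.
import numpy as np

def fit_cluster_brdf(intensities, normals, light_dirs, view_dirs, alpha):
    # intensities : (M, 3) observed RGB values for M pixels of one cluster
    # normals     : (M, 3) unit surface normals (e.g. from the Kinect depth map)
    # light_dirs  : (M, 3) unit directions towards the light
    # view_dirs   : (M, 3) unit directions towards the camera
    # alpha       : scalar roughness, here used as a Blinn-Phong exponent
    n_dot_l = np.clip(np.einsum('ij,ij->i', normals, light_dirs), 0.0, None)
    half = light_dirs + view_dirs
    half /= np.linalg.norm(half, axis=1, keepdims=True)
    n_dot_h = np.clip(np.einsum('ij,ij->i', normals, half), 0.0, None)

    # Each row models I = k_d * (n.l) + k_s * (n.h)^alpha * (n.l)
    A = np.stack([n_dot_l, (n_dot_h ** alpha) * n_dot_l], axis=1)   # (M, 2)
    coeffs, *_ = np.linalg.lstsq(A, intensities, rcond=None)        # (2, 3)
    k_d, k_s = coeffs
    return k_d, k_s

In practice the observations would come from the Kinect RGB frames, with the light and camera poses obtained from the mirror ball and printed markers; the solve runs independently for each material cluster found in the IR segmentation step.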

