Article

Unstructured Light Fields

Journal

COMPUTER GRAPHICS FORUM
Volume 31, Issue 2, Pages 305-314

Publisher

WILEY
DOI: 10.1111/j.1467-8659.2012.03009.x

Funding

  1. NSF [0964004]
  2. Foxconn
  3. MathWorks
  4. Division of Information & Intelligent Systems
  5. Directorate for Computer & Information Science & Engineering [964004] (Funding Source: National Science Foundation)

Abstract

We present a system for interactively acquiring and rendering light fields using a hand-held commodity camera. The main challenge we address is assisting a user in achieving good coverage of the 4D domain despite the difficulties of hand-held acquisition. We define coverage by bounding reprojection error between viewpoints, which accounts for all four dimensions of the light field. We use this criterion together with a recent Simultaneous Localization and Mapping technique to compute a coverage map on the space of viewpoints. We provide users with real-time feedback and direct them toward under-sampled parts of the light field. Our system is lightweight and has allowed us to capture hundreds of light fields. We further present a new rendering algorithm that is tailored to the unstructured yet dense data we capture. Our method can achieve piecewise-bicubic reconstruction using a triangulation of the captured viewpoints and subdivision rules applied to reconstruction weights.
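
To make the coverage criterion concrete, the following minimal Python sketch bounds the reprojection error between two nearby viewpoints under an assumed scene depth range and a single focal plane, and flags a viewpoint pair as covered when that bound stays below a pixel tolerance. The function names, the one-pixel tolerance, and the specific bound (the standard light-field sampling argument based on depth uncertainty) are illustrative assumptions, not the authors' implementation.

def max_reprojection_error(baseline, focal_px, z_focal, z_min, z_max):
    """Worst-case reprojection error (in pixels) when a neighbouring viewpoint
    at distance `baseline` (metres) is used to reconstruct a ray, assuming the
    true scene depth lies in [z_min, z_max] but rays are transferred via a
    focal plane at depth z_focal. Illustrative bound only; the paper's exact
    criterion may differ."""
    disparity_error = max(abs(1.0 / z_min - 1.0 / z_focal),
                          abs(1.0 / z_max - 1.0 / z_focal))
    return focal_px * baseline * disparity_error

def is_covered(baseline, focal_px, z_focal, z_min, z_max, tol_px=1.0):
    """Treat a pair of viewpoints as mutually covering if the worst-case
    reprojection error stays below tol_px pixels (threshold chosen for
    illustration)."""
    return max_reprojection_error(baseline, focal_px, z_focal, z_min, z_max) <= tol_px

# Example: 5 cm baseline, 1000 px focal length, focal plane at 2 m,
# scene depth between 1.5 m and 3 m.
err = max_reprojection_error(0.05, 1000.0, 2.0, 1.5, 3.0)
print(f"worst-case error: {err:.2f} px, "
      f"covered: {is_covered(0.05, 1000.0, 2.0, 1.5, 3.0)}")

In a capture loop, a test of this kind could be evaluated against the viewpoints already recorded to colour a coverage map over viewpoint space and steer the user toward under-sampled regions, which is the role the real-time feedback described in the abstract plays.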
