“Seeing the World in a Bag of Chips”, 2020-01-14:
We address the dual problems of novel view synthesis and environment reconstruction from hand-held RGBD sensors.
Our contributions include (1) modeling highly specular objects, (2) modeling inter-reflections and Fresnel effects, and (3) enabling surface light field reconstruction with the same input needed to reconstruct shape alone.
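The Fresnel effects mentioned above describe how a surface's reflectivity rises toward grazing viewing angles. As an illustrative sketch only (not the paper's implementation), the standard Schlick approximation captures this behavior in closed form:

```python
def schlick_fresnel(cos_theta, f0):
    """Schlick's approximation to Fresnel reflectance.

    cos_theta: cosine of the angle between the view direction and the
               surface normal (1.0 = head-on, 0.0 = grazing).
    f0:        reflectance at normal incidence (e.g. ~0.04 for dielectrics
               like a glossy chip bag's plastic film).
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# Head-on, the surface reflects only f0 of the light;
# at grazing angles reflectance approaches 1, so even weakly
# specular surfaces act nearly mirror-like edge-on.
head_on = schlick_fresnel(1.0, 0.04)   # = 0.04
grazing = schlick_fresnel(0.0, 0.04)   # = 1.0
```

This angle-dependent boost in specular strength is one reason mirror-like reflections off everyday objects carry recoverable information about the surrounding environment.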
In cases where the scene surface has a strong mirror-like material component, we generate highly detailed environment images, revealing room composition, objects, people, buildings, and trees visible through windows.
Our approach achieves state-of-the-art view synthesis, operates on low dynamic range imagery, and is robust to geometric and calibration errors.