The ability to produce dynamic Depth of Field effects in live video streams was until recently a quality unique to movie cameras. In this paper, we present a computational camera solution coupled with real-time GPU processing to produce runtime dynamic Depth of Field effects. We first construct a hybrid-resolution stereo camera with a high-res/low-res camera pair. We recover a low-res disparity map of the scene using GPU-based Belief Propagation and subsequently upsample it via fast Cross/Joint Bilateral Upsampling. With the recovered high-resolution disparity map, we warp the high-resolution video stream to nearby viewpoints to synthesize a light field toward the scene. We exploit parallel processing and atomic operations on the GPU to resolve visibility when multiple pixels warp to the same image location.
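The Cross/Joint Bilateral Upsampling step can be sketched as follows. This is a minimal, brute-force CPU illustration in Python/NumPy, not the paper's GPU implementation; the function name, parameter defaults, and loop structure are assumptions for illustration. Each high-res pixel averages nearby low-res disparity samples, weighted by spatial distance in the low-res grid and by color similarity in the high-res guide image:

```python
import numpy as np

def joint_bilateral_upsample(low_disp, high_img, sigma_s=1.0, sigma_r=0.1, radius=2):
    """Upsample a low-res disparity map guided by a high-res image.

    low_disp : (Hl, Wl) low-resolution disparity map
    high_img : (Hh, Wh, C) high-resolution guide image
    """
    Hh, Wh = high_img.shape[:2]
    Hl, Wl = low_disp.shape
    scale = Hh / Hl  # assume uniform upsampling factor
    out = np.zeros((Hh, Wh))
    for y in range(Hh):
        for x in range(Wh):
            yl, xl = y / scale, x / scale  # position in low-res coordinates
            num = den = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    py, px = int(yl) + dy, int(xl) + dx
                    if 0 <= py < Hl and 0 <= px < Wl:
                        # spatial weight, measured on the low-res grid
                        ws = np.exp(-((py - yl) ** 2 + (px - xl) ** 2) / (2 * sigma_s ** 2))
                        # range weight: color similarity in the high-res guide
                        qy = min(int(py * scale), Hh - 1)
                        qx = min(int(px * scale), Wh - 1)
                        diff = high_img[y, x] - high_img[qy, qx]
                        wr = np.exp(-np.sum(diff ** 2) / (2 * sigma_r ** 2))
                        w = ws * wr
                        num += w * low_disp[py, px]
                        den += w
            out[y, x] = num / max(den, 1e-8)
    return out
```

Because the range weight comes from the high-res guide image, disparity edges in the upsampled map snap to color edges rather than being blurred, which is the property the pipeline relies on before warping.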
Related Papers
An Analysis of Color Demosaicing in Plenoptic Cameras. Zhan Yu, Jingyi Yu, Andrew Lumsdaine, and Todor Georgiev. To appear in Proceedings of IEEE CVPR 2012. (video)
Dynamic Depth of Field on Live Video Streams: A Stereo Solution. Zhan Yu, Christopher Thorpe, Xuan Yu, Scott Grauer-Gray, Feng Li, and Jingyi Yu. In Proceedings of Computer Graphics International 2011. (video)
Dual Focus Stereo Imaging. Feng Li, Jian Sun, Jue Wang, and Jingyi Yu. SPIE Electronic Imaging 2010. (video)