FrameBreak: Dramatic Image Extrapolation by Guided Shift-Maps


Image Extrapolation. To extrapolate an input image, we first find a guide image with a larger field of view and a similar scene structure. We then learn the self-similarity within the guide image. This prior scene-structure information is transferred to the input image to guide the synthesis.


Abstract

We significantly extrapolate the field of view of a photograph by learning from a roughly aligned, wide-angle guide image of the same scene category. Our method can extrapolate typical photos into complete panoramas. The extrapolation problem is formulated in the shift-map image synthesis framework. We analyze the self-similarity of the guide image to generate a set of allowable local transformations and apply them to the input image. Our guided shift-map method preserves the scene layout of the guide image when extrapolating a photograph. Conventional shift-map methods support only translations, which are not expressive enough to characterize the self-similarity of complex scenes. We therefore additionally allow image transformations of rotation, scaling, and reflection. To handle this increase in complexity, we introduce a hierarchical graph optimization method to choose the optimal transformation at each output pixel. We demonstrate our approach on a variety of indoor, outdoor, natural, and man-made scenes.
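For intuition, the shift-map formulation can be viewed as a labeling problem over the output pixels, where each pixel is assigned one of the allowable transformations. A minimal sketch of such an energy, with generic data and smoothness terms (the specific terms and the weight lambda here are illustrative, not the paper's exact definitions):

    E(M) = \sum_{p \in P} E_d\big(M(p)\big) + \lambda \sum_{(p,q) \in N} E_s\big(M(p), M(q)\big)

Here M(p) denotes the transformation (a translation, or additionally a rotation, scaling, or reflection) assigned to output pixel p, E_d measures how well the transformed input explains p, E_s penalizes visible seams between neighboring pixels that take different transformations, and \lambda balances the two terms. Minimizing E over all labelings M is the optimization that the hierarchical graph method addresses.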

Paper

Source Code

Files