Skies are common backgrounds in photos, but they are often uninteresting owing to the time or conditions at which the photo was taken. Professional photographers correct this using sophisticated tools and painstaking effort that are beyond the command of ordinary users. In this work, we propose an automatic background replacement algorithm that generates realistic, artifact-free images with a diverse range of sky styles. The key idea of our algorithm is to utilize visual semantics to guide the entire process, including sky segmentation, search, and replacement. First, we train a deep convolutional neural network for semantic scene parsing, which serves as a visual prior to segment sky regions in a coarse-to-fine manner. Second, to find suitable skies for replacement, we propose a data-driven sky search scheme based on the semantic layout of the input image. Finally, to re-compose the stylized sky with the original foreground naturally, we develop an appearance transfer method that matches statistics locally and semantically. We show that the proposed algorithm automatically generates a set of visually pleasing results, and we demonstrate its effectiveness with extensive user studies.
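The final stage, matching appearance statistics between the replaced sky and the original foreground, can be illustrated with a simplified sketch. The paper's method operates locally and semantically; the snippet below is only a global, per-channel mean/std matching (in the spirit of classic color transfer), with the function name and NumPy-based interface chosen here for illustration rather than taken from the paper.

```python
import numpy as np

def transfer_statistics(source, reference):
    """Shift and scale each channel of `source` so its mean and standard
    deviation match those of `reference`. This is a global simplification
    of the paper's local, semantic-aware appearance transfer."""
    src = source.astype(np.float64)
    ref = reference.astype(np.float64)
    out = np.empty_like(src)
    for c in range(src.shape[-1]):
        s_mu, s_sigma = src[..., c].mean(), src[..., c].std()
        r_mu, r_sigma = ref[..., c].mean(), ref[..., c].std()
        # Guard against a flat (zero-variance) source channel.
        scale = r_sigma / s_sigma if s_sigma > 0 else 1.0
        out[..., c] = (src[..., c] - s_mu) * scale + r_mu
    # Keep the result in a valid 8-bit intensity range.
    return np.clip(out, 0.0, 255.0)
```

In practice such matching is typically done per semantic region (e.g., sky vs. foreground) and often in a perceptual color space rather than RGB, which is closer to the local, semantics-guided transfer the abstract describes.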
|Journal||ACM Transactions on Graphics|
|Publication status||Published - 2016 Jul 11|
|Event||ACM SIGGRAPH 2016 - Anaheim, United States|
Duration: 2016 Jul 24 → 2016 Jul 28
Bibliographical note: Publisher Copyright © 2016 ACM.
All Science Journal Classification (ASJC) codes
- Computer Graphics and Computer-Aided Design