Universal style transfer aims to transfer arbitrary visual styles to content images. Existing feed-forward methods, while enjoying inference efficiency, are mainly limited by an inability to generalize to unseen styles or by compromised visual quality. In this paper, we present a simple yet effective method that tackles these limitations without training on any pre-defined styles. The key ingredient of our method is a pair of feature transforms, whitening and coloring, embedded in an image reconstruction network. The whitening and coloring transforms directly match the feature covariance of the content image to that of a given style image, which shares a similar spirit with the Gram-matrix-based cost optimized in neural style transfer. We demonstrate the effectiveness of our algorithm by generating high-quality stylized images, with comparisons to a number of recent methods. We also analyze our method by visualizing the whitened features and synthesizing textures via simple feature coloring.
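To make the covariance-matching idea concrete, here is a minimal NumPy sketch of a whitening-and-coloring transform of the kind the abstract describes. The function name `whiten_and_color`, the `eps` regularizer, and the flattened C × HW feature layout are assumptions for illustration only; the paper embeds such transforms in a VGG-based encoder-decoder at multiple layers rather than applying them as a standalone function.

```python
import numpy as np

def whiten_and_color(fc, fs, eps=1e-5):
    """Sketch of a whitening-coloring transform (WCT).

    fc: content features, shape (C, Hc*Wc)
    fs: style features, shape (C, Hs*Ws)
    Returns content features whose covariance matches the style's.
    """
    # Center the content features.
    fc = fc - fc.mean(axis=1, keepdims=True)

    # Whitening: eigendecompose the content covariance and remove
    # its correlations (eps stabilizes near-zero eigenvalues).
    cov_c = fc @ fc.T / (fc.shape[1] - 1) + eps * np.eye(fc.shape[0])
    Ec, wc, _ = np.linalg.svd(cov_c)
    whitened = Ec @ np.diag(wc ** -0.5) @ Ec.T @ fc

    # Coloring: impose the style covariance on the whitened features,
    # then re-center with the style mean.
    ms = fs.mean(axis=1, keepdims=True)
    fs = fs - ms
    cov_s = fs @ fs.T / (fs.shape[1] - 1) + eps * np.eye(fs.shape[0])
    Es, ws, _ = np.linalg.svd(cov_s)
    return Es @ np.diag(ws ** 0.5) @ Es.T @ whitened + ms
```

In the paper's full pipeline, the transformed features are additionally blended with the original content features via a user-controlled weight to trade off stylization strength against content preservation.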
| Field | Value |
| --- | --- |
| Number of pages | 11 |
| Journal | Advances in Neural Information Processing Systems |
| Publication status | Published - 2017 |
| Event | 31st Annual Conference on Neural Information Processing Systems, NIPS 2017, Long Beach, United States, Dec 4 - Dec 9, 2017 |
Bibliographical note (Funding Information):
This work is supported in part by the NSF CAREER Grant #1149783, gifts from Adobe and NVIDIA.
All Science Journal Classification (ASJC) codes
- Computer Networks and Communications
- Information Systems
- Signal Processing