Universal style transfer via feature transforms

Yijun Li, Chen Fang, Jimei Yang, Zhaowen Wang, Xin Lu, Ming Hsuan Yang

Research output: Contribution to journal › Conference article

60 Citations (Scopus)

Abstract

Universal style transfer aims to transfer arbitrary visual styles to content images. Existing feed-forward based methods, while enjoying inference efficiency, are mainly limited by an inability to generalize to unseen styles or by compromised visual quality. In this paper, we present a simple yet effective method that tackles these limitations without training on any pre-defined styles. The key ingredient of our method is a pair of feature transforms, whitening and coloring, that are embedded in an image reconstruction network. The whitening and coloring transforms directly match the feature covariance of the content image to that of a given style image, which shares a similar spirit with the optimization of the Gram-matrix-based cost in neural style transfer. We demonstrate the effectiveness of our algorithm by generating high-quality stylized images and comparing them with those of a number of recent methods. We also analyze our method by visualizing the whitened features and synthesizing textures via simple feature coloring.
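The covariance matching described above can be sketched in a few lines of NumPy. This is a minimal, illustrative implementation of a whitening-coloring transform on feature maps, not the authors' released code: the function name `wct` and the small `eps` regularizer (added for numerical stability of the eigendecomposition) are assumptions; in the paper these transforms operate on VGG encoder features inside a trained reconstruction network.

```python
import numpy as np

def wct(content_feat, style_feat, eps=1e-5):
    """Whitening and coloring transform (illustrative sketch).

    content_feat, style_feat: float arrays of shape (C, H, W),
    e.g. channel-first convolutional feature maps. Channel counts
    must match; spatial sizes may differ.
    """
    C, H, W = content_feat.shape
    fc = content_feat.reshape(C, -1)
    fc = fc - fc.mean(axis=1, keepdims=True)
    # Whitening: remove the content features' own channel correlations.
    cov_c = fc @ fc.T / (fc.shape[1] - 1) + eps * np.eye(C)
    wc, vc = np.linalg.eigh(cov_c)
    whitened = vc @ np.diag(wc ** -0.5) @ vc.T @ fc

    fs = style_feat.reshape(style_feat.shape[0], -1)
    fs_mean = fs.mean(axis=1, keepdims=True)
    fs = fs - fs_mean
    # Coloring: impose the style features' covariance on the
    # whitened content features, then re-center on the style mean.
    cov_s = fs @ fs.T / (fs.shape[1] - 1) + eps * np.eye(C)
    ws, vs = np.linalg.eigh(cov_s)
    colored = vs @ np.diag(ws ** 0.5) @ vs.T @ whitened
    return (colored + fs_mean).reshape(C, H, W)
```

After the transform, the (regularized) covariance of the output features matches that of the style features while the spatial arrangement still comes from the content image, which is exactly the "direct matching of feature covariance" the abstract refers to.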

Original language: English
Pages (from-to): 386-396
Number of pages: 11
Journal: Advances in Neural Information Processing Systems
Volume: 2017-December
Publication status: Published - 2017 Jan 1
Event: 31st Annual Conference on Neural Information Processing Systems, NIPS 2017 - Long Beach, United States
Duration: 2017 Dec 4 - 2017 Dec 9

Fingerprint

Coloring
Image reconstruction
Image quality
Textures
Costs

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Li, Y., Fang, C., Yang, J., Wang, Z., Lu, X., & Yang, M. H. (2017). Universal style transfer via feature transforms. Advances in Neural Information Processing Systems, 2017-December, 386-396.
Li, Yijun ; Fang, Chen ; Yang, Jimei ; Wang, Zhaowen ; Lu, Xin ; Yang, Ming Hsuan. / Universal style transfer via feature transforms. In: Advances in Neural Information Processing Systems. 2017 ; Vol. 2017-December. pp. 386-396.
@article{750881047b444e26af5779c5be88ec93,
title = "Universal style transfer via feature transforms",
abstract = "Universal style transfer aims to transfer arbitrary visual styles to content images. Existing feed-forward based methods, while enjoying the inference efficiency, are mainly limited by inability of generalizing to unseen styles or compromised visual quality. In this paper, we present a simple yet effective method that tackles these limitations without training on any pre-defined styles. The key ingredient of our method is a pair of feature transforms, whitening and coloring, that are embedded to an image reconstruction network. The whitening and coloring transforms reflect a direct matching of feature covariance of the content image to a given style image, which shares similar spirits with the optimization of Gram matrix based cost in neural style transfer. We demonstrate the effectiveness of our algorithm by generating high-quality stylized images with comparisons to a number of recent methods. We also analyze our method by visualizing the whitened features and synthesizing textures via simple feature coloring.",
author = "Li, Yijun and Fang, Chen and Yang, Jimei and Wang, Zhaowen and Lu, Xin and Yang, {Ming Hsuan}",
year = "2017",
month = "1",
day = "1",
language = "English",
volume = "2017-December",
pages = "386--396",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",
}

Li, Y, Fang, C, Yang, J, Wang, Z, Lu, X & Yang, MH 2017, 'Universal style transfer via feature transforms', Advances in Neural Information Processing Systems, vol. 2017-December, pp. 386-396.

Universal style transfer via feature transforms. / Li, Yijun; Fang, Chen; Yang, Jimei; Wang, Zhaowen; Lu, Xin; Yang, Ming Hsuan.

In: Advances in Neural Information Processing Systems, Vol. 2017-December, 01.01.2017, p. 386-396.

Research output: Contribution to journal › Conference article

TY - JOUR

T1 - Universal style transfer via feature transforms

AU - Li, Yijun

AU - Fang, Chen

AU - Yang, Jimei

AU - Wang, Zhaowen

AU - Lu, Xin

AU - Yang, Ming Hsuan

PY - 2017/1/1

Y1 - 2017/1/1

N2 - Universal style transfer aims to transfer arbitrary visual styles to content images. Existing feed-forward based methods, while enjoying the inference efficiency, are mainly limited by inability of generalizing to unseen styles or compromised visual quality. In this paper, we present a simple yet effective method that tackles these limitations without training on any pre-defined styles. The key ingredient of our method is a pair of feature transforms, whitening and coloring, that are embedded to an image reconstruction network. The whitening and coloring transforms reflect a direct matching of feature covariance of the content image to a given style image, which shares similar spirits with the optimization of Gram matrix based cost in neural style transfer. We demonstrate the effectiveness of our algorithm by generating high-quality stylized images with comparisons to a number of recent methods. We also analyze our method by visualizing the whitened features and synthesizing textures via simple feature coloring.

AB - Universal style transfer aims to transfer arbitrary visual styles to content images. Existing feed-forward based methods, while enjoying the inference efficiency, are mainly limited by inability of generalizing to unseen styles or compromised visual quality. In this paper, we present a simple yet effective method that tackles these limitations without training on any pre-defined styles. The key ingredient of our method is a pair of feature transforms, whitening and coloring, that are embedded to an image reconstruction network. The whitening and coloring transforms reflect a direct matching of feature covariance of the content image to a given style image, which shares similar spirits with the optimization of Gram matrix based cost in neural style transfer. We demonstrate the effectiveness of our algorithm by generating high-quality stylized images with comparisons to a number of recent methods. We also analyze our method by visualizing the whitened features and synthesizing textures via simple feature coloring.

UR - http://www.scopus.com/inward/record.url?scp=85046993249&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85046993249&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:85046993249

VL - 2017-December

SP - 386

EP - 396

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

ER -

Li Y, Fang C, Yang J, Wang Z, Lu X, Yang MH. Universal style transfer via feature transforms. Advances in Neural Information Processing Systems. 2017 Jan 1;2017-December:386-396.