AML Project: Neural Style Transfer

Term: Fall 2018

  • Project title: Neural Style Transfer Implementation and Applications

  • Project demo: see our live demo of style transfer at https://style-transfer-257412.appspot.com

  • Team members

  • Project summary: This project consists of three parts. First, we implemented two style transfer algorithms: the original one-to-one neural style transfer algorithm (a fixed style for a single image) proposed by Gatys et al. [1], and the fast neural style transfer algorithm (a fixed style for arbitrary images) proposed by Johnson et al. [2]. Second, we applied the fast algorithm to a live webcam feed in real time. Lastly, we built an app demo so that users can upload their own images and design their own styled photos.

    (Demo images: starry_butler, webcam capture, app demo)

  • Project report: report.pdf

  • Project environment: We implemented our code using TensorFlow eager execution, an imperative programming environment that evaluates operations immediately, without building graphs. Eager execution is the default in TensorFlow 2.0, so we used this mode to implement our code.

    • Implementations: We trained the one-to-one neural style transfer model on Colab. The fast neural style transfer model was trained on GCP instead, due to its higher computational cost.
    • App demo: We created this demo using Dash, a Python framework for building analytical web applications.
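Both algorithms share the same style representation: the Gram matrix of a convolutional feature map, compared between the style image and the generated image. The sketch below is not the notebooks' exact code — the function names and the mean-squared reduction are our own choices — but it shows the core computation in TensorFlow eager mode, where every operation evaluates immediately without building a graph:

```python
import tensorflow as tf

def gram_matrix(features):
    """Gram matrix of a feature map of shape (batch, H, W, C).

    G[c, d] averages the product of channel activations over all spatial
    positions, capturing which channels co-activate (the style statistic
    used by Gatys et al.).
    """
    result = tf.linalg.einsum("bijc,bijd->bcd", features, features)
    shape = tf.shape(features)
    num_locations = tf.cast(shape[1] * shape[2], tf.float32)
    return result / num_locations

def style_loss(style_features, generated_features):
    """Mean squared difference between the two Gram matrices."""
    return tf.reduce_mean(
        tf.square(gram_matrix(style_features) - gram_matrix(generated_features))
    )

# Eager execution: the result is available immediately, no session needed.
f = tf.constant([[[[1.0, 0.0], [0.0, 1.0]]]])  # shape (1, 1, 2, 2)
print(gram_matrix(f).numpy())
```

In the full algorithm this loss is summed over several VGG layers and combined with a content loss; the feature maps here would come from a pretrained VGG network [8].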

How to reproduce our results:

  • Implementation of one-to-one neural style transfer: open lib/style_transfer_alpha_final.ipynb and run it. The notebook is end-to-end, so nothing needs to be revised to reproduce our results. (Note: it runs without a GPU.)
  • Implementation of fast neural style transfer: open lib/style_transfer_beta_final.ipynb and run it. The notebook is end-to-end, so nothing needs to be revised. (Note: a GPU is required, and training takes an hour or longer.)
  • Webcam application: open lib/webcam_final.ipynb and run it. The notebook is end-to-end, so nothing needs to be revised. (Note: run it locally, since a camera device is required.)
  • To reproduce the result of Starry Night, no changes to the notebook are needed.
  • To reproduce the result of Victoire, replace beta_model_style_1.h5 with beta_model_style_2.h5 everywhere in step 5.
  • To reproduce the result of Women at Their Toilette, replace beta_model_style_1.h5 with beta_model_style_3.h5 everywhere in step 5.
  • To reproduce the result of Google Map, replace beta_model_style_1.h5 with beta_model_style_4.h5 everywhere in step 5.
  • Dash app demo: see the dashapp folder for details on how to start and deploy our Dash app demo.
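The webcam application boils down to a capture-stylize-display loop around one of the trained models. The outline below is a hypothetical sketch, not the notebook's exact code: the helper names, the 256×256 input size, and the assumption that the model maps RGB images scaled to [0, 1] to outputs in the same range are our own; check lib/webcam_final.ipynb for the real preprocessing.

```python
import numpy as np

def preprocess(frame_bgr):
    """OpenCV BGR uint8 frame -> RGB float32 batch scaled to [0, 1]."""
    rgb = frame_bgr[..., ::-1].astype("float32") / 255.0
    return rgb[np.newaxis, ...]

def deprocess(batch):
    """Model output batch -> displayable BGR uint8 frame."""
    rgb = np.clip(batch[0], 0.0, 1.0)
    return (rgb[..., ::-1] * 255.0).astype("uint8")

def run_webcam(model_path="beta_model_style_1.h5", size=256):
    # Heavy imports kept local so the helpers above stay dependency-free.
    import cv2
    import tensorflow as tf

    model = tf.keras.models.load_model(model_path, compile=False)
    cap = cv2.VideoCapture(0)  # default camera device
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frame = cv2.resize(frame, (size, size))
            styled = np.asarray(model(preprocess(frame)))
            cv2.imshow("styled", deprocess(styled))
            if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    run_webcam()
```

Swapping in beta_model_style_2.h5 through beta_model_style_4.h5 via the model_path argument would correspond to the style substitutions described in the list above.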

References:

[1] Leon A. Gatys, Alexander S. Ecker, Matthias Bethge. A neural algorithm of artistic style. arXiv preprint arXiv:1508.06576, 2015.

[2] Justin Johnson, Alexandre Alahi, Li Fei-Fei. Perceptual losses for real-time style transfer and super-resolution. Springer, 2016: 694-711.

[3] Falong Shen, Shuicheng Yan, Gang Zeng. Meta networks for neural style transfer. arXiv preprint arXiv:1709.04111, 2017.

[4] Francois Chollet. Deep Learning with Python. Manning Publications Co., 2017.

[5] Waseem Rawat, Zenghui Wang. Deep convolutional neural networks for image classification: a comprehensive review. Neural Computation, 2017, 29(9): 2352-2449.

[6] Andrew Ng. Nuts and bolts of building AI applications using deep learning. NIPS, 2016.

[7] Guillaume Berger, Roland Memisevic. Incorporating long-range consistency in CNN-based texture generation. arXiv preprint arXiv:1606.01286, 2016.

[8] Karen Simonyan, Andrew Zisserman. Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556, 2014.

[9] Ciyou Zhu, Richard H. Byrd, Peihuang Lu, et al. Algorithm 778: L-BFGS-B: Fortran subroutines for large-scale bound-constrained optimization. ACM Transactions on Mathematical Software (TOMS), 1997, 23(4): 550-560.

[10] Harish Narayanan blog: https://harishnarayanan.org/writing/artistic-style-transfer/.

[11] TF tutorial: https://medium.com/tensorflow/neural-style-transfer-creating-art-with-deep-learning-using-tf-keras-and-eager-execution-7d541ac31398.

[12] Neural style transfer implementation with TensorFlow graph mode: https://github.com/Kautenja/a-neural-algorithm-of-artistic-style.

[13] Fast style transfer with PyTorch: https://github.com/jcjohnson/fast-neural-style.

About

Final Project for GR5242 Advanced Machine Learning. Visit https://style-transfer-257412.appspot.com to see our live demo.
