Nick Morgan Neural Style Transfer

Generating Art with Convolutional Neural Networks


Update - March 2021

  • I have created a new repo, nst-zoo, which is much, much more thorough than this one.
  • This repo contains some spaghetti code, but also some fun images, and will probably stay up forever since it made it into the Arctic Code Vault.

neural-style-transfer

[Example output image]

Follow along in a GPU-enabled workbook! Link.

Neural style transfer uses the pre-trained VGG-19 image-classification network as a feature extractor, applying transfer learning to images.

[VGG-19 architecture diagram - Clifford K. Yang]
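As a rough illustration of how VGG-19 serves as a frozen feature extractor here, the sketch below loads torchvision's pre-trained VGG-19 and collects activations at intermediate layers. This is not code from the notebook: the layer-index mapping and helper names are assumptions based on torchvision's layer ordering.

```python
import torchvision.models as models

# Load only VGG-19's convolutional stack as a frozen feature extractor
# (older torchvision versions use pretrained=True instead of weights=...).
vgg = models.vgg19(weights="IMAGENET1K_V1").features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)  # the classifier is never fine-tuned

# Indices into the nn.Sequential that (by assumption) correspond to the
# layers named in this README.
STYLE_LAYERS = [0, 5, 10, 19, 28]   # conv1_1, conv2_1, conv3_1, conv4_1, conv5_1
CONTENT_LAYER = 21                  # conv4_2

def extract_features(x, wanted):
    """Run x through the VGG stack and keep activations at the wanted indices."""
    feats = {}
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in wanted:
            feats[i] = x
    return feats
```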

This repository explores two methods - one introduced by Leon A. Gatys, and another introduced by Justin Johnson.

Gatys' method iteratively optimizes a generated image (typically 150-200 iterations) against a cost function combining content and style. The content cost compares the conv4_2 activations of the generated image and the content image. The style cost compares the activations of [conv1_1, conv2_1, conv3_1, conv4_1, conv5_1] between the generated image and the style image.
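A minimal sketch of those two cost terms and the optimization loop, reusing the hypothetical extract_features helper and layer indices from the previous sketch; the images, weights, and learning rate are illustrative placeholders, not values from the notebook.

```python
import torch
import torch.nn.functional as F

def content_cost(gen_feat, content_feat):
    # Squared-error distance between conv4_2 activations.
    return F.mse_loss(gen_feat, content_feat)

def gram_matrix(feat):
    # (batch, channels, H, W) -> (batch, channels, channels) channel correlations
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def style_cost(gen_feats, style_feats):
    # Sum of squared-error distances between Gram matrices at each style layer.
    return sum(F.mse_loss(gram_matrix(g), gram_matrix(s))
               for g, s in zip(gen_feats, style_feats))

content_image = torch.rand(1, 3, 400, 400)   # stand-ins for preprocessed images
style_image = torch.rand(1, 3, 400, 400)
content_weight, style_weight = 1.0, 1e3      # illustrative weights

with torch.no_grad():
    content_target = extract_features(content_image, {CONTENT_LAYER})[CONTENT_LAYER]
    style_feats = extract_features(style_image, set(STYLE_LAYERS))
    style_targets = [style_feats[i] for i in STYLE_LAYERS]

# Optimize the pixels of the generated image itself (150-200 steps per the README).
generated = content_image.clone().requires_grad_(True)
optimizer = torch.optim.Adam([generated], lr=0.02)
for step in range(200):
    optimizer.zero_grad()
    feats = extract_features(generated, set(STYLE_LAYERS) | {CONTENT_LAYER})
    loss = (content_weight * content_cost(feats[CONTENT_LAYER], content_target)
            + style_weight * style_cost([feats[i] for i in STYLE_LAYERS], style_targets))
    loss.backward()
    optimizer.step()
```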

Instead of iterative optimization, Johnson's method stylizes an image with a single forward pass through a separate image-transformation network. The pre-trained VGG network is still used to compute the loss, but the transformation network is trained ahead of time on 80,000 images from the Microsoft COCO dataset. A single forward pass through this network achieves a loss comparable to ~100 iterations of Gatys' method.
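The usage pattern looks roughly like the sketch below: a transformation network, trained offline on MS-COCO against the same VGG-based losses, produces a stylized image in one forward pass. TransformNet here is a toy stand-in, not the repository's or Johnson's actual architecture.

```python
import torch
import torch.nn as nn

class TransformNet(nn.Module):
    # Toy stand-in: the real model is a deeper encoder / residual-block /
    # decoder CNN, trained once per style on the MS-COCO images.
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=9, padding=4),
            nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=9, padding=4),
        )

    def forward(self, x):
        return self.body(x)

model = TransformNet().eval()                 # in practice, load trained weights here
content_batch = torch.rand(1, 3, 256, 256)    # stand-in for a preprocessed content image
with torch.no_grad():
    stylized = model(content_batch)           # one forward pass replaces the iterative loop
```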

The main notebook, Neural_Style_Transfer.ipynb, contains all relevant documentation for this repository.
