Deep Type - deep convolutional neural networks for style transfer in typography
Submitted by Irfan Basha Sheik (@sheikirfanbasha) on Sunday, 30 April 2017
At Imaginea, we run a social network for typoholics called Fontli as our designers have a passion for the field. Folks share typography that they catch in the wild or work that they’ve created themselves. Members ask others for font identification and tips, and tag what they’re able to identify themselves.
Given that we’re into typography, we would love to have a system that can take a picture of some type and apply its style to text of our own choice! We know that Deep Convolutional Neural Networks (DCNNs) have recently been achieving great results in image transformation tasks, most notably in artistic style transfer. Since DCNNs are capable of capturing the style of one image and transferring it onto another, we wanted to explore them and use them to build a new system for transferring the style of typography. We call this system Deep Type.
In this talk, we’ll start with how DCNNs work and how they are used in style transfer algorithms. We’ll dive into the details of how these algorithms work in practice, focusing especially on Johnson’s fast neural style implementation. We’ll discuss how DCNNs with no training and DCNNs with pre-training perform in the context of style transfer, along with their advantages and disadvantages. We’ll showcase the results of our experiments on typography style transfer.
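To give a flavour of what these algorithms optimise, here is a minimal sketch of the Gram-matrix "style" representation that neural style transfer methods (both the slow optimisation-based approach and Johnson's fast feed-forward variant) build on. The function names and shapes are illustrative assumptions, not the actual Deep Type implementation; in practice the feature maps would come from a pretrained DCNN such as VGG.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height, width) feature map.

    The Gram matrix captures correlations between feature channels,
    which style transfer algorithms treat as the image's "style",
    independent of where in the image those features occur.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)   # flatten the spatial dimensions
    return f @ f.T / (c * h * w)     # normalised channel correlations

def style_loss(gram_a, gram_b):
    """Mean squared difference between two Gram matrices."""
    return float(np.mean((gram_a - gram_b) ** 2))
```

During training (or optimisation), the style loss is summed over feature maps from several network layers, and is minimised alongside a content loss that compares raw feature activations.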
- Brief Introduction to DNNs
- Problem statement of Deep Type
- Algorithm used for style transfer
- Tweaks and techniques for style transfer
- Style transfer with slow neural style vs. fast neural style
- Results and conclusions for typography content images
Basics of ML and Deep Learning
Researcher at Imaginea Labs. Passionate about learning and developing applications related to artificial intelligence (computer vision, machine learning), simulators, and games.