Taming Convolutional Neural Networks for Image Recognition
This talk is about CNNs, the poster boys of Deep Learning and among the most successful models in image recognition tasks. The talk will first cover the basics of ConvNets: why the world uses them and what makes them such great models for image recognition. On the surface not much has changed in convolutional neural networks, but over the last five years there have been considerable advances in architectures and training procedures. The talk will go into the gory details of these advances and what they mean for people training ConvNets. Then we will get into practical advice for training ConvNets: the do’s, the don’ts and the musts.
Pre-requisites - Since we will start from scratch with ConvNets, the only thing expected of the audience is an understanding of Back-Propagation and Stochastic Gradient Descent. Some linear algebra basics will also be helpful.
Key takeaways - A good understanding of ConvNets, how to use them effectively, and practical advice on how to train them, along with plenty of info on the latest advances in architectures and training methodologies.
Intended audience - Practitioners, researchers and any other inquisitive soul who wants to know about ConvNets.
Structure of talk:
Where it all started (1 min)
The big-bang of ConvNets (1 min)
Structure of ConvNets (5 min)
How far ConvNets have come since the big-bang in terms of advances (2 min)
Specifics of the latest architectures:
AlexNet - the big-bang (3 min)
ZFNet - how visualization helps (5 min)
A small tangent here, where we talk about visualization using deconvolution
VGGNet - a bigger, deeper version (2 min)
Practical advice on how to train a network, both from scratch as well as using transfer learning (10 min)
Saurabh has been working at MAD Street Den, Chennai as a Machine Learning Engineer for the past year and a half, specifically on Deep Learning based products. He loves to train Convolutional Neural Networks of all types and sizes for different applications. Apart from CNNs, he has a special interest in recurrent architectures and discovering their powers. When he is not working on DL-based stuff, he loves to play around with micro-controllers.