Deep Learning Conf 2016

A conference on deep learning.

Residual Learning and Stochastic Depth in Deep Neural Networks

Submitted by Pradyumna Reddy (@pradyu1993) on Friday, 6 May 2016

Technical level

Intermediate

Section

Crisp talk

Status

Confirmed & Scheduled


Total votes:  +12

Abstract

The talk will introduce deep residual learning and give an in-depth picture of how residual networks work. It will also cover the stochastic depth method, which makes it possible to train residual networks beyond 1200 layers.

Outline

Residual networks won first place in the latest ILSVRC image classification challenge.
They achieve state-of-the-art performance on image classification tasks, surpassing the earlier VGG network.
Stochastic depth is a remarkably simple idea that makes it possible to train residual networks even when the maximum depth is on the order of 1,000 layers.
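The core idea of residual learning can be sketched in a few lines of NumPy. This is a minimal illustration, not the talk's TensorFlow implementation: the two-layer block structure, weight shapes, and placement of the nonlinearities here are assumptions chosen for brevity.

```python
import numpy as np

# Instead of learning a mapping H(x) directly, a residual block learns the
# residual F(x) = H(x) - x and outputs F(x) + x via an identity shortcut.

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, w1, w2):
    """A two-layer residual block with an identity shortcut (hypothetical shapes)."""
    out = relu(x @ w1)    # first transformation + nonlinearity
    out = out @ w2        # second transformation, before the addition
    return relu(out + x)  # add the shortcut, then the final nonlinearity

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
w1 = rng.standard_normal((8, 8)) * 0.1
w2 = rng.standard_normal((8, 8)) * 0.1
y = residual_block(x, w1, w2)
print(y.shape)  # prints (4, 8)
```

Note that when both weight matrices are zero the block reduces to `relu(x)`: the shortcut lets the block fall back to (near-)identity, which is what makes very deep stacks trainable.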

The talk would cover the following:
1. An introduction to convolution layers, batch normalization, and ReLU, depending on the audience's comfort level with these concepts.
2. A brief introduction to the architectures of deep neural networks that previously won ILSVRC.
3. Deep residual learning and how to implement residual networks in TensorFlow.
4. Deep neural networks with stochastic depth.
5. If time permits, a discussion of similar architectures such as Recombinator Networks and summation-based networks.
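As a taste of the stochastic depth topic above, the mechanism can be sketched as follows. This is a minimal NumPy illustration under assumed shapes and a single dense residual branch; the per-layer survival-probability schedule (e.g. the linear decay used by Huang et al.) is omitted.

```python
import numpy as np

# Stochastic depth: during training, each residual block's branch is dropped
# entirely with some probability, shortening the network's expected depth.
# At test time every block is kept, with its branch scaled by the survival
# probability so that expectations match.

def relu(x):
    return np.maximum(x, 0.0)

def stochastic_depth_block(x, w, survival_prob, training, rng=None):
    """One residual block whose residual branch may be skipped (hypothetical helper)."""
    residual = relu(x @ w)
    if training:
        rng = rng or np.random.default_rng()
        if rng.random() < survival_prob:
            return x + residual  # block survives this forward pass
        return x                 # block skipped: identity shortcut only
    # Inference: keep the block, scale the branch by its survival probability.
    return x + survival_prob * residual
```

With `survival_prob=1.0` this is an ordinary residual block; with `survival_prob=0.0` it is pure identity, so gradients still flow through the shortcut even when the branch is dropped.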

Speaker bio

Pradyumna is a Statistical Analyst at @WalmartLabs. He completed his undergraduate degree in Computer Science at BITS Pilani Goa Campus, writing his undergraduate thesis under Prof. Yogesh Rathi, Director of Pediatric Image Computing at the Psychiatry Neuroimaging Laboratory, Harvard Medical School. He was also a member of the Board of Reviewers at the 23rd WSCG International Conference on Computer Graphics, Visualization and Computer Vision 2015.
