Anthill Inside 2017
On theory and concepts in Machine Learning, Deep Learning and Artificial Intelligence. Formerly Deep Learning Conf.
Sat, 29 Jul 2017, 09:00 AM – 05:40 PM IST
Anuj Gupta
Once in a while comes a (crazy!) idea that can change the very fundamentals of an area. In this talk we will look at one such idea that can change how neural networks are trained.
Today, the backpropagation algorithm is at the heart of training any neural network. However, it has a key drawback: it forces the layers of the network to be trained in a strictly sequential manner. In this talk we will see a very powerful technique, synthetic gradients, that breaks free from this severe limitation.
To facilitate better understanding, I will share a GitHub repo as a takeaway so that the audience can go back, download the code and play with it.
Code associated with this talk: https://github.com/anujgupta82/Synthetic_Gradients
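For readers who want a feel for the idea before opening the repo, here is a minimal sketch of the synthetic-gradient scheme (not the repo's code): each layer owns a small auxiliary model that predicts the gradient of the loss with respect to that layer's output, so the layer can update immediately instead of waiting for the true backward pass. The auxiliary model is then trained to match the true gradient once it arrives. The PyTorch setup, layer sizes, learning rates and toy data below are illustrative assumptions.

import torch
import torch.nn as nn

torch.manual_seed(0)

# layer1 plus a small synthetic-gradient model that predicts dLoss/d(hidden)
layer1 = nn.Sequential(nn.Linear(10, 32), nn.ReLU())
sg_model = nn.Linear(32, 32)   # predicts the gradient w.r.t. layer1's output
layer2 = nn.Linear(32, 1)      # downstream layer producing the final output

opt1 = torch.optim.SGD(layer1.parameters(), lr=0.01)
opt_sg = torch.optim.SGD(sg_model.parameters(), lr=0.01)
opt2 = torch.optim.SGD(layer2.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

x = torch.randn(64, 10)        # toy batch
y = torch.randn(64, 1)

for step in range(200):
    # forward through layer1, then update it immediately using the *predicted* gradient
    h = layer1(x)
    synthetic_grad = sg_model(h.detach())
    opt1.zero_grad()
    h.backward(synthetic_grad.detach())   # no need to wait for layer2's backward pass
    opt1.step()

    # downstream layer computes the true loss and the true gradient w.r.t. h
    h_detached = h.detach().requires_grad_()
    out = layer2(h_detached)
    loss = loss_fn(out, y)
    opt2.zero_grad()
    loss.backward()
    opt2.step()
    true_grad = h_detached.grad.detach()

    # train the synthetic-gradient model to match the true gradient
    sg_loss = loss_fn(sg_model(h.detach()), true_grad)
    opt_sg.zero_grad()
    sg_loss.backward()
    opt_sg.step()

In a real decoupled setup the two halves of the network can run on different devices or at different rates, since layer1 never blocks on layer2's backward pass.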
Prerequisite: a basic understanding of the backpropagation algorithm.
Anuj Gupta is a senior ML researcher at Freshdesk, working in the areas of NLP, machine learning and deep learning. Earlier, he headed ML efforts at Airwoot (now acquired by Freshdesk). He dropped out of a PhD in ML to work with startups. He graduated from IIIT Hyderabad with a specialization in theoretical computer science.
He has given tech talks at prestigious forums such as PyData DC, Fifth Elephant, ICDCN, PODC, IIT Delhi, IIIT Hyderabad and special interest groups like DLBLR. More about him: https://www.linkedin.com/in/anuj-gupta-15585792/
Slides: https://docs.google.com/presentation/d/10qQeuHkQ9ZkzEXD7IQS9MCpnnFEH-oMBgvq8S5bHQdM/