Anthill Inside 2017
On theory and concepts in Machine Learning, Deep Learning and Artificial Intelligence. Formerly Deep Learning Conf.
Sat, 29 Jul 2017, 09:00 AM – 05:40 PM IST
SATYAM SAXENA
Recurrent neural networks (RNNs) offer a compelling tool for processing natural language input in a straightforward sequential manner. However, they suffer from limitations that prevent them from easily modeling even simple transduction tasks. In this talk, we will discuss new memory-based recurrent networks that implement continuously differentiable analogues of traditional data structures such as stacks, queues, and deques.
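To make the idea concrete, here is a minimal sketch of a continuously differentiable stack in the spirit of Grefenstette et al. (2015), "Learning to Transduce with Unbounded Memory". Instead of discrete push and pop, each operation carries a real-valued strength in [0, 1], so the structure admits gradients end to end. All names below are illustrative, not taken from the talk slides.

```python
class SoftStack:
    """Sketch of a continuously differentiable stack.

    Each stored value keeps a real-valued strength; pushing and popping
    adjust strengths continuously rather than adding/removing elements
    discretely, which is what makes the structure differentiable.
    """

    def __init__(self):
        self.values = []     # stored value vectors
        self.strengths = []  # how "present" each value still is

    def step(self, value, push, pop):
        # Pop: remove up to `pop` total strength, starting from the top.
        remaining = pop
        for i in reversed(range(len(self.strengths))):
            taken = min(self.strengths[i], remaining)
            self.strengths[i] -= taken
            remaining -= taken
        # Push: append the new value with strength `push`.
        self.values.append(value)
        self.strengths.append(push)
        # Read: a weighted blend of the top-most values, up to total weight 1.
        read = [0.0] * len(value)
        budget = 1.0
        for i in reversed(range(len(self.strengths))):
            w = min(self.strengths[i], budget)
            read = [r + w * v for r, v in zip(read, self.values[i])]
            budget -= w
        return read
```

A fully popped value is never deleted, only driven to strength zero; with push/pop strengths produced by a sigmoid-gated controller, the same mechanics yield the queue and deque variants mentioned in the abstract.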
Satyam Saxena is an ML researcher at Freshdesk. An IIT alumnus, his interests lie in NLP, machine learning, and deep learning. Prior to this, he was part of the ML group at Cisco. He was also a visiting researcher at Vision Labs, IIIT Hyderabad, where he used computer vision and deep learning to build applications that assist visually impaired people. He presented some of this work at ICAT 2014 in Turkey. https://www.linkedin.com/in/sam-iitj/
https://docs.google.com/presentation/d/17UOEgy7dq6rdJZ0c63dDo6Fj148d-iYk5ghvXDj3Sqk/edit?usp=sharing