Anthill Inside 2018
On the current state of academic research, practice and development regarding Deep Learning and Artificial Intelligence.
Wed, 25 Jul 2018, 08:45 AM – 05:25 PM IST
Ashwin
Attention Mechanisms have been popular for the past couple of years, yielding new insights in image and NLP applications built on Recurrent Neural Networks. In this panel we would like to discuss advances in Attention Mechanisms, with specific emphasis on two recent innovations: Compositional Attention Networks (https://arxiv.org/abs/1803.03067) and Hierarchical Recurrent Attention Networks (https://arxiv.org/abs/1701.07149). Both papers address a critical gap in Deep Neural Net architectures: reasoning from first principles. We will discuss the applicability of Attention Mechanisms beyond these papers to concept-based reasoning.
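For panel context, the core idea behind attention can be sketched in a few lines of NumPy: score each key against a query, normalise the scores with a softmax, and return the weights-weighted sum of the values. This is an illustrative sketch of generic scaled dot-product attention, not code from either paper:

```python
import numpy as np

def attention(query, keys, values):
    """Scaled dot-product attention: weight each value by how well
    its key matches the query. Illustrative sketch only."""
    scores = keys @ query / np.sqrt(query.shape[-1])  # similarity of query to each key
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                          # softmax -> attention weights
    return weights @ values                           # weighted sum of values

# Toy example: three key/value pairs; the query matches the 2nd and 3rd keys best,
# so the output is pulled toward their values (20 and 30).
keys = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = np.array([[10.0], [20.0], [30.0]])
query = np.array([0.0, 1.0])
out = attention(query, keys, values)
```

The compositional and hierarchical variants discussed in the panel build richer structure (reasoning cells, document-level recurrence) on top of this basic weighting step.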
Ashwin is a Data Scientist with interests in NLP, AI and machine reasoning.