Anthill Inside 2018

On the current state of academic research, practice and development regarding Deep Learning and Artificial Intelligence.

Ashwin

@srisriashwin

Attention Mechanisms and Machine Reasoning

Submitted May 1, 2018

Attention Mechanisms have been popular for the past couple of years, yielding new insights in image and NLP applications that use Recurrent Neural Networks. In this panel we would like to discuss advances in Attention Mechanisms, with specific emphasis on two recent innovations: Compositional Attention Networks (https://arxiv.org/abs/1803.03067) and Hierarchical Recurrent Attention Networks (https://arxiv.org/abs/1701.07149). Both papers address a critical gap in deep neural network architectures: reasoning from first principles. We will discuss the applicability of Attention Mechanisms beyond these papers to concept-based reasoning.

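For readers unfamiliar with the underlying mechanics, here is a minimal sketch (not taken from either paper) of a soft, Bahdanau-style additive attention step over RNN hidden states, written in plain NumPy. The function name soft_attention, the weight matrices, and all shapes are illustrative assumptions, not the talk's or the papers' implementation.

# Illustrative sketch only: soft attention over RNN hidden states.
# Names, shapes, and weights below are assumptions made for this example.
import numpy as np

def soft_attention(query, hidden_states, W_q, W_h, v):
    """Score each hidden state against a query vector and return a
    weighted summary (the 'context' vector) plus the attention weights."""
    # hidden_states: (T, d_h), query: (d_q,)
    scores = np.tanh(hidden_states @ W_h.T + query @ W_q.T) @ v  # (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                     # softmax over time steps
    context = weights @ hidden_states                            # (d_h,)
    return context, weights

# Toy usage: 5 time steps of 8-dim RNN states attended by a 6-dim query.
rng = np.random.default_rng(0)
T, d_h, d_q, d_a = 5, 8, 6, 4
h = rng.normal(size=(T, d_h))
q = rng.normal(size=(d_q,))
W_h = rng.normal(size=(d_a, d_h))
W_q = rng.normal(size=(d_a, d_q))
v = rng.normal(size=(d_a,))
ctx, w = soft_attention(q, h, W_q, W_h, v)
print(w.round(3), ctx.shape)

The architectures named above compose variants of this score-softmax-weighted-sum step, layering or structuring it so that several such steps can be chained into multi-step reasoning.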

Speaker bio

Ashwin is a Data Scientist with interests in NLP, AI and machine reasoning.


Hosted by

Anthill Inside is a forum for conversations about risk mitigation and governance in Artificial Intelligence and Deep Learning. It brings together AI developers, researchers, startup founders, ethicists, and AI enthusiasts.