Anthill Inside 2018

On the current state of academic research, practice and development regarding Deep Learning and Artificial Intelligence.

Attention Mechanisms and Machine Reasoning

Submitted by Ashwin (@srisriashwin) on Tuesday, 1 May 2018

Technical level: Intermediate

Section: Full talk

Status: Confirmed & Scheduled

Abstract

Attention Mechanisms have been popular for the past couple of years, yielding new insights in image and NLP applications built on Recurrent Neural Networks. In this talk we discuss advances in Attention Mechanisms, with specific emphasis on two recent innovations: Compositional Attention Networks (https://arxiv.org/abs/1803.03067) and Hierarchical Recurrent Attention Networks (https://arxiv.org/abs/1701.07149). Both papers address a critical gap in Deep Neural Network architectures: reasoning from first principles. We will also discuss the applicability of Attention Mechanisms beyond these papers for concept-based reasoning.
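
For context, a minimal, self-contained NumPy sketch of generic scaled dot-product attention is included below: the basic building block that these architectures refine. It is illustrative only; the function names and toy tensors are assumptions for demonstration, not drawn from either paper.

    import numpy as np

    def softmax(x, axis=-1):
        # Numerically stable softmax along the given axis.
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def scaled_dot_product_attention(queries, keys, values):
        # queries: (n_q, d), keys: (n_k, d), values: (n_k, d_v)
        # Returns the attention-weighted values and the attention weights.
        d = queries.shape[-1]
        scores = queries @ keys.T / np.sqrt(d)   # similarity of each query to each key
        weights = softmax(scores, axis=-1)       # soft selection over the keys
        return weights @ values, weights

    # Toy usage: one query attending over three key/value pairs (values are hypothetical).
    q = np.array([[1.0, 0.0]])
    k = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    v = np.array([[10.0], [20.0], [30.0]])
    context, attn = scaled_dot_product_attention(q, k, v)
    print(attn)     # weights over the three positions
    print(context)  # weighted combination of the values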

Speaker bio

Ashwin is a Data Scientist with interests in NLP, AI and machine reasoning.
