
Anthill Inside 2019

A conference on AI and Deep Learning

Make a submission

Accepting submissions till 01 Nov 2019, 04:20 PM

Taj M G Road, Bangalore


#About the 2019 edition:

The schedule for the 2019 edition is published here: https://hasgeek.com/anthillinside/2019/schedule

The conference has three tracks:

  1. Talks in the main conference hall
  2. Poster sessions featuring novel ideas and projects
  3. Birds of a Feather (BOF) sessions for practitioners who want to use the Anthill Inside forum to discuss:
  • Myths and realities of labelling datasets for Deep Learning.
  • Practical experience with using Knowledge Graphs for different use cases.
  • Interpretability and its application in different contexts; challenges with GDPR and interpreting datasets.
  • Pros and cons of using custom and open source tooling for AI/DL/ML.

#Who should attend Anthill Inside:

Anthill Inside is a platform for:

  1. Data scientists
  2. AI, DL and ML engineers
  3. Cloud providers
  4. Companies that make tooling for AI, ML and Deep Learning
  5. Companies working with NLP and Computer Vision that want to share their work and learnings with the community

For inquiries about tickets and sponsorships, call Anthill Inside on 7676332020 or write to sales@hasgeek.com


#Sponsors:

Sponsorship slots for Anthill Inside 2019 are open. Click here to view the sponsorship deck.


Anthill Inside 2019 sponsors:


#Bronze Sponsor

iMerit
Impetus

#Community Sponsor

GO-JEK
iPropal
LightSpeed
Semantics3
Google
Tact.AI
Amex

Hosted by

Anthill Inside is a forum for conversations about risk mitigation and governance in Artificial Intelligence and Deep Learning. AI developers, researchers, startup founders, ethicists, and AI enthusiasts are encouraged to: …

simrat

@sims

Yes! Attention is all you need for NLP

Submitted Apr 14, 2019

Natural language processing is a very tough problem to crack. We humans each have our own style of speaking, and though we often mean the same thing, we say it differently. This makes it very difficult for a machine to understand and process language at a human level.

At VMware we believe in delivering the best to our customers - the best of products, the best of services, the best of everything. To deliver the best in support, we process huge volumes of support tickets to structure free text and provide intelligent solutions. As part of one project, we were required to process a set of support tickets, identify key topics/categories, and map them to a very different document set. Even though I won't be able to go into the details of the algorithm we have built, I would like to help build an intuition for how best to go about solving such problems.

For instance, consider the problem of identifying topics/categories in your document set. The first and most obvious approach is topic modelling. Yes, we can do topic modelling, and we can also tune it in many ways, for example by using seed keywords for bootstrapping. This works well when the document groups are very different and the keywords are clear distinguishers, but what happens when you have a group of similar documents with keywords used in multiple different contexts? Clearly the topics are contextual, and we need to go beyond keyword-based modelling. In this talk we will understand how we can make machines understand context, take a sample problem, and break down the approach.
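
To make the keyword-based baseline concrete, here is a minimal sketch of plain topic modelling with LDA, assuming the gensim library; the ticket texts, topic count, and preprocessing are placeholders, not the pipeline described above.

```python
# Minimal keyword-based topic-modelling sketch (LDA via gensim).
# Illustrative only: tickets, topic count and preprocessing are placeholders.
from gensim import corpora, models
from gensim.utils import simple_preprocess

tickets = [
    "VM fails to boot after restoring from snapshot",
    "Virtual machine stuck in boot loop after snapshot restore",
    "License key rejected during product activation",
]

docs = [simple_preprocess(t) for t in tickets]        # tokenise and lowercase
dictionary = corpora.Dictionary(docs)                 # word <-> id mapping
corpus = [dictionary.doc2bow(d) for d in docs]        # bag-of-words vectors

lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10)
for topic_id, words in lda.print_topics(num_words=5):
    print(topic_id, words)
```

Because the model only sees bags of words, two tickets that use the same keywords in different contexts can end up in the same topic, which is exactly the limitation discussed above.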

P.S. The title of the talk is inspired by the Google paper “Attention Is All You Need”, which introduced Transformers; in the talk we will learn more about them and how they learn context efficiently.
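
For readers unfamiliar with the paper, the core operation is scaled dot-product self-attention: every token builds its representation as a weighted mix of all other tokens, which is how Transformers capture context. A minimal NumPy sketch, with toy shapes and random matrices standing in for learned parameters:

```python
# Scaled dot-product self-attention, the building block of the Transformer
# ("Attention Is All You Need"). Toy NumPy sketch; weights are random stand-ins.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model) token embeddings; Wq/Wk/Wv: projection matrices."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv                   # queries, keys, values
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # pairwise token affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the keys
    return weights @ V                                 # context-mixed token vectors

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 16))                           # 5 tokens, d_model = 16
Wq, Wk, Wv = (rng.normal(size=(16, 16)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)             # (5, 16)
```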

Outline

  • Brief evolution of NLP
  • Challenges in working with free text
  • Why we need to understand context
  • How we can understand context
    -- Overview of Transformers and Self-attention
  • Demonstration of context-based sequence-to-sequence modelling with the use cases below (see the summarization sketch after this outline)
    -- Document summarization
    -- Anomaly detection
  • Adaptation of the attention network - hierarchical attention networks
  • Key takeaways
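
As a rough illustration of what context-based sequence-to-sequence output looks like for the summarization use case, here is a sketch using a pretrained Transformer through the Hugging Face transformers pipeline API; the library, default model, and ticket text are stand-ins, not the system built at VMware or shown in the talk.

```python
# Illustrative only: summarization with a pretrained Transformer via the
# Hugging Face `transformers` pipeline API. The talk's own models and data
# are not public; this just shows context-aware sequence-to-sequence output.
from transformers import pipeline

summarizer = pipeline("summarization")  # downloads a default pretrained model

ticket = (
    "Customer reports that the virtual machine fails to boot after restoring "
    "from a snapshot. Logs show repeated kernel panics, and reverting to an "
    "older snapshot does not resolve the issue."
)

summary = summarizer(ticket, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```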

Speaker bio

Data scientist with 8 years of overall experience in software development, applied research and machine learning. Currently working at VMware as a Lead Data Scientist. Tech enthusiast and stationery hoarder :)

Slides

https://docs.google.com/presentation/d/1HFLuAYt2vde6neyyvzGU-G97jB5v4HtsufUdC8BUFS4/edit?usp=sharing

