About the 2019 edition:
The schedule for the 2019 edition is published here: https://hasgeek.com/anthillinside/2019/schedule
The conference has three tracks:
- Talks in the main conference hall track
- Poster sessions featuring novel ideas and projects in the poster session track
- Birds of a Feather (BoF) sessions for practitioners who want to use the Anthill Inside forum to discuss:
  - Myths and realities of labelling datasets for Deep Learning.
  - Practical experience with using Knowledge Graphs for different use cases.
  - Interpretability and its application in different contexts; challenges with GDPR and interpreting datasets.
  - Pros and cons of using custom and open source tooling for AI/DL/ML.
Who should attend Anthill Inside:
Anthill Inside is a platform for:
- Data scientists
- AI, DL and ML engineers
- Cloud providers
- Companies which make tooling for AI, ML and Deep Learning
- Companies working with NLP and Computer Vision who want to share their work and learnings with the community
For inquiries about tickets and sponsorships, call Anthill Inside on 7676332020 or write to firstname.lastname@example.org
Sponsorship slots for Anthill Inside 2019 are open.
Efficient Machine Translation for low resource languages using Transformers
The Transformer is the first transduction model relying entirely on self-attention to compute representations of its input and output, without using sequence-aligned RNNs or convolution. Transformers were recently used by OpenAI in their language models, and by DeepMind in AlphaStar, their program that defeated a top professional StarCraft player.
The goal of this session: build a translation mechanism for language pairs whose parallel sentence-pair corpora are scarce, and still obtain relatively high BLEU scores.
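As a rough, illustrative sketch of the self-attention operation the abstract refers to (not the speaker's actual code; the names and toy dimensions here are assumptions), scaled dot-product self-attention can be written in a few lines of NumPy:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model); Wq, Wk, Wv: (d_model, d_k) learned projections."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Every position attends to every position directly:
    # no recurrence and no convolution, as the abstract notes.
    scores = Q @ K.T / np.sqrt(d_k)      # (seq_len, seq_len)
    weights = softmax(scores, axis=-1)   # one attention distribution per position
    return weights @ V                   # (seq_len, d_k)

# Toy usage with made-up sizes: 5 tokens, d_model=16, d_k=8.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 16))
Wq, Wk, Wv = (rng.normal(size=(16, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (5, 8)
```

Multi-head attention, covered in the outline below, runs several such attention heads in parallel with separate projections and concatenates their outputs.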
- Transformer Model Architecture
a. Encoder [Theory + Code]
b. Decoder [Theory + Code]
- Self-Attention [Theory + Code]
- Multi-head Attention [Theory + Code]
- Positional Encoding [Theory + Code] (a short sketch follows this outline)
- Note on BLEU score (a toy example follows this outline)
- Solving a Real World Translation Problem with low resource data
- Attention Visualization
- Translation Results
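For the positional-encoding item above, here is a minimal sketch of the sinusoidal encoding from the original Transformer paper, assuming plain NumPy (the function name and sizes are illustrative):

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal encoding: sin on even dimensions, cos on odd (d_model even)."""
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]       # (1, d_model/2)
    angles = pos / np.power(10000.0, 2 * i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

print(positional_encoding(50, 512).shape)  # (50, 512)
```

And for the BLEU item, a toy sentence-level example using NLTK (an assumption; the session may use a different implementation such as sacrebleu):

```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = [["the", "cat", "sat", "on", "the", "mat"]]   # list of tokenized references
hypothesis = ["the", "cat", "is", "on", "the", "mat"]     # tokenized system output
# Smoothing avoids a zero score when some higher-order n-grams don't match.
score = sentence_bleu(reference, hypothesis,
                      smoothing_function=SmoothingFunction().method1)
print(f"BLEU: {score:.3f}")
```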
Prerequisites: basic familiarity with neural networks and linear algebra.
I have 3+ years of industry experience in Data Science. I currently work as a data scientist (NLP) at niki.ai, where I have built models for Parse Classification, Unsupervised Synonym Detection, Identifying Code Mixing in text, etc. I have also participated in numerous data science competitions on Kaggle, AnalyticsVidhya, Topcoder, Crowdanalytix, etc., and finished in the top 10 in at least a dozen of them.
Specialties: data science, machine learning, predictive modelling, natural language processing, deep learning, big data, artificial intelligence.