Anthill Inside 2017

On theory and concepts in Machine Learning, Deep Learning and Artificial Intelligence. Formerly Deep Learning Conf.

Ashish Mogha

@ashishmogha

Neural Machine Translation

Submitted Jun 20, 2017

We do not start our thinking from scratch every moment: our thoughts have persistence, with each new thought building on the ones before it. Traditional neural networks can't do this, and it seems like a major shortcoming, but recurrent neural networks address the issue by carrying state from one step to the next.
In the domain of NLP/Speech, RNNs transcribe speech to text, perform machine translation, generate handwritten text, and of course, they have been used as powerful language models (both on the level of characters and words).
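To make the idea of persistence concrete, here is a minimal NumPy sketch of a vanilla RNN cell (the sizes and random weights are illustrative assumptions, not taken from the talk's code): the hidden state h is fed back into every step, so earlier inputs can influence later outputs.

    import numpy as np

    # Toy dimensions, chosen only for illustration.
    input_size, hidden_size = 8, 16
    Wxh = np.random.randn(hidden_size, input_size) * 0.01   # input -> hidden
    Whh = np.random.randn(hidden_size, hidden_size) * 0.01  # hidden -> hidden: the "memory"
    bh = np.zeros(hidden_size)

    def rnn_step(x, h_prev):
        # The new state depends on the current input AND the previous state.
        return np.tanh(Wxh @ x + Whh @ h_prev + bh)

    h = np.zeros(hidden_size)                  # start with an empty memory
    for x in np.random.randn(5, input_size):   # a sequence of 5 inputs
        h = rnn_step(x, h)                     # h carries information forward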

Machine translation is a sub-field of computational linguistics that investigates the use of software to translate text or speech from one language to another. It turns out that over the past two years, deep learning has totally rewritten our approach to it.
Deep learning researchers who know almost nothing about language translation are throwing together relatively simple machine learning solutions that are beating the best expert-built language translation systems in the world.

This talk will benefit those who are interested in advanced applications of deep neural networks and are looking forward to seeing the latest state-of-the-art models implemented. We're going to take a peek into the realm of neural machine translation, and the code will be open-sourced and shared on GitHub.

Outline

Language is the backbone of our civilization. Throughout history we have used it to trade goods, and without written records of previous scientific discoveries we could never have accomplished feats like travelling to space.
We have accomplished all of this in a world where even the 13 most common languages are natively spoken by less than 50% of the population.
Imagine what the world would be like if this language barrier were removed.

In this talk I will present a model that translates text from one language to another; learning to build a model like this is both incredibly useful and great fun.

We would cover the following:

  • Introduction to Recurrent Neural Networks and LSTMs
  • Hyperparameters
  • Embeddings in neural networks, by implementing the Word2Vec model
  • Building a recurrent neural network for predicting sentiment
  • Deep dive into the Sequence2Sequence RNN architecture (see the sketch after this list)
  • Motivation: Machine Translation Advancement
  • Walkthrough of the code
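As a preview of the Sequence2Sequence deep dive, here is a minimal, untrained NumPy sketch of the encoder-decoder idea (the vocabulary sizes, dimensions, and special-token ids below are toy assumptions, not the open-sourced code). The encoder compresses the source sentence into a single state vector; the decoder unrolls from that vector, feeding its own predictions back in as inputs.

    import numpy as np

    # Toy vocabularies and dimensions, for illustration only.
    src_vocab, tgt_vocab, emb, hid = 50, 60, 16, 32
    rng = np.random.default_rng(0)

    E_src = rng.normal(0, 0.01, (src_vocab, emb))  # source word embeddings
    E_tgt = rng.normal(0, 0.01, (tgt_vocab, emb))  # target word embeddings
    enc_Wx = rng.normal(0, 0.01, (hid, emb))
    enc_Wh = rng.normal(0, 0.01, (hid, hid))
    dec_Wx = rng.normal(0, 0.01, (hid, emb))
    dec_Wh = rng.normal(0, 0.01, (hid, hid))
    W_out = rng.normal(0, 0.01, (tgt_vocab, hid))  # hidden state -> target vocab scores

    def step(Wx, Wh, x, h):
        return np.tanh(Wx @ x + Wh @ h)

    def translate(src_ids, bos=0, eos=1, max_len=10):
        # Encoder: fold the whole source sentence into one state vector.
        h = np.zeros(hid)
        for i in src_ids:
            h = step(enc_Wx, enc_Wh, E_src[i], h)
        # Decoder: unroll from that state, feeding back its own predictions.
        out, tok = [], bos
        for _ in range(max_len):
            h = step(dec_Wx, dec_Wh, E_tgt[tok], h)
            tok = int(np.argmax(W_out @ h))  # greedy pick of the next target word
            if tok == eos:
                break
            out.append(tok)
        return out

    print(translate([3, 7, 12]))  # gibberish until trained, but shows the data flow

Until trained with backpropagation the output is of course random; the point is only the shape of the data flow, which a real seq2seq translator with LSTMs and trained weights shares.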

Requirements

Basic understanding of neural networks, deep learning, backpropagation, RNNs, and natural language processing.

Speaker bio

Ashish Mogha is a second-year undergraduate pursuing a Bachelor of Engineering in Electronics and Communication at IPU, New Delhi. He is passionate about artificial intelligence, natural language processing, deep learning, and machine learning.
He has extensive experience with Theano and TensorFlow and actively contributes back to the open-source community.

Slides

https://docs.google.com/presentation/d/1rnQxB9F3NE9lG0BRk56JhW3M2bctgipynzphkZlCt80/edit?usp=sharing


Hosted by

Anthill Inside is a forum for conversations about risk mitigation and governance in Artificial Intelligence and Deep Learning, open to AI developers, researchers, startup founders, ethicists, and AI enthusiasts.