Deep learning with limited data
Submitted by Aman Neelapa (@aman42) on Wednesday, 5 July 2017
When working on a domain-specific problem, it is often impossible to find datasets large enough to train a well-sized model from scratch. However, models trained on one task capture relations in the data that can be reused for different problems in the same domain. Recent advances in transfer learning and few-shot learning demonstrate the ability of deep networks to assimilate new data without falling prey to catastrophic forgetting. Further, they can leverage this data to make accurate predictions after seeing only a few samples.
This talk is meant for those who face the ubiquitous problem of a shortage of labelled data in their problem domain. It aims to expose listeners to cutting-edge research in this field and to provide pointers and techniques for thinking about the problem.
Introduction to Transfer Learning
- Why is transfer learning useful?
- Common issues with transfer learning.
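The core idea behind transfer learning can be sketched in a few lines: freeze a feature extractor learned on a source task and train only a small head on the target task's scarce labels. The sketch below is purely illustrative — a random projection stands in for real pretrained weights, and all names and data are assumptions, not part of the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pretrained" feature extractor: frozen weights, standing in for a
# backbone trained on a large source-domain dataset.
W_frozen = rng.normal(size=(10, 4))

def extract_features(x):
    """Frozen backbone: W_frozen is never updated."""
    return np.tanh(x @ W_frozen)

# Tiny target-task dataset: the few-shot setting (8 labelled examples).
X = rng.normal(size=(8, 10))
y = (X[:, 0] > 0).astype(float)

# Train only the linear head with plain gradient descent.
w_head = np.zeros(4)
for _ in range(200):
    feats = extract_features(X)                # (8, 4) frozen features
    preds = 1 / (1 + np.exp(-feats @ w_head))  # sigmoid
    grad = feats.T @ (preds - y) / len(y)
    w_head -= 0.5 * grad                       # only the head is updated

acc = ((1 / (1 + np.exp(-extract_features(X) @ w_head)) > 0.5) == y).mean()
```

Because only the small head is trained, the handful of target labels is enough to fit it, while the frozen backbone supplies the representation learned elsewhere — the same division of labour used when fine-tuning large pretrained networks.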
Recent Approaches to Transfer Learning
- Learning to learn - how can we learn across tasks?
- Use of memory - how can external memory be leveraged for meta-learning?
- Smarter synapses - what can we incorporate from biological synapses to make artificial neurons more resilient to change?
At BicycleAI, I am an ML researcher working on task-oriented dialogue systems. In the past, I have worked on recommendation systems, information retrieval, short-text classification, document image translation and demand prediction problems. As a BITS Pilani and Stanford alum, I avidly follow recent work in deep learning and work on ways to translate bleeding-edge research into practical solutions. More in my LinkedIn bio.