Deep Learning is a new area of research that is getting us closer to achieving one of the primary objectives of Machine Learning – Artificial Intelligence.
It is widely used in the fields of Image Recognition, Natural Language Processing (NLP) and Video Classification.
Deep Learning Conf is a single-day conference followed by workshops on the second day. The conference will have full, crisp and lightning talks from morning to evening. The workshops on the next day will introduce participants to neural networks, followed by two tracks of three-hour workshops on NLP and Computer Vision / AI. Participants can join either one of the two workshop tracks.
We are looking for talks and workshops from academics and practitioners of Deep Learning on the following topics:
- Applications of Deep Learning in software.
- Applications of Deep Learning in hardware.
- Conceptual talks and cutting edge research on Deep Learning.
- Building businesses with Deep Learning at the core.
We are inviting proposals for:
- Full-length 40-minute talks.
- Crisp 15-minute talks.
- Lightning 5-minute talks.
Proposals will be filtered and shortlisted by an Editorial Panel. Along with your proposal, you must share the following details:
- Links to videos / slide decks when submitting proposals. This will help us understand your past speaking experience.
- Blog posts you may have written related to your proposal.
- Outline of your proposed talk – either in the form of a mind map or a text document or draft slides.
If your proposal involves speaking about a library / tool / software that you intend to open source in the future, the proposal will be considered only when the library / tool / software in question has been made open source.
We will notify you about the status of your proposal within two to three weeks of submission.
Selected speakers have to participate in one or two rounds of rehearsals before the conference. This is mandatory and helps you prepare for speaking at the conference.
There is only one speaker per session. Entry is free for selected speakers. As our budget is limited, we will prefer speakers from locations closer to home, but will do our best to cover anyone exceptional. HasGeek will provide a grant to cover part of your travel and accommodation in Bangalore. Grants are limited and made available to speakers delivering full sessions (40 minutes or longer).
HasGeek believes in open source as the binding force of our community. If you are describing a codebase for developers to work with, we’d like it to be available under a permissive open source licence. If your software is commercially licensed or available under a combination of commercial and restrictive open source licences (such as the various forms of the GPL), please consider picking up a sponsorship. We recognise that there are valid reasons for commercial licensing, but ask that you support us in return for giving you an audience. Your session will be marked on the schedule as a sponsored session.
- Proposal submission deadline: 31 May 2016
- Schedule announcement: 15 June 2016
- Conference dates: 1 July 2016
CMR Institute of Technology, Bangalore
For more information about speaking proposals, tickets and sponsorships, contact email@example.com or call +91-7676332020.
Activations, Objectives and Optimisers - Nuts & Bolts of a DeepNet
Building a good Deep Network is not an easy task. From the architecture of the network to its various parameters, each choice is crucial, as it has a huge bearing on the performance (both accuracy and efficiency) of the DeepNet. Among these, three important components any practitioner must choose are: the activation function, the loss/objective function and the optimiser.
Given the large number of options for each of these, the task doesn’t get any easier. In this talk we will take a deeper look at each of the choices available to us when selecting these components.
We will address the what, why, how, and pros & cons of each of these choices:
- Activation functions - softmax, softplus, softsign, relu, tanh, sigmoid, hard sigmoid, linear
- Loss/objective functions - mean squared error, mean absolute error, mean absolute percentage error, mean squared logarithmic error, squared hinge, hinge, binary crossentropy (log loss), categorical crossentropy (multiclass log loss), sparse categorical crossentropy, Poisson, cosine proximity
- Optimisers - SGD, RMSprop, Adagrad, Adadelta, Adam, Adamax
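To make the three components concrete, here is a minimal NumPy sketch (illustrative only, not material from the talk itself) showing one activation, one loss and one plain SGD update step side by side:

```python
import numpy as np

def relu(z):
    # activation: zero out negative inputs
    return np.maximum(0.0, z)

def softmax(z):
    # activation for multiclass outputs; shift by max for numerical stability
    e = np.exp(z - z.max())
    return e / e.sum()

def categorical_crossentropy(y_true, y_pred, eps=1e-12):
    # loss: multiclass log loss between a one-hot target and predicted probabilities
    return -np.sum(y_true * np.log(y_pred + eps))

def sgd_step(w, grad, lr=0.1):
    # optimiser: one vanilla SGD update, w <- w - lr * gradient
    return w - lr * grad

# tiny usage example with made-up numbers
probs = softmax(np.array([2.0, 1.0, 0.1]))
loss = categorical_crossentropy(np.array([1.0, 0.0, 0.0]), probs)
w = sgd_step(np.array([0.5, -0.3]), np.array([0.1, -0.2]))
```

Swapping relu for tanh, the loss for hinge, or SGD for Adam changes only these three functions while the rest of the training loop stays the same, which is exactly why the choice of each component matters so much.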
This talk aims to bring out a better understanding of these Nuts & Bolts.
I currently work as a principal ML researcher at Airwoot (now acquired by Freshdesk), building intelligent applications using NLP + Deep Learning. Before joining industry I was a part of the Data Science group at IIT Delhi and a research scholar with the theory group at IIIT Hyderabad.
Prior to this I have given talks at IIIT Hyderabad, ICDCN, IIT Delhi and ISOC.
You can find more about me on