The Fifth Elephant 2015

A conference on data, machine learning, and distributed and parallel computing

Anand Chandrasekaran

@madstreetden

Keeping Moore's law alive: Neuromorphic computing

Submitted Jun 15, 2015

This talk explores the implications of Neuromorphic Engineering, or ‘building brains in silicon’, for the development of massively parallel compute techniques such as deep learning.

Outline

Moore’s law is a term coined by Carver Mead, a Caltech professor who is also the father of Neuromorphic Engineering. It refers to the observation, now more hope than reality, that advances in technology will allow a doubling of compute capability in silicon every 18 months. Recent advances in the use of highly parallel compute methods, loosely based on the neural systems in our brains, are changing how computation is done. These techniques, collectively termed deep learning networks, burst onto the world for one reason: the ability to perform lots of parallel computations on graphics cards. However, it is in truly custom hardware, such as that pioneered by the Neuromorphic community, that we will find the salvation of Moore’s law. When we blend powerful compute techniques with custom silicon architectures, we can keep alive the hope of continuing to double the world’s compute capability.
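As a rough illustration of the parallelism claim (mine, not from the talk): the core of a deep learning layer is one large matrix multiplication, in which every output value is an independent dot product, which is exactly the workload a GPU’s thousands of cores can compute simultaneously. A minimal NumPy sketch, with layer sizes chosen arbitrarily:

    import numpy as np

    # A fully connected layer is essentially one big matrix multiply.
    # Sizes here are arbitrary, chosen only for illustration.
    batch, n_in, n_out = 64, 1024, 4096

    x = np.random.randn(batch, n_in)   # input activations
    W = np.random.randn(n_in, n_out)   # layer weights
    b = np.zeros(n_out)                # biases

    # Forward pass: each of the batch * n_out outputs is an independent
    # dot product, so hardware can compute them all in parallel.
    y = np.maximum(x @ W + b, 0.0)     # ReLU(xW + b)

    print(y.shape)  # (64, 4096)

On a CPU these dot products run largely one after another; on a GPU, and even more so on custom silicon, they run at once.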

If you work in deep learning or have heard how GPUs have revolutionized high-performance computing, this talk will take you to the extreme bleeding edge of that world.

Requirements

None; I will keep transistor physics out of this.

Speaker bio

The speaker was one of the creators of Neurogrid, a system built at Stanford that was, until recently, the largest Neuromorphic system in the world. He is also the CTO and founder of Mad Street Den, a computer vision and AI startup based in Chennai.

