Deep Learning in the Browser: Explorable Explanations, Model Inference & Rapid Prototyping
Submitted by Amit Kapoor (@amitkaps) on Thursday, 28 June 2018
Technical level: Intermediate
This talk is designed in a “Show, don’t Tell” format. We will focus on showcasing three particular use cases (live demos) where deep learning models can be used for explanation, inference and training directly in the browser. The demos also span three different types of data: tabular, text and image.
Explorable Explanations: Explaining the DL model and letting users build intuition about it helps generate insight. We showcase an explorable explanation for a loan-default DL model, which allows the user to explore the feature space and threshold boundaries through interactive visualisations to drive decision making.
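An explorable explanation like this can be driven by a declarative Vega-Lite spec. As a minimal sketch (the field names `income`, `loan_amount` and `default_prob`, and the data file `loans.json`, are our hypothetical placeholders, not the demo's actual schema), a scatter plot with a user-adjustable decision threshold might look like:

```javascript
// Minimal Vega-Lite (v5) spec sketch: a slider-bound threshold parameter
// recolours points whose predicted default probability exceeds it.
// Field names and the data URL are hypothetical placeholders.
const spec = {
  $schema: "https://vega.github.io/schema/vega-lite/v5.json",
  data: { url: "loans.json" },
  params: [
    // Slider bound to the model's decision threshold.
    { name: "threshold", value: 0.5,
      bind: { input: "range", min: 0, max: 1, step: 0.01 } }
  ],
  mark: "point",
  encoding: {
    x: { field: "income", type: "quantitative" },
    y: { field: "loan_amount", type: "quantitative" },
    color: {
      // Points above the chosen threshold are flagged in red.
      condition: { test: "datum.default_prob > threshold", value: "firebrick" },
      value: "steelblue"
    }
  }
};
```

The spec would then be rendered with `vegaEmbed('#view', spec)`; moving the slider re-evaluates the condition without any imperative plotting code.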
Model Inference: Inference is the most common use case, and the browser allows you to ‘bring your DL model to the data’. It also allows you to test how the model works when executed on the edge. We showcase a comment-sentiment application in the browser, which can identify and warn about the toxicity of comments as you type in a text box.
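In-browser toxicity inference of this kind can be sketched with TensorFlow.js's pretrained toxicity classifier. The helper below is our own assumption of how the warning logic might look, not the talk's exact code; only the commented usage lines touch the real `@tensorflow-models/toxicity` API.

```javascript
// Pure helper: given prediction objects of the shape returned by the
// @tensorflow-models/toxicity classifier ({ label, results: [{ match }] }),
// collect the labels that matched for the typed text.
function toxicLabels(predictions) {
  return predictions
    .filter(p => p.results.some(r => r.match === true))
    .map(p => p.label);
}

// Hedged browser usage sketch (requires the @tensorflow-models/toxicity
// package; showWarning and textBox are hypothetical app pieces):
//
//   import * as toxicity from '@tensorflow-models/toxicity';
//   const model = await toxicity.load(0.9);              // 0.9 = match threshold
//   const predictions = await model.classify([textBox.value]);
//   const labels = toxicLabels(predictions);
//   if (labels.length) showWarning(`Possible ${labels.join(', ')}`);
```

Because the model runs client-side, the comment text never leaves the user's machine, which is exactly the ‘bring your DL model to the data’ point above.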
Rapid Prototyping: Training DL models is now possible in the browser itself, if done smartly. We showcase a rapid-prototyping image-classification example that lets the user play with transfer learning to build a model specific to user-generated image input.
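A common pattern for browser transfer learning (our assumption of the approach; the demo may differ) is to reuse a pretrained network as a fixed feature extractor and train only a tiny classifier on its embeddings. The nearest-centroid classifier below is plain JavaScript; in the browser the embeddings would come from e.g. TensorFlow.js MobileNet via `model.infer(imageElement, true)`.

```javascript
// Lightweight nearest-centroid classifier over embedding vectors.
// Each addExample() accumulates one embedding under a user-chosen label;
// predict() returns the label whose class centroid is closest (Euclidean).
class CentroidClassifier {
  constructor() {
    this.sums = new Map();   // label -> element-wise sum of embeddings
    this.counts = new Map(); // label -> number of examples seen
  }

  addExample(embedding, label) {
    const sum = this.sums.get(label) || new Array(embedding.length).fill(0);
    embedding.forEach((v, i) => { sum[i] += v; });
    this.sums.set(label, sum);
    this.counts.set(label, (this.counts.get(label) || 0) + 1);
  }

  predict(embedding) {
    let best = null, bestDist = Infinity;
    for (const [label, sum] of this.sums) {
      const n = this.counts.get(label);
      let d = 0;
      embedding.forEach((v, i) => {
        const diff = v - sum[i] / n; // distance to the class centroid
        d += diff * diff;
      });
      if (d < bestDist) { bestDist = d; best = label; }
    }
    return best;
  }
}
```

Because only the tiny classifier is "trained", adding a new class from a webcam snapshot is effectively instant, which is what makes this kind of rapid prototyping feasible client-side.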
We will end the talk with how we see the ecosystem of tools for DL in the browser emerging, making it easy for everyone to get started:
- TensorFlow.js for deep learning model training and inference.
- ml.js for traditional machine learning model training and inference.
- Vega and Vega-Lite for interactive dashboards.
- Arrow.js for data loading and type inference.
The working demos will be available on the web, with open-source code on GitHub. An initial draft of the slides is attached.