Let's dope it: Interoperable ML via Deep Learning
One of the biggest hurdles to reducing the time-to-market of an ML product is the two-language problem. Generally speaking, the tech stacks of the producers of ML models and their consumers are different. For instance, a data scientist may work in Python, but a production engineer may want the model in a JVM language. There are multiple approaches to solving this problem. Languages like Julia offer the expressiveness of high-level languages with the performance of low-level languages like C. Distributed computing frameworks like Spark provide a rich set of APIs in multiple languages. There have also been attempts to express models in a markup language (PMML). But one thing that is certain is that the entropy of the developer world is always on the rise. In such a world, how can ML be made interoperable, while still letting developers bring their own knowledge without having to re-learn new unified stacks?
In the last few years, Deep Learning has made great strides for a variety of reasons: the availability of cheap data, advances in computing power and technology, and progress in the science itself. It is notable that industry leaders in this area have made their deep learning frameworks accessible by open sourcing them. There are also standards such as ONNX and NNEF which can make deep learning interoperable. That is, a deep learning model created in MXNet on a laptop can be scored using, say, TensorFlow Serving on a TPU in the cloud. But what about mainstream, non-deep-learning ML algorithms such as Generalized Linear Models, Decision Trees, and Collaborative Filtering algorithms? Can we somehow exploit deep learning frameworks to make them interoperable as well?
We answer this question affirmatively. Our approach is based on the viewpoint that 1) Deep Learning has a Lego-block architecture and is object oriented in spirit, and 2) many classical algorithms can be composed from those Lego blocks. This means that an algorithm's intent can be specified via standard ML modeling libraries like scikit-learn, and that intent fulfilled via a backend deep learning model, without any radical change in the developer experience.
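The composition claim can be illustrated with a minimal sketch (the dataset and names here are illustrative, not from the talk): logistic regression is exactly a single dense layer followed by a sigmoid activation, so its "deep learning view" reproduces scikit-learn's predictions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Train a classical model: logistic regression.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
clf = LogisticRegression().fit(X, y)

# The same model, viewed as a one-layer network:
# a dense layer (weights = coef_, bias = intercept_) followed by a sigmoid.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

dense_out = X @ clf.coef_.T + clf.intercept_   # dense layer
probs = sigmoid(dense_out).ravel()             # sigmoid activation

# The "deep learning" view matches scikit-learn's predicted probabilities.
assert np.allclose(probs, clf.predict_proba(X)[:, 1])
```

The same pattern extends to other classical algorithms: linear regression is a dense layer with identity activation, and more elaborate models stack more blocks.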
We will demo how algorithms in Python's scikit-learn, trained on a laptop, can be scored in a browser, all by adding just one line of code. The tool is called IMLY: Interoperable Machine Learning: Yay!
- interoperability in ML: challenges and current solutions
- emergence of deep learning technologies and standardization efforts
- deep learning: a Lego-block architecture for composing algorithms/models
- a tour of equivalences between classical/popular ML algorithms and their deep learning counterparts
- demo of an open source project, IMLY: Interoperable Machine Learning: Yay!
Soma S Dhavala is a freelance consultant operating at the interface of Statistics, Machine Learning, Computing, and the Internet of Things. His interests are in Graphs, Meta Machine Learning, and their application in representing and reasoning with information. For the last few years, he has been working on ML-as-Infrastructure.
Soma currently consults for Framewirk and has been a member of the Design Council at EkStep since 2014, where he leads the efforts in designing Machine Learning infrastructure and developing various data products concerned with learning, such as recommendation engines and auto-tagging of content. He designed and ran a Deep Learning and Natural Language Processing immersive bootcamp for NIIT. He is also working on Deep Variational Inference with his collaborators in academia, and is a co-founder of VitalTicks Pvt Ltd, a start-up in the digital health care space.
In the past, Soma worked with Dow AgroSciences on ML applications in Systems Biology, and with the General Electric Global Research Center in the areas of Information Theory and Parallel Computing.
Soma obtained his Ph.D. in Statistics (TAMU, 2010), where he worked on applying Bayesian Nonparametrics to systems biology; he also holds a Master's in EE (IIT-M, 2000) and a B.E. in ECE (SRKREC-Andhra University, 1997). He worked as a post-doc for close to a year in the area of Dynamical Systems at TAMU. He has over 10 publications and multiple patents.