Rootconf Delhi edition

On network engineering, infrastructure automation and DevOps

Rootconf is a platform to discuss real-world problems around Site Reliability Engineering (SRE), DevOps for data engineering platforms, evaluating and adopting technologies such as Kubernetes and containers, and DevSecOps.

Rootconf Delhi edition will be held on 18 January 2020 at the India International Centre (IIC).

Speakers from Flipkart, Hotstar, MindTickle and Red Hat will discuss the following topics:

  1. Scaling and engineering challenges from Hotstar’s and Flipkart’s experiences.
  2. Data store choices.
  3. Kubernetes and K8s – when to choose what and why?
  4. DevSecOps.

Who should attend Rootconf:

  1. Operations engineers
  2. DevOps programmers
  3. Software developers
  4. SRE
  5. Tech leads

To know more about Rootconf, check the following resources:



View the Sponsorship Deck for details.
Email us for bulk ticket purchases and for sponsoring the above Rootconf series.

Rootconf Delhi sponsors:

Silver Sponsor


Bronze Sponsors

UpCloud, Sumo Logic

Community Partner

IFF Null Delhi

For information about the event, tickets (bulk discounts automatically apply on 5+ and 10+ tickets) and speaking, call Rootconf on 7676332020 or write to

Hosted by

Rootconf is a forum for discussions about DevOps, infrastructure management, IT operations, systems engineering, SRE and security (from an infrastructure defence perspective).

Nikunj Jain


Real time Machine Learning Inference Platform @ Zomato

Submitted Nov 27, 2019

The main problem we were facing at Zomato was that it took 1-2 months to take an ML model live. Data scientists and ML engineers work on a variety of problems at Zomato, such as predicting kitchen preparation time (the time a restaurant takes to prepare the food, given the live order state of the kitchen), predicting rider assignment time (the time to assign a free rider to pick up the order, given the real-time availability of riders), personalised ranking of restaurants for a user, and so on. I will go into detail about the platform we built to cater to these use cases, which made it easy for anyone to take a model live in less than a week.

Key takeaways:

  • Learn about the main components of a real-time ML inference platform
  • Build a production-ready real-time ML pipeline for low response time and high reliability at scale
  • Build a platform where data scientists and engineers across the company can build and deploy ML models at scale using standardised workflows and deployment

Target Audience:

Software Engineers, ML Engineers and DevOps Engineers


Outline:

  • Challenges and problems at Zomato
  • Requirements of the ML platform
  • Overall Architecture of the ML Platform
  • Case study: predicting kitchen preparation time
  • Real-time feature computation pipeline – why did we choose Flink?
  • Platform for data scientists to develop and log their models independently – why did we choose MLflow?
  • Platform for model deployment – why AWS SageMaker?
  • Real-time feature store backed by Redis
  • Non-real-time feature store backed by Cassandra
  • ML gateway to fetch features from the feature stores and call SageMaker for inference
  • Workflow to deploy a new model
  • Future work
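
The outline above centres on an ML gateway that merges features from the two stores and forwards them to the model endpoint. Below is a minimal sketch of that flow, assuming a simple key-value interface; the class names, payload shape and stub stores are illustrative, not the talk's actual code. A real deployment would use a Redis client, a Cassandra client and boto3's `sagemaker-runtime` `invoke_endpoint` call in place of the stubs:

```python
import json

class FeatureStore:
    """Stub for a key-value feature store (Redis / Cassandra in the talk)."""
    def __init__(self, features):
        self._features = features

    def get(self, entity_id):
        return self._features.get(entity_id, {})

class MLGateway:
    """Merges batch and real-time features, then calls the model endpoint."""
    def __init__(self, realtime_store, batch_store, invoke_endpoint):
        self.realtime_store = realtime_store      # e.g. backed by Redis
        self.batch_store = batch_store            # e.g. backed by Cassandra
        self.invoke_endpoint = invoke_endpoint    # e.g. SageMaker runtime call

    def predict(self, entity_id):
        # Real-time features override batch features on overlapping keys.
        features = {**self.batch_store.get(entity_id),
                    **self.realtime_store.get(entity_id)}
        return self.invoke_endpoint(json.dumps(features))

# Illustrative usage: kitchen-preparation-time prediction for one restaurant.
realtime = FeatureStore({"rest_42": {"open_orders": 7}})
batch = FeatureStore({"rest_42": {"avg_prep_minutes": 18.5}})
fake_model = lambda payload: {"prep_minutes": 22.0}   # stand-in for the endpoint
gateway = MLGateway(realtime, batch, fake_model)
print(gateway.predict("rest_42"))   # prints {'prep_minutes': 22.0}
```

Merging with the real-time store last means fresh signals win over stale batch values when the two stores share a key.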

Speaker bio

I have been working with data scientists and ML engineers at Zomato for more than 3 years, solving various user-facing problems such as personalised ranking, predicting kitchen preparation time and predicting rider assignment time using machine learning. Being a software engineer at heart, I understand the problems faced in taking any complex real-time machine learning model live in production at scale. I have deployed all the above models, serving 100k rpm at peak time.



