Anthill Inside 2019

On infrastructure for AI and ML: from managing training data to data storage, cloud strategy and costs of developing ML models

AgentBuddy: Leveraging Bandit Algorithms for a human-in-loop system for Customer Care Agents (Paper accepted for the demo track at SIGIR-2019)

Submitted by Mithun Ghosh (@mithunghosh) on Wednesday, 17 April 2019


Section: Full talk
Technical level: Intermediate
Session type: Demo

Abstract

We have developed a human-in-the-loop system, AgentBuddy, that is helping Intuit improve the quality of search it offers to internal Customer Care Agents (CCAs). AgentBuddy aims to reduce the cognitive effort required of CCAs while at the same time boosting the quality of our legacy federated search system. It addresses two key pain points: 1) given several candidate query-answering mechanisms, how to select the right mechanism for a given question, and 2) having retrieved a set of lengthy documents, how to help the agent zoom in on the content most relevant to the question at hand. We address #1 with an elegant approach to principled exploration based on bandit algorithms; for #2 we have several models based on supervised and unsupervised learning. We will share several practical lessons from working with business teams, especially around determining the right metrics and devising a validation process acceptable to the business. Since this is a real-world working system deployed on AWS, we will also discuss practical challenges in scaling and how we overcame them.
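The abstract does not specify which bandit algorithm AgentBuddy uses for selecting among candidate query-answering mechanisms. As an illustration only, here is a minimal epsilon-greedy sketch of the general idea: treat each answering mechanism as an arm, route each question by balancing exploration and exploitation, and update the arm's value from agent feedback. All class and arm names here are hypothetical.

```python
import random


class EpsilonGreedyRouter:
    """Select among candidate query-answering mechanisms (arms).

    Hypothetical sketch: epsilon-greedy is shown for illustration;
    the actual algorithm used in AgentBuddy is not specified in the talk.
    """

    def __init__(self, arms, epsilon=0.1, seed=None):
        self.arms = list(arms)
        self.epsilon = epsilon
        self.counts = {a: 0 for a in self.arms}    # pulls per arm
        self.values = {a: 0.0 for a in self.arms}  # running mean reward
        self.rng = random.Random(seed)

    def select(self):
        # Explore with probability epsilon; otherwise exploit the best arm.
        if self.rng.random() < self.epsilon:
            return self.rng.choice(self.arms)
        return max(self.arms, key=lambda a: self.values[a])

    def update(self, arm, reward):
        # Incremental mean update from agent feedback
        # (e.g. reward = 1.0 if the agent accepted the answer).
        self.counts[arm] += 1
        n = self.counts[arm]
        self.values[arm] += (reward - self.values[arm]) / n
```

A usage pattern might be: call `select()` to pick a mechanism for an incoming question, serve its answer, then call `update()` with the agent's implicit or explicit feedback as the reward.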

Outline

Business Problem
Approach
Formulation
ML algorithms used: a) Bandit algorithms and b) Algorithms for highlighting
Architectural challenges
Scaling up
Latency issues and sane fall-backs
Validation
Ground truth labels
Metrics and measurement
Business Impact
Takeaways for Data Science practice
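The outline lists "algorithms for highlighting" without detail. As one simple unsupervised baseline of the kind that could serve this purpose, the sketch below scores each sentence of a retrieved document by bag-of-words cosine similarity to the query and returns the top-scoring sentences for the agent to focus on. The function name and tokenization are illustrative assumptions, not the system's actual implementation.

```python
import math
from collections import Counter


def highlight(query, document, top_k=2):
    """Return the top_k sentences most similar to the query.

    Hypothetical sketch: a bag-of-words cosine similarity baseline,
    shown for illustration of unsupervised highlighting.
    """
    def vectorize(text):
        # Naive whitespace tokenization; a real system would use
        # proper tokenization, stemming, and weighting (e.g. TF-IDF).
        return Counter(text.lower().split())

    def cosine(a, b):
        dot = sum(a[t] * b[t] for t in a)
        norm = (math.sqrt(sum(v * v for v in a.values()))
                * math.sqrt(sum(v * v for v in b.values())))
        return dot / norm if norm else 0.0

    q = vectorize(query)
    sentences = [s.strip() for s in document.split(".") if s.strip()]
    ranked = sorted(sentences, key=lambda s: cosine(q, vectorize(s)),
                    reverse=True)
    return ranked[:top_k]
```

The supervised variants mentioned in the outline would presumably replace the similarity score with a learned relevance model trained on agent feedback.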

Speaker bio

Hrishi has been a regular speaker at Anthill and delivered a full-length session on “Building and driving adoption for a robust semantic search system”: https://www.youtube.com/watch?v=niKXwqcTpao&t=650s&list=PL279M8GbNset5FCdcLd_ovckHE14PkIhM&index=3
He’s a Staff Machine Learning Engineer at Intuit working on problems in NLP and ML applied to Information Retrieval.
Hrishi completed his Master’s at the Indian Institute of Science (IISc), Bangalore in 2005, where he worked on computer vision for studying atomization in cryogenic rocket engines. After that he completed a full-time PGDM from IIM Kozhikode. He has been working in the ML/analytics space for over 12 years, with long stints at Amazon Core ML and at Mu Sigma before joining Intuit’s IAT team.
Outside of work he spends time playing with his 5-year-old daughter and solving puzzles.

Slides

https://drive.google.com/drive/u/0/folders/18V9AqJb4nQpfTBNB3w1y8qrHWtXs4Nbk

Preview video

https://archive.org/details/Demo1_201902

Comments


  • Abhishek Balaji (@booleanbalaji) Reviewer a month ago

    Hello Hrishi,

    Thank you for submitting a proposal. To proceed with evaluation, we need to see detailed slides and a preview video for your proposal. Your slides must cover the following:

    • Problem statement/context, which the audience can relate to and understand. The problem statement has to be a problem (based on this context) that can be generalized for all.
    • What were the tools/options available in the market to solve this problem? How did you evaluate alternatives, and what metrics did you use for the evaluation?
    • Why did you pick the option that you did?
    • Explain how the situation was before the solution you picked/built, and how it changed after implementing that solution. Show before-after scenario comparisons & metrics.
    • What compromises/trade-offs did you have to make in this process?
    • What is the one takeaway that you want participants to go back with at the end of this talk? What is it that participants should learn/be cautious about when solving similar problems?
    • Is the tool free/open-source? If not, what can the audience take away from the talk?

    We need to see the updated slides on or before 21 May in order to close the decision on your proposal. If we do not receive an update by 21 May we’ll move the proposal for consideration at a future event.
