Rootconf Mini 2024

Geeking out on systems and security since 2012


Akshay Sethi

@akshaysethi

Practical tips for building AI applications using LLMs - Best practices and trade-offs

Submitted Oct 30, 2024

Overview
At KushoAI, we’ve built an AI agent that can autonomously perform API testing for you. While building it, we ran into many problems that are specific to AI applications built on top of LLMs and that you don’t encounter anywhere else. Since this is a fairly new area of development, we had to spend a lot of time figuring out solutions on our own.

The agenda of this talk is to give you an idea of the kinds of problems you’ll face while building AI applications, tried-and-tested approaches to solving them, and the trade-offs you need to consider along the way.

We hope that attendees will be able to learn from our experience of building AI applications and get started on their own journey.

Talk outline

  • How to handle LLM inconsistencies while generating structured data (see the first sketch after this list)
  • How (and why) to implement streaming in your application (second sketch below)
  • Background jobs - why you need them and how to manage them (third sketch below)
  • Tools for A/B testing your prompts to find the most effective model for a particular task
  • Prompt observability for debugging
  • Prompt caching for cost-saving (fourth sketch below)
  • Comparison of various LLM APIs available for general use
    • Which ones work better based on the task at hand
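
For the structured-data item: one widely used pattern (not necessarily the exact approach KushoAI uses) is to validate the model’s output against a schema and retry, feeding the validation error back into the prompt. A minimal sketch, assuming the official openai Python client, Pydantic, a placeholder gpt-4o-mini model, and an illustrative TestCase schema:

```python
# Sketch: validate the model's output against a schema and retry on failure,
# feeding the validation error back so the next attempt can self-correct.
# The client, model name, and TestCase schema are illustrative assumptions.
import json

from openai import OpenAI
from pydantic import BaseModel, ValidationError

client = OpenAI()  # reads OPENAI_API_KEY from the environment


class TestCase(BaseModel):
    name: str
    method: str
    path: str
    expected_status: int


def generate_test_cases(api_description: str, max_retries: int = 3) -> list[TestCase]:
    prompt = (
        "Return ONLY a JSON object with a 'test_cases' key holding an array of "
        "objects with keys name, method, path, expected_status for this API:\n"
        + api_description
    )
    feedback = ""
    for _ in range(max_retries):
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt + feedback}],
            response_format={"type": "json_object"},  # nudges the model towards valid JSON
        )
        raw = response.choices[0].message.content
        try:
            data = json.loads(raw)
            return [TestCase(**item) for item in data["test_cases"]]
        except (json.JSONDecodeError, KeyError, TypeError, ValidationError) as exc:
            feedback = f"\n\nYour previous output was invalid ({exc}). Return corrected JSON."
    raise RuntimeError("LLM did not produce valid structured output")
```

Provider-native structured-output features (JSON mode, function calling) reduce the number of failures but, in practice, do not remove the need for a validation-and-retry loop like this.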
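
For the streaming item: full LLM responses can take tens of seconds, so streaming tokens as they arrive keeps the UI responsive. A minimal sketch using the OpenAI client’s stream=True mode; the model name is a placeholder:

```python
# Sketch: stream tokens to the user as they are generated instead of
# blocking for the full completion (OpenAI client shown; others are similar).
from openai import OpenAI

client = OpenAI()


def stream_answer(prompt: str) -> str:
    stream = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
        stream=True,  # the API yields incremental deltas
    )
    pieces = []
    for chunk in stream:
        delta = chunk.choices[0].delta.content or ""
        print(delta, end="", flush=True)  # in a web app, forward this over SSE or a WebSocket
        pieces.append(delta)
    return "".join(pieces)
```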
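
For background jobs: long generations do not belong in a synchronous request handler, so they are typically pushed to a worker. A minimal sketch assuming Celery with a Redis broker; these are illustrative choices, not a statement about KushoAI’s stack, and the imported helper is the hypothetical module holding the first sketch:

```python
# Sketch: run slow LLM work in a background worker instead of the request path.
# Celery with a Redis broker is assumed purely for illustration; any task queue works.
from celery import Celery

from structured_output import generate_test_cases  # hypothetical module holding the first sketch

app = Celery(
    "llm_jobs",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)


@app.task(bind=True, max_retries=3, default_retry_delay=10)
def run_test_generation(self, api_description: str) -> dict:
    try:
        cases = generate_test_cases(api_description)
        return {"status": "done", "count": len(cases)}
    except Exception as exc:
        # Transient LLM failures (timeouts, rate limits) are common; retry with a delay.
        raise self.retry(exc=exc)


# The web handler enqueues and returns immediately:
#   task = run_test_generation.delay(description)
#   # poll task.id later, or notify the client when the result lands in the backend
```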
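
For prompt caching: some providers cache long identical prompt prefixes on their side, but the simplest cost saving is an application-level cache of full responses keyed by a hash of the model and prompt. A minimal sketch assuming Redis; the TTL and key scheme are arbitrary:

```python
# Sketch: application-level cache keyed by a hash of (model, prompt), so repeated
# identical prompts skip the paid API call. Redis and the 24h TTL are assumptions.
import hashlib

import redis

cache = redis.Redis(host="localhost", port=6379, db=2)
TTL_SECONDS = 24 * 3600


def cached_completion(model: str, prompt: str, call_llm) -> str:
    key = "llm:" + hashlib.sha256(f"{model}\n{prompt}".encode()).hexdigest()
    hit = cache.get(key)
    if hit is not None:
        return hit.decode()  # cache hit: no tokens billed
    answer = call_llm(model, prompt)  # call_llm is any function that returns the completion text
    cache.setex(key, TTL_SECONDS, answer)
    return answer
```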

Prerequisites

  • Python basics
  • GenAI basics

Speaker Bio
Sourabh Gawande is the Co-founder & CTO at KushoAI. He has about a decade of experience building products at multinationals like Dell-EMC and at unicorns like FalconX (a B2B crypto brokerage) and Ninjacart (an agri-tech supply chain company).

