Building Next-Generation Voice Augmented eXperiences for mobile apps
Voice-based interactions are rapidly gaining mainstream acceptance, and the current state of the art is to build skills or actions for popular voice assistants like Alexa and Google Assistant. But apps are still the most popular channel for brands to connect with their customers. What if we could bring the convenience of voice and marry it with the power of the visual experiences that apps provide?
This talk will highlight the fundamentals of designing & implementing Voice Augmented eXperiences in mobile apps. We will also cover how VAX is different from popular voice assistants like Alexa or Google Assistant.
What are the takeaways:
The fundamentals of designing & implementing Voice Augmented eXperiences in mobile apps
How VAX is different from building actions/skills for popular voice assistants like Alexa or Google Assistant.
How this next-generation User eXperience can reduce drop-offs, help gain new users, and allow easy information retrieval.
Who should attend:
Mobile app developers interested in learning newer technologies/methodologies
UI/UX designers interested in learning about newer paradigms and expanding their canvas of interaction
Product managers who are looking for ways to increase their app’s growth and reduce friction for existing customers.
Innovation leaders who want to understand futuristic use-cases for their apps.
This link has the draft slides; it outlines what the talk will cover.
Just come with an open mind :)
Kumar Rangarajan is the co-founder and the obsessive dictator @ Slang Labs. He was earlier the co-founder of Little Eye Labs, which was acquired by Facebook. His software career spans GE, HP, Rational (acquired by IBM), S7 (acquired by BlueCoat), and most recently Facebook.