Mar 2025
14 Fri 04:00 PM – 08:35 PM IST
15 Sat 09:00 AM – 05:50 PM IST
16 Sun 09:00 AM – 06:05 PM IST
CoderMonkey
While most AI features on the web rely on servers, client-side AI runs directly in the user’s browser.
When we build features with AI models on the web, we often rely on server-side solutions for larger models. This is especially true for generative AI, where even the smallest models are about a thousand times bigger than the median web page. It is also true for other AI use cases, where models can range from tens to hundreds of megabytes. Because these models aren't shared across websites, each site has to download them on page load, which is impractical for both developers and users.
TensorFlow.js models can be trained and evaluated in the browser or on the backend, which lets web applications use AI both online and offline. In the browser, WebGL provides vendor-independent hardware acceleration.
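As a small illustration of the point above, here is a hedged sketch of training a tiny model with TensorFlow.js. The same code runs in Node or in the browser (where the WebGL backend accelerates it); it assumes the `@tensorflow/tfjs` package, or the equivalent CDN `<script>` bundle, is available and passed in as `tf`.

```javascript
// Minimal sketch: fit y = 2x with a one-unit dense layer.
// `tf` is the TensorFlow.js namespace (require('@tensorflow/tfjs')
// in Node, or the global `tf` from the CDN bundle in the browser).
async function trainLine(tf) {
  const model = tf.sequential();
  model.add(tf.layers.dense({ units: 1, inputShape: [1] }));
  model.compile({ optimizer: 'sgd', loss: 'meanSquaredError' });

  const xs = tf.tensor2d([1, 2, 3, 4], [4, 1]);
  const ys = tf.tensor2d([2, 4, 6, 8], [4, 1]);

  await model.fit(xs, ys, { epochs: 200 });
  return model; // model.predict(...) now approximates y = 2x
}
```

Because the training loop is ordinary JavaScript, the identical function can be bundled for the browser or imported on a server, which is what makes the online-and-offline story practical.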
The Chrome team is developing web platform APIs and browser features designed to integrate AI models, including large language models (LLMs), directly into the browser. With built-in AI, your website or web application can perform AI-powered tasks without having to deploy or manage its own models.
This offers benefits such as low latency, reduced server-side costs, no API key requirement, stronger user privacy, and offline access.
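To make the idea concrete, here is a hedged sketch of calling a built-in on-device model with a server fallback. Chrome's built-in Prompt API is experimental and its surface has changed across releases, so the names below (`LanguageModel.create`, `session.prompt`) follow recent explainers and should be treated as assumptions, not a stable API.

```javascript
// Sketch: ask the browser's built-in model for a summary, falling back
// gracefully where the (experimental) Prompt API is unavailable.
async function localSummary(text) {
  // Feature-detect: the API only exists in supporting Chrome builds.
  if (!('LanguageModel' in globalThis)) {
    return null; // no on-device model; caller should fall back to a server
  }
  const session = await LanguageModel.create();
  try {
    return await session.prompt(`Summarize in one sentence: ${text}`);
  } finally {
    session.destroy(); // release the on-device model's resources
  }
}
```

The feature-detection branch is the important part: because built-in AI is not universally available, any real app needs a fallback path, and the null return makes that explicit to the caller.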
This session will include a hands-on walkthrough of a small app that leverages Chrome's built-in AI web APIs. As a web developer, I find the topic really interesting and futuristic.