The Rise of On-Device AI: What’s Next for LLMs in the Browser?
With models like Gemini Nano running entirely on-device, the web is entering a new era of AI-powered applications that don’t require servers.
This talk covers the latest advances in browser-based LLMs and tooling, including Gemini Nano, WebLLM, and ONNX.js. We'll examine the trade-offs, performance considerations, and what these developments mean for AI on the web.
Whether you’re building AI-powered applications or simply curious about the future of machine learning in the browser, this session provides a glimpse into what’s ahead.
Ryan Seddon
Frontend Engineer with 20+ years of experience at companies like Seek, Zendesk, and Culture Amp. Passionate about building scalable web applications and improving developer experience. Focused on modern JavaScript, performance, and AI in the browser.