Building NoAIBills: A Chrome Extension That Runs LLMs Locally Using WebGPU, Transformers.js, and Chrome's Prompt API
Feb 13 · 5 min read

No external software, no cloud, no complexity: just install and start chatting.

TL;DR: I built a Chrome extension that runs Llama, DeepSeek, Qwen, and other LLMs entirely in-browser. No server, no Ollama, no API keys. Here's the full story: why I built...