Use Chrome's built-in AI APIs (the experimental Prompt API, backed by the on-device Gemini Nano model) to run lightweight AI models inside the browser, enabling local inference without a server, much like Ollama does on the desktop.
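A minimal sketch of what that looks like with Chrome's experimental Prompt API. The global `LanguageModel` interface and its `availability()` / `create()` / `prompt()` methods follow the current explainer for Chrome's built-in AI and are still subject to change; the function below feature-detects the API so it degrades gracefully outside a supporting Chrome build.

```javascript
// Hedged sketch of Chrome's experimental built-in Prompt API.
// The `LanguageModel` global exposes the on-device Gemini Nano model;
// names here track the current explainer and may change across releases.

async function promptLocalModel(text) {
  // Feature-detect: the API only exists in Chrome builds that ship it.
  if (typeof LanguageModel === "undefined") {
    return null; // Not running in a browser with the Prompt API.
  }

  // The model may still need to be downloaded on this device.
  const availability = await LanguageModel.availability();
  if (availability === "unavailable") {
    return null;
  }

  // Create a session; Chrome handles the model download on first use.
  const session = await LanguageModel.create();

  // Inference runs entirely on-device, like a local Ollama call.
  const reply = await session.prompt(text);
  session.destroy();
  return reply;
}

// Example usage (in a page with the API enabled):
// promptLocalModel("Summarize this page in one sentence.").then(console.log);
```

Because everything stays on-device, no prompt text leaves the machine, and the call works offline once the model has been downloaded.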