SomeTools

WebLLM Chatbot

Chat with AI models that run entirely in your browser using WebLLM. Download models once, cache them locally, and clear them at any time.

⚠️ WebLLM Requirements

WebLLM requires WebGPU support in your browser.

  • WebGPU Support: Required for model execution. Check support at webgpureport.org
  • Browser Requirements: Chrome 113+, Edge 113+, or Safari 18+ (with WebGPU enabled)
  • Models: Large files that are downloaded once and cached in your browser's IndexedDB
  • Cache Management: Clear individual models or the entire WebLLM cache from this page

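Before loading a model, the page needs to know whether WebGPU is available. A minimal sketch of that check (the helper takes the navigator object as a parameter so it can be exercised outside a browser; in a page you would pass the global `navigator`):

```javascript
// Returns true when the WebGPU API is exposed on the given navigator-like
// object. In a real page, call hasWebGPU(navigator).
function hasWebGPU(nav) {
  return typeof nav === "object" && nav !== null && "gpu" in nav;
}

// Requesting an adapter confirms a GPU is actually usable, not just that
// the API exists. requestAdapter() resolves to null when no suitable
// adapter is found.
async function webGPUUsable(nav) {
  if (!hasWebGPU(nav)) return false;
  const adapter = await nav.gpu.requestAdapter();
  return adapter !== null;
}
```

Sites like webgpureport.org perform essentially this probe, plus a dump of the adapter's features and limits.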
Saved Chats

Select a model, download it if needed, then start chatting. Your messages and responses stay in your browser.

Press Enter to send, Shift+Enter for a new line
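The Enter/Shift+Enter behavior is a standard textarea pattern. A minimal sketch, assuming a DOM-style `keydown` event and a hypothetical `sendMessage` callback:

```javascript
// keydown handler for the chat textarea: Enter sends, Shift+Enter falls
// through to the browser default (inserting a newline).
// Returns true when a send was triggered.
function handleChatKeyDown(event, sendMessage) {
  if (event.key === "Enter" && !event.shiftKey) {
    event.preventDefault(); // stop the newline from being inserted
    sendMessage();
    return true;
  }
  return false; // any other key keeps its default behavior
}
```

In a page this would be wired up with something like `textarea.addEventListener("keydown", e => handleChatKeyDown(e, send))`.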

FAQ

Where are models stored?

WebLLM stores models in your browser's IndexedDB under the webllm database. You can clear individual models or all models using the buttons above.
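If the in-page buttons are unavailable, the cache can also be cleared from the browser's developer console. A sketch, assuming the models live in an IndexedDB database named `webllm` as described above (the database name may differ between WebLLM versions; the helper takes the IndexedDB factory as a parameter, so in a page you would call `clearWebLLMCache(window.indexedDB)`):

```javascript
// Deletes the named IndexedDB database, removing every cached model in it.
function clearWebLLMCache(idb, dbName = "webllm") {
  return new Promise((resolve, reject) => {
    const request = idb.deleteDatabase(dbName);
    request.onsuccess = () => resolve();              // database removed
    request.onerror = () => reject(request.error);    // deletion failed
    // "blocked" fires while another tab still holds the database open.
    request.onblocked = () => reject(new Error("close other open tabs first"));
  });
}
```

After deletion, the next chat session will re-download whichever model you select.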

Can I use this offline?

Yes. After a model has been downloaded and cached, you can chat offline. The model runs entirely in your browser.

Is my chat data sent to a server?

No. All prompts and responses are processed locally by WebLLM in your browser.