LLM X: the easiest 3rd party Local LLM UI for the web!
LLM X does not make any external API calls (go ahead, check your network tab and see the Fetch section). Your chats and image generations are 100% private. This site/app works completely offline.
Ollama + Firefox: LLM X uses ollama-js to update models and show model information. There is currently a CORS issue on Firefox when using the app from GitHub that prevents updating models or viewing model information. Until it is fixed, please use Chrome for these features, or use the CLI. Apologies. See the GitHub issue.
How to use with the hosted app (the backend must allow CORS from the GitHub Pages origin):

- Ollama: pull a model, e.g. `ollama pull llava` (or use the app), then start the server with the origin allowed: `OLLAMA_ORIGINS=https://mrdjohnson.github.io ollama serve`
  - PowerShell: `$env:OLLAMA_ORIGINS="https://mrdjohnson.github.io"; ollama serve`
- LM Studio: `lms server start --cors=true`
- AUTOMATIC1111: `./webui.sh --api --listen --cors-allow-origins "*"`
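The origin allow-listing above can be pictured as a small check on the server side. Here is a conceptual sketch only (not Ollama's actual implementation; `originAllowed` is a made-up helper) of how a comma-separated allow-list like `OLLAMA_ORIGINS` decides whether a browser origin gets through:

```typescript
// Conceptual sketch: decide whether a browser origin is allowed,
// given a comma-separated allow-list. "*" allows any origin.
function originAllowed(allowList: string, origin: string): boolean {
  return allowList
    .split(',')
    .map((entry) => entry.trim())
    .some((entry) => entry === '*' || entry === origin)
}

console.log(originAllowed('https://mrdjohnson.github.io', 'https://mrdjohnson.github.io')) // true
console.log(originAllowed('https://mrdjohnson.github.io', 'https://example.com')) // false
```

This is why the hosted app needs the extra flags: without the GitHub Pages origin in the allow-list, the browser's preflight request is rejected.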
When running LLM X locally, no CORS flags are needed:

- Ollama: `ollama serve`
- LM Studio: `lms server start`
- AUTOMATIC1111: `./webui.sh --api --listen`
Then run `yarn install` and `yarn dev` in the project's root directory.
Screenshots: a conversation about the logo, an image generation example, the omnibar and code display, code with the light theme, a response about a cat, and another logo response.
What is this? A ChatGPT-style UI for the niche group of folks who run Ollama (think of this like an offline ChatGPT server) locally. It supports sending and receiving images and text! It WORKS OFFLINE through PWA (Progressive Web App) standards (it's not dead!)
Why do this? I have been interested in LLM UIs for a while now, and this seemed like a good intro application. I've also been introduced to a lot of modern technologies thanks to this project; it's been fun!
Why so many buzz words? I couldn't help but bee cool 😎
Logic helpers:
UI Helpers:
Project setup helpers:
Inspiration: the ollama-ui project, which allows users to connect to ollama via a web app.
Perplexity.ai: Perplexity has made some amazing advancements in the LLM UI space, and I have been very interested in reaching that point. Hopefully this starter project lets me get closer to doing something similar!
(Please note the minimum engine requirements in the package.json.)
Clone the project and run `yarn` in the root directory. `yarn dev` starts a local instance and opens a browser tab under an https:// URL (for PWA reasons).
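For reference, one common way to get an https:// dev server out of Vite is a plugin such as `@vitejs/plugin-basic-ssl`. This is a generic sketch of that pattern, not necessarily this project's actual config:

```typescript
// vite.config.ts — generic sketch; this project's real config may differ.
import { defineConfig } from 'vite'
import basicSsl from '@vitejs/plugin-basic-ssl'

export default defineConfig({
  // basicSsl generates a self-signed certificate so the dev server
  // runs over https, which service workers / PWA features require.
  plugins: [basicSsl()],
})
```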
LangChain.js was attempted while spiking on this app, but unfortunately it was not set up correctly for stopping incoming streams. I hope this gets fixed in the future, OR, if possible, a custom LLM agent can be utilized in order to use LangChain.
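The stream-stopping behavior mentioned above boils down to breaking out of an async iterable when the user hits "stop". A minimal self-contained sketch with a simulated token stream (the real app would consume ollama-js response chunks instead of `fakeTokenStream`):

```typescript
// Simulated token stream standing in for a real LLM response stream.
async function* fakeTokenStream(): AsyncGenerator<string> {
  for (const token of ['Hello', ' ', 'world', '!']) yield token
}

const controller = new AbortController()
let out = ''

for await (const token of fakeTokenStream()) {
  // Checking the signal on every chunk is what lets a "stop" button
  // cut off an in-flight generation.
  if (controller.signal.aborted) break
  out += token
  if (token === ' ') controller.abort() // simulate the user clicking "stop" mid-stream
}

console.log(out) // "Hello "
```

The key design point is that the abort check happens inside the consumption loop; a streaming wrapper that hides the loop (as the LangChain.js spike did) has to forward the signal itself, or the stream cannot be stopped.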
Originally I used create-react-app 👴 while making this project, without knowing it is no longer maintained; I am now using Vite. 🤞 This already allows me to use libs like ollama-js that I could not use before. I will be testing more with LangChain very soon.
This README was written with https://stackedit.io/app
Changes to the main branch trigger an immediate deploy to https://mrdjohnson.github.io/llm-x/