Open Web UI - Using Ollama. Done Simpler
Run models locally. Use Ollama with Open Web UI to get a comfortable interface for LLMs.
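A minimal sketch of pairing the two with Docker Compose (the images are the public ones, but the port mapping, volume name, and environment wiring shown here are assumptions, not an official compose file):

```yaml
# Hypothetical docker-compose.yml pairing Ollama with Open Web UI.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama          # persist downloaded models
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                   # UI reachable at http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434  # point the UI at the Ollama service
    depends_on:
      - ollama
volumes:
  ollama:
```

Bring both services up with `docker compose up -d`, then pull a model from the UI or via `docker exec` into the Ollama container.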
Set up LibreChat with Docker. A comparison of different ways to set up LLMs locally - from Ollama to PrivateGPT.
Set up the Elia AI CLI with Docker to work with different LLMs: open and private ones.
Quickly Prototype Gen AI Web Apps in pure Python.
Self-hosting Dify AI with Docker. A vector database for Gen AI projects.