Vector DBs? Local LLMs? Use Vector Admin!
Managing VectorDBs with a UI thanks to VectorAdmin. SelfHost VectorAdmin with Docker and use it with Local VectorDBs.
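VectorAdmin itself runs as a Docker stack, but as an illustration of the kind of local vector database it can manage, here is a minimal Python sketch using Chroma (one of several possible choices; the collection name, documents, and storage path are made up for the example):

```python
# Minimal sketch of a local vector DB (Chroma) that a tool like VectorAdmin
# could then connect to and manage. Names and paths are illustrative only.
import chromadb

# Persist the database on disk so other tools can point at the same data.
client = chromadb.PersistentClient(path="./chroma_data")

# Create (or reuse) a collection and add a couple of documents.
collection = client.get_or_create_collection("my_docs")
collection.add(
    ids=["doc-1", "doc-2"],
    documents=[
        "Self-hosting keeps your embeddings on your own hardware.",
        "VectorAdmin gives vector databases a management UI.",
    ],
)

# Run a simple similarity query to confirm the collection works.
results = collection.query(query_texts=["why self-host embeddings?"], n_results=1)
print(results["documents"])
```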
Using Local LLMs with Ollama and making them interact thanks to CrewAI. A guide to SelfHosting Conversational AI.
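As a rough sketch of what that setup can look like in code (assuming a recent crewai release that exposes the LLM wrapper, a local Ollama server on the default port, and a llama3 model already pulled; the agent roles and prompts are invented for the example):

```python
# Hypothetical two-agent crew backed by a local Ollama model.
# Model name, port, roles and prompts are assumptions for this sketch.
from crewai import Agent, Task, Crew, LLM

# Point CrewAI at the local Ollama server instead of a cloud API.
local_llm = LLM(model="ollama/llama3", base_url="http://localhost:11434")

researcher = Agent(
    role="Researcher",
    goal="Collect the key facts about self-hosting LLMs",
    backstory="You research topics and hand concise notes to a writer.",
    llm=local_llm,
)

writer = Agent(
    role="Writer",
    goal="Turn research notes into a short blog paragraph",
    backstory="You write clear, friendly technical prose.",
    llm=local_llm,
)

research_task = Task(
    description="List three practical benefits of running LLMs locally.",
    expected_output="Three bullet points.",
    agent=researcher,
)

writing_task = Task(
    description="Write one paragraph based on the researcher's bullet points.",
    expected_output="A single paragraph.",
    agent=writer,
)

crew = Crew(agents=[researcher, writer], tasks=[research_task, writing_task])
print(crew.kickoff())
```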
SelfHosting LogSeq - Build a private knowledge base that can be used together with Ollama. A Free Obsidian alternative.
Ollama is a great FOSS project that allows us to deploy LLMs locally and manage them in a way familiar to Docker users.
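To give a feel for that workflow once the Ollama server is running, here is a minimal sketch against its REST API (assuming the default port 11434 and a model such as llama3 already pulled with `ollama pull llama3`):

```python
# Minimal sketch: query a locally running Ollama server over its REST API.
# Assumes the default port (11434) and that the model has been pulled already.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",   # assumed model name
        "prompt": "In one sentence, why self-host an LLM?",
        "stream": False,     # return one JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```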
Let's use LLMs Locally with PrivateGPT and Docker. Chat with your Documents while Keeping Data Safe and Private.
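A rough sketch of chatting with such an instance from Python, assuming the self-hosted PrivateGPT container exposes its OpenAI-compatible API locally (port 8001 is a common default, but check your own setup; the model field is a placeholder the local server may simply ignore):

```python
# Sketch: talk to a self-hosted PrivateGPT instance through its
# OpenAI-compatible API. Base URL, port and model name are assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8001/v1", api_key="not-needed")

completion = client.chat.completions.create(
    model="private-gpt",  # placeholder; a local server typically overrides this
    messages=[
        {"role": "user", "content": "Summarize the documents I ingested about Docker."}
    ],
)
print(completion.choices[0].message.content)
```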