Using Local LLMs with Ollama and Making Them Interact via CrewAI. A Guide to Self-Hosting Conversational AI.
Self-Hosting Logseq - Let's build a knowledge base that can be used together with Ollama.
Ollama is a great FOSS project that allows us to deploy LLMs locally and manage them in a way familiar to Docker users.
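As a sketch of that Docker-style workflow, a minimal docker-compose file for running Ollama might look like the following. The `ollama/ollama` image name and port 11434 come from the official image; the volume name and service name are illustrative assumptions:

```yaml
services:
  ollama:
    image: ollama/ollama          # official Ollama image
    ports:
      - "11434:11434"             # Ollama's default API port
    volumes:
      - ollama_data:/root/.ollama # persist downloaded models across restarts

volumes:
  ollama_data:
```

After `docker compose up -d`, models can be pulled inside the container, e.g. `docker compose exec ollama ollama pull llama2` (model name here is just an example).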
Let's Use LLMs Locally with PrivateGPT and Docker. Chat with Your Documents While Keeping Data Safe and Private.
Self-Hosting ChromaDB with Docker. A Vector Database for Gen AI Projects.
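A minimal docker-compose sketch for self-hosting ChromaDB could look like this. The `chromadb/chroma` image and port 8000 match the official server image; the volume name and the in-container data path are assumptions that may need adjusting for your image version:

```yaml
services:
  chromadb:
    image: chromadb/chroma            # official ChromaDB server image
    ports:
      - "8000:8000"                   # Chroma's default HTTP port
    volumes:
      - chroma_data:/chroma/chroma    # assumed persistence path; check your image's docs

volumes:
  chroma_data:
```

Once the container is up, clients can connect to the server at `http://localhost:8000` to create collections and run vector queries.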