Local AI Agents with CrewAI (and Ollama)
Using Local LLMs with Ollama and making them interact thanks to CrewAI. A Guide to SelfHosting Conversational AI.
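To make the idea concrete, here is a minimal sketch of two CrewAI agents backed by a local Ollama model. It assumes Ollama is running on its default port (11434) with a `llama3` model already pulled, and uses CrewAI's `LLM` wrapper; the exact LLM wiring can differ between CrewAI versions, so treat the model name and endpoint as placeholders.

```python
# Minimal sketch: two CrewAI agents, both backed by a local Ollama model.
# Assumes `pip install crewai` and an Ollama server with `llama3` pulled.
from crewai import Agent, Task, Crew, LLM

# Point CrewAI at the local Ollama server (no external API calls)
local_llm = LLM(model="ollama/llama3", base_url="http://localhost:11434")

researcher = Agent(
    role="Researcher",
    goal="Collect the key facts about self-hosting LLMs",
    backstory="A careful analyst who only uses local tools.",
    llm=local_llm,
)

writer = Agent(
    role="Writer",
    goal="Turn the research notes into a short blog summary",
    backstory="A concise technical writer.",
    llm=local_llm,
)

research_task = Task(
    description="List the main benefits of running LLMs locally with Ollama.",
    expected_output="A bullet list of benefits.",
    agent=researcher,
)

write_task = Task(
    description="Write a 3-sentence summary based on the research notes.",
    expected_output="A short paragraph.",
    agent=writer,
)

crew = Crew(agents=[researcher, writer], tasks=[research_task, write_task])
print(crew.kickoff())
```

With the default sequential process, CrewAI runs the tasks in order and feeds the researcher's output to the writer as context, so the whole agent conversation stays on your own machine.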
SelfHosting LogSeq - Build a private knowledge base that can be used together with Ollama. A free Obsidian alternative.
Ollama is a great FOSS project that allows us to deploy LLMs locally and manage them in a way familiar to Docker users (see the short Python sketch below).
Let's use LLMs Locally with PrivateGPT and Docker. Chat with your Documents while Keeping Data Safe and Private.
SelfHosting ChromaDB with Docker. A Vector Database for Gen AI Projects.
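As a quick illustration of that Docker-like workflow mentioned above, here is a small sketch using the official `ollama` Python client. It assumes `pip install ollama` and a local server on the default port; the `llama3` model name is just an example.

```python
# Minimal sketch of the Docker-like workflow with the `ollama` Python client.
import ollama

# Fetch a model locally, analogous to `docker pull`
ollama.pull("llama3")

# Run a one-off chat against the pulled model
reply = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello from a local LLM."}],
)
# The client returns the assistant message; dict-style access shown here
print(reply["message"]["content"])
```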