Deploy Local LLMs like Containers - Ollama Docker
Ollama is a great FOSS project that allows us to deploy LLMs locally and manage them in a way familiar to Docker users.
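As a sketch of that Docker-like workflow, here are the core Ollama CLI verbs side by side with their Docker counterparts (the model name `llama3` is just an example; any model from the Ollama library works):

```shell
# Start the Ollama server (analogous to the Docker daemon)
ollama serve &

# Pull a model image from the registry (like `docker pull`)
ollama pull llama3

# Run it interactively (like `docker run -it`)
ollama run llama3

# List downloaded models (like `docker images`)
ollama list

# Remove a model you no longer need (like `docker rmi`)
ollama rm llama3
```

If you already know Docker, the pull/run/list/rm cycle should feel immediately familiar.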
Let's use LLMs Locally with PrivateGPT and Docker. Chat with your Documents while Keeping your Data Safe and Private.
Self-Hosting ChromaDB with Docker: a Vector Database for Gen AI Projects.
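A minimal sketch of self-hosting ChromaDB with Docker, using the official `chromadb/chroma` image on its default port 8000 (the host volume path `./chroma-data` is an arbitrary choice for persisting data between restarts):

```shell
# Run ChromaDB in the background, exposing its HTTP API on port 8000
# and persisting the database to a local folder on the host
docker run -d \
  --name chromadb \
  -p 8000:8000 \
  -v "$(pwd)/chroma-data:/chroma/chroma" \
  chromadb/chroma
```

Once the container is up, your Gen AI projects can talk to it over HTTP at `localhost:8000` using any of the Chroma client libraries.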
Installing Whishper with Docker, the Free, Local and Open-Source Audio Transcription Tool.
Let's Self-Host Tabby, a Free and Local Coding Assistant. Use Open-Source LLMs and interact with them locally, using regular CPUs, to help you Code for Free. Say Goodbye to Copilot and the Cloud.