PrivateGPT: Self-Hosting LLMs to Chat with Your Docs

Using Local LLMs with PrivateGPT
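PrivateGPT exposes an HTTP API modeled on the OpenAI chat-completions format, so once it is running you can talk to your documents from plain Python. The sketch below is a minimal, hedged example: the port (8001) and model name are assumptions for a default local setup, and only the standard library is used.

```python
import json
import urllib.request

# Assumed default address of a locally running PrivateGPT instance.
PRIVATEGPT_URL = "http://localhost:8001/v1/chat/completions"

def build_chat_request(question: str, model: str = "private-gpt") -> dict:
    """Build an OpenAI-style chat request body (model name is a placeholder)."""
    return {"model": model, "messages": [{"role": "user", "content": question}]}

def ask_docs(question: str) -> str:
    """Send a question to the local PrivateGPT server and return its answer."""
    data = json.dumps(build_chat_request(question)).encode("utf-8")
    req = urllib.request.Request(
        PRIVATEGPT_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

# Example (requires PrivateGPT running with your documents ingested):
#   print(ask_docs("What does the contract say about termination?"))
```

Because the request shape is OpenAI-compatible, the same helper works against other local servers that follow that convention.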

Ollama - A great project for using Open Source LLM models locally.

Deploy Local LLMs Like Containers - Ollama Docker

Ollama is a great FOSS project that allows us to deploy LLMs locally and manage them in a way familiar to Docker users.
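Once Ollama is running (natively or in a container), it serves an HTTP API on port 11434. A minimal sketch using only the Python standard library, assuming a model such as `llama2` has already been pulled with `ollama pull`:

```python
import json
import urllib.request

# Ollama's default local API endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for a single JSON response instead of a token stream.
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return the reply."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running server and a pulled model, e.g. `ollama pull llama2`):
#   print(ask("llama2", "Summarize what FOSS means in one sentence."))
```

The Docker-style workflow carries over to the API: `ollama pull` fetches a model the way `docker pull` fetches an image, and the server then runs it on demand.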

Tabby: A Free GitHub Copilot Alternative

Use generative text AI models as your free coding assistant thanks to Tabby. We can run Open Source LLMs and interact with them locally on regular CPUs.
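Tabby normally plugs into your editor, but its self-hosted server also answers plain HTTP completion requests. The sketch below builds such a request; treat the port and the exact field names as assumptions for a default Docker deployment and check your server's API docs if the request is rejected.

```python
import json
import urllib.request

# Assumed address when Tabby is started with a published port 8080.
TABBY_URL = "http://localhost:8080/v1/completions"

def build_completion_request(prefix: str, language: str = "python",
                             suffix: str = "") -> dict:
    """Build a JSON body for Tabby's code-completion endpoint.

    The request carries the code before (prefix) and after (suffix) the
    cursor, plus the language, so the model can fill in the middle.
    """
    return {"language": language, "segments": {"prefix": prefix, "suffix": suffix}}

def complete(prefix: str) -> dict:
    """POST a completion request to a locally running Tabby server."""
    data = json.dumps(build_completion_request(prefix)).encode("utf-8")
    req = urllib.request.Request(
        TABBY_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example (requires Tabby running locally):
#   print(complete("def fibonacci(n):"))
```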

Generative AI with Python: GPT4All and Local LLMs

Let's learn how to use Open Source LLMs from Python with GPT4All.
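The GPT4All Python bindings (`pip install gpt4all`) download a model file on first use and then run it entirely on CPU. A minimal sketch; the model filename below is only an example, and the import is done lazily so the snippet loads even where the package is not installed:

```python
# Example model in GGUF format; any model GPT4All supports works (assumption).
MODEL_FILE = "orca-mini-3b-gguf2-q4_0.gguf"

def run_local_chat(prompt: str, model_file: str = MODEL_FILE) -> str:
    """Generate a reply with a local GPT4All model, fully offline after the
    initial model download."""
    from gpt4all import GPT4All  # lazy import: requires `pip install gpt4all`
    model = GPT4All(model_file)
    with model.chat_session():
        return model.generate(prompt, max_tokens=128)

# Example (downloads the model weights, a few GB, on first run):
#   print(run_local_chat("Explain what a GGUF file is in one sentence."))
```

Nothing leaves your machine: the prompt, the model, and the generated text all stay local.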

From Fiction to Function: A Hands-On Guide to Open Source LLM Models

Interacting with Generative AI Language Models Locally

Explore the capabilities of generative text AI models. Dive into Open Source LLMs and learn how to interact with them locally, for free. We will use only a CPU, with Docker providing a reproducible dependency setup.
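When the model server runs inside a Docker container, the first thing to verify is that its published port is actually reachable from the host. A small standard-library helper for that check (the example port is Ollama's default; adjust it to whatever your container publishes):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP server (e.g. a model server in a Docker
    container with a published port) accepts connections at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: Ollama publishes port 11434 by default.
#   print(port_open("localhost", 11434))
```

If this returns False, check `docker ps` for the `-p host:container` port mapping before debugging the model itself.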