PrivateGPT: Self-Hosting LLMs to Chat with Your Docs

Using Local LLMs with PrivateGPT

Ollama - A great project for running open-source LLMs locally.

Deploy Local LLMs like Containers - Ollama Docker

Ollama is a great FOSS project that allows us to deploy LLMs locally and manage them in a way that feels familiar to Docker users. A minimal example of talking to a local Ollama server is sketched below.
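For instance, once the Ollama server is running, any Python script can query it over its local REST API. This is only a hedged sketch: it assumes Ollama is listening on its default port (11434) and that a model such as "llama2" has already been pulled with `ollama pull`.

```python
# Minimal sketch: querying a locally running Ollama server over its REST API.
# Assumes Ollama is running on the default port 11434 and that the "llama2"
# model (an example choice) has already been pulled.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"


def ask_local_llm(prompt: str, model: str = "llama2") -> str:
    """Send a single prompt to the local Ollama server and return the reply."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # request one JSON object instead of a token stream
    }
    response = requests.post(OLLAMA_URL, json=payload, timeout=300)
    response.raise_for_status()
    return response.json()["response"]


if __name__ == "__main__":
    print(ask_local_llm("Explain what a container is in one sentence."))
```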

Tabby: A Free GitHub Copilot Alternative

Use generative text AI models as your free coding assistant thanks to Tabby. We can run open-source LLMs and interact with them locally, even on regular CPUs; a small completion request is sketched below.
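As a rough illustration, a self-hosted Tabby server can also be queried directly over HTTP, not just from an IDE extension. The port (8080 by default), endpoint path, and payload shape below are assumptions based on Tabby's HTTP completion API; check the API docs for your Tabby version before relying on them.

```python
# Hedged sketch: requesting a code completion from a locally running Tabby
# server. Endpoint, port, and payload shape are assumptions; verify against
# the API documentation of your Tabby version.
import requests

TABBY_URL = "http://localhost:8080/v1/completions"


def complete(prefix: str, language: str = "python") -> str:
    """Ask the local Tabby server to continue the given code prefix."""
    payload = {
        "language": language,
        "segments": {"prefix": prefix, "suffix": ""},
    }
    response = requests.post(TABBY_URL, json=payload, timeout=60)
    response.raise_for_status()
    choices = response.json().get("choices", [])
    return choices[0]["text"] if choices else ""


if __name__ == "__main__":
    print(complete("def fibonacci(n: int) -> int:\n    "))
```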

Generative AI with Python: GPT4All and Local LLMs

Let's learn how to use open-source LLMs from Python with GPT4All. A minimal example follows.
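A minimal sketch with the gpt4all Python package (installed via `pip install gpt4all`): the model file named below is just one example of a small, CPU-friendly model, and GPT4All downloads it on first use.

```python
# Minimal sketch: running a local LLM from Python with the gpt4all package.
# The model name is an example; GPT4All downloads the file on first use.
from gpt4all import GPT4All

# A small, CPU-friendly model; swap in any model GPT4All lists as available.
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

with model.chat_session():
    reply = model.generate(
        "Summarize what a large language model is.", max_tokens=120
    )
    print(reply)
```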


Chaos Theory in Motion: a Double Pendulum Simulator in Python

Exploring the intersection of chaos theory and Python with a double pendulum simulator app. Built with Streamlit, physics equations, Docker, and Cloudflare, the app showcases the unpredictable dance of determinism and chaos in a digital realm.
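To give a flavor of the physics behind the app, here is a small, self-contained sketch (not the app itself) that integrates the standard double pendulum equations of motion with SciPy and shows how two nearly identical starting angles diverge, which is the hallmark of chaotic motion. The masses, lengths, and initial angles are arbitrary example values.

```python
# Self-contained sketch: integrating the double pendulum equations of motion
# with SciPy and demonstrating sensitivity to initial conditions.
import numpy as np
from scipy.integrate import solve_ivp

G, M1, M2, L1, L2 = 9.81, 1.0, 1.0, 1.0, 1.0  # example masses and lengths


def derivatives(t, state):
    """d/dt of (theta1, omega1, theta2, omega2) for a frictionless double pendulum."""
    th1, w1, th2, w2 = state
    delta = th1 - th2
    den = 2 * M1 + M2 - M2 * np.cos(2 * delta)
    dw1 = (
        -G * (2 * M1 + M2) * np.sin(th1)
        - M2 * G * np.sin(th1 - 2 * th2)
        - 2 * np.sin(delta) * M2 * (w2**2 * L2 + w1**2 * L1 * np.cos(delta))
    ) / (L1 * den)
    dw2 = (
        2 * np.sin(delta)
        * (w1**2 * L1 * (M1 + M2) + G * (M1 + M2) * np.cos(th1)
           + w2**2 * L2 * M2 * np.cos(delta))
    ) / (L2 * den)
    return [w1, dw1, w2, dw2]


t_span, t_eval = (0, 20), np.linspace(0, 20, 2000)
# Two almost identical initial states: chaos makes their trajectories diverge.
a = solve_ivp(derivatives, t_span, [np.pi / 2, 0, np.pi / 2, 0], t_eval=t_eval)
b = solve_ivp(derivatives, t_span, [np.pi / 2 + 1e-6, 0, np.pi / 2, 0], t_eval=t_eval)

print("Final angle difference after 20 s:", abs(a.y[0, -1] - b.y[0, -1]))
```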