SelfHosting Tabby - A Guide to Using Open Source LLM Models as Code Completion Tools.

TabbyML: A Free GitHub Copilot Alternative

Let's SelfHost Tabby, a Free and Local Coding Assistant. Use Open Source LLMs and interact with them locally, on regular CPUs, to help you code for free. Say goodbye to Copilot and the cloud.
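For a quick taste, here is a minimal sketch of how you might query a Tabby instance that is already running locally. The default port 8080 and the /v1/completions endpoint are assumptions based on Tabby's completion API and may differ between versions:

```python
# Hedged sketch: ask a locally running Tabby server for a code completion.
# Assumes Tabby is serving on http://localhost:8080 (assumed default port);
# endpoint path and payload shape may vary between Tabby versions.
import requests

payload = {
    "language": "python",
    "segments": {"prefix": "def fibonacci(n):\n    ", "suffix": ""},
}
resp = requests.post("http://localhost:8080/v1/completions", json=payload, timeout=30)
resp.raise_for_status()
for choice in resp.json().get("choices", []):
    print(choice.get("text", ""))
```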

Miro Alternative: SelfHosting Excalidraw with Docker

Stay Productive with this F/OSS Miro Alternative: Excalidraw with Docker.

ChatGPT Clone

Generative AI with Python: GPT4All and Local LLMs

Let's learn how to use Open Source LLMs with Python, using GPT4All.
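As a preview, here is a minimal sketch using the gpt4all Python bindings (pip install gpt4all). The model file name below is just an example; any GGUF model supported by GPT4All will do, and it is downloaded on first run:

```python
# Minimal sketch: run a local LLM on a regular CPU with the gpt4all bindings.
from gpt4all import GPT4All

# Example model name (an assumption); GPT4All downloads it on first use.
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

with model.chat_session():
    reply = model.generate("Explain what a local LLM is in one sentence.", max_tokens=100)
    print(reply)
```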

Stable Diffusion Locally for Free

Stable Diffusion WebUI - Locally with AUTOMATIC1111

Running Stable Diffusion Locally with AUTOMATIC1111 and Docker
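If the WebUI is launched with the --api flag, you can also drive it from Python. A minimal sketch, assuming the default port 7860 and the /sdapi/v1/txt2img endpoint:

```python
# Hedged sketch: generate an image through the AUTOMATIC1111 WebUI API.
# Assumes the WebUI was started with --api and listens on port 7860.
import base64
import requests

payload = {"prompt": "a watercolor painting of a lighthouse", "steps": 20}
resp = requests.post("http://localhost:7860/sdapi/v1/txt2img", json=payload, timeout=300)
resp.raise_for_status()

images = resp.json()["images"]  # list of base64-encoded images
with open("output.png", "wb") as f:
    f.write(base64.b64decode(images[0]))
```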

KoboldCpp

Local LLMs with koboldcpp

Using Local LLMs with koboldcpp. A Guide to SelfHosting LLMs (run any GGML or GGUF model locally).
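Once koboldcpp is running with a model loaded, it exposes a KoboldAI-compatible HTTP API that you can call from Python. A minimal sketch, assuming the default port 5001; field names may vary between versions:

```python
# Hedged sketch: send a prompt to a locally running koboldcpp instance via
# its KoboldAI-compatible API. Assumes the server listens on port 5001.
import requests

payload = {"prompt": "Write a haiku about self-hosting:", "max_length": 80}
resp = requests.post("http://localhost:5001/api/v1/generate", json=payload, timeout=120)
resp.raise_for_status()
for result in resp.json().get("results", []):
    print(result.get("text", ""))
```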

September 10, 2023 · 2 min · 394 words · Gen-AI