GitHub Copilot is great. But it is not F/OSS ❗
Fortunately, there is already a F/OSS tool that allows us to use open LLMs to help us code better.
Yes, I am talking about TabbyML. And yes, it can run in CPU mode.
The Tabby Project
- Tabby is a self-hosted AI Coding Assistant - The TabbyML Site
- The Tabby Source Code on GitHub
- The Docker Container to deploy Tabby
- Mixed License: Apache v2 ✅
Why Tabby as a Coding Assistant?
- It boasts several key features:
- Self-contained, with no need for a DBMS or cloud service.
- OpenAPI interface, easy to integrate with existing infrastructure (e.g. a Cloud IDE); see the sample request after this list.
- Supports consumer-grade GPUs and also works on CPUs.
- Yep, we will run our F/OSS Local Coding Assistant.
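Because the server speaks plain HTTP, you can try the completion API directly with curl once Tabby is running. The snippet below is only a sketch: I am assuming the v1 completion endpoint and payload shape; the exact schema for your version is documented in the OpenAPI page that the server itself exposes.

# Sketch of a completion request against a locally running Tabby server
# (endpoint and payload shape assumed; check the server's OpenAPI/Swagger page)
curl -X POST http://localhost:8080/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"language": "python", "segments": {"prefix": "def fib(n):\n    ", "suffix": "\n"}}'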
Running Tabby with Docker
We will be using this Tabby Docker Image.
- Choose a model to run with Tabby
- Get ready to get code completion in these languages with Tabby
- Model example: StarCoder
- StarCoderBase-1B is a 1B parameter model trained on 80+ programming languages from The Stack (v1.2), with opt-out requests excluded. The model uses Multi Query Attention, a context window of 8192 tokens, and was trained using the Fill-in-the-Middle objective on 1 trillion tokens.
Tabby Docker CLI
If you have Docker installed and are comfortable with the terminal, you can just execute the following to use Tabby with the StarCoder-1B LLM:
docker run -it -p 8080:8080 -v $HOME/.tabby:/data tabbyml/tabby serve --model TabbyML/StarCoder-1B
#podman run -it -p 8080:8080 -v tabby-data:/data tabbyml/tabby serve --model TabbyML/StarCoder-1B
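Once the container reports that the model is loaded, a quick sanity check from another terminal confirms the server is answering. I am assuming the default health endpoint here; if it differs in your Tabby version, opening http://localhost:8080 in a browser works as a check too.

# Assumed health endpoint of the local Tabby server
curl http://localhost:8080/v1/health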
Tabby Docker-Compose
I prefer to deploy Docker containers as stacks with Portainer, and this is the configuration file we will need:
version: '3'
services:
  tabbyml:
    image: tabbyml/tabby
    ports:
      - "8080:8080"
    volumes:
      - tabby-data:/data
    command: ["serve", "--model", "TabbyML/StarCoder-1B"]
    #command: serve --model TabbyML/StarCoder-1B
volumes:
  tabby-data:
Here I am using StarCoder-1B as the language model, but feel free to change it as per your needs.
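If you are not deploying through Portainer, the same stack can be brought up with the Docker Compose plugin from the directory that holds the file:

# Start the stack in the background and follow the logs
docker compose up -d
docker compose logs -f tabbyml

The first run will download the chosen model into the tabby-data volume, so give the logs a minute before testing completions.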
Using Tabby with VSCode
We need the TabbyML.vscode-tabby extension.
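If you prefer the terminal, the extension can also be installed with the VS Code CLI. After installing, point the extension to our local server (http://localhost:8080); I am assuming the extension prompts for, or exposes a setting for, the Tabby endpoint, and the exact setting name may vary between extension versions.

# Install the Tabby extension from the command line
code --install-extension TabbyML.vscode-tabby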
FAQ
Other Coding Assistants
- Codeium: Codeium Site
- Official Codeium Page
- They use an internal LLM and require sign-up ❗
- GPT-Pilot: GPT Pilot on GitHub
- The project is F/OSS with an MIT license, yet it uses the OpenAI API ❗
- BitoAI: BitoAI Site
- It uses GPT-4 in the background ❗