GitHub Copilot is great. But it is not F/OSS ❗

Fortunately, there is already a F/OSS tool that allows us to use open LLMs to help us code better.

Yes, I am talking about TabbyML. And yes, it can run in CPU mode.

Why Tabby as a Coding Assistant?

  • It boasts several key features:
    • Self-contained, with no need for a DBMS or cloud service.
    • OpenAPI interface, easy to integrate with existing infrastructure (e.g. a Cloud IDE); see the curl sketch after this list.
    • Supports consumer-grade GPUs and also works on CPUs.
      • Yep, that is how we will run our F/OSS local coding assistant.
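For example, once the server is running (see the Docker section below), you can request a completion directly over HTTP. This is a minimal sketch, assuming Tabby's default port 8080 and its /v1/completions endpoint; the exact request schema may vary between Tabby versions:

# Ask the Tabby server to complete a Python snippet through its OpenAPI interface
curl -X POST http://localhost:8080/v1/completions \
  -H 'Content-Type: application/json' \
  -d '{"language": "python", "segments": {"prefix": "def fib(n):\n    ", "suffix": "\n"}}'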

When you set up Tabby and cancel your GitHub Copilot

Running Tabby with Docker

We will be using this Tabby Docker Image.

  • Choose a model to run with Tabby
  • Get ready to get code completion with Tabby in the languages your chosen model supports
    • Model example: StarCoder
      • StarCoderBase-1B is a 1B parameter model trained on 80+ programming languages from The Stack (v1.2), with opt-out requests excluded. The model uses Multi Query Attention, a context window of 8192 tokens, and was trained using the Fill-in-the-Middle objective on 1 trillion tokens.

Tabby Docker CLI

If you have Docker installed and are comfortable with the terminal, you can just execute the following to use Tabby with the StarCoder-1B LLM:

docker run -it -p 8080:8080 -v $HOME/.tabby:/data tabbyml/tabby serve --model TabbyML/StarCoder-1B
#podman run -it -p 8080:8080 -v tabby-data:/data tabbyml/tabby serve --model TabbyML/StarCoder-1B
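On the first start, Tabby downloads the model into the mounted data directory, so give it a moment before the server responds. Once it is up, you can sanity-check it from another terminal. A minimal sketch, assuming the default port mapping above and Tabby's /v1/health endpoint:

# Check that the Tabby server is up and see which model it is serving
curl http://localhost:8080/v1/health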

Tabby Docker-Compose

I prefer to deploy Docker containers as stacks with Portainer, and this is the configuration file we will need:

version: '3'
services:
  tabby:
    image: tabbyml/tabby
    ports:
      - "8080:8080"
    volumes:
      - tabby-data:/data
    command: ["serve", "--model", "TabbyML/StarCoder-1B"]
    #command: serve --model TabbyML/StarCoder-1B
volumes:
  tabby-data:


Here I am using StarCoder-1B as the language model, but feel free to change it as per your needs.
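If you are not using Portainer, you can bring up the same stack directly with Docker Compose. A minimal sketch, assuming the configuration above is saved as docker-compose.yml in the current directory:

# Start the Tabby stack in the background
docker compose up -d
# Follow the logs while the model downloads on the first start
docker compose logs -f tabby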

Using Tabby with VSCode

We need the TabbyML.vscode-tabby extension.
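You can install it from the Extensions view or from the command line. A minimal sketch, assuming the code CLI is on your PATH; the endpoint setting name below is an assumption and may differ between extension versions, so verify it in the extension's settings UI:

# Install the Tabby extension for VS Code
code --install-extension TabbyML.vscode-tabby
# Then point the extension at the local server started above, e.g. in settings.json
# (assumed setting name, check the extension's settings):
#   "tabby.api.endpoint": "http://localhost:8080"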


Other Coding Assistants