Open Web UI - Using Ollama, Made Simpler
Run models locally. Use Ollama with Open Web UI to get a comfortable interface for LLMs.
Markdown is just amazing, for note-taking and even website creation.
Website Analytics + Uptime Monitoring + Server Status in one Docker container. Simply Tianji.
A step-by-step guide to self-hosting FreshRSS with Docker on your server.
Setting up Fail2Ban with NGINX using Docker.