How to start with Streamlit

  1. Install Python
  2. Get your IDE Ready
  3. Install the Dependencies
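
With Python, your IDE, and the dependencies (streamlit, at minimum) in place, a one-file app is enough to confirm everything works. This is only a minimal sketch; the file name app.py is an arbitrary choice.

# app.py -- minimal sanity check that the setup works
# Run it with: streamlit run app.py
import streamlit as st

st.title("Hello, Streamlit 👋")
st.write("If you can read this in your browser, the setup is ready.")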

Get the Local LLMs Ready

  1. Install Docker (Optional)
  2. Install Portainer (Optional)
  3. Run Ollama
  4. Download Llama 3 with Ollama
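
Before wiring Ollama into Streamlit, it helps to confirm the local server is actually reachable. A quick check from Python might look like this (it assumes Ollama's default address, http://localhost:11434, and the requests package):

# check_ollama.py -- confirm the local Ollama server is up and list pulled models
import requests

resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()
models = [m["name"] for m in resp.json().get("models", [])]
print("Ollama is running. Local models:", models or "none pulled yet")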

Using Llama 3 with Streamlit

ollama pull llama3:8b
ollama list   # optional: confirm llama3:8b now appears in the local models

ollama run llama3:8b
Want a Small but Powerful Model? Try These 👇

ollama run qwen2:0.5b

ollama run phi3:3.8b

ollama run gemma:2b

ollama run deepseek-coder:1.3b
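
Once a model is pulled, wiring it into Streamlit takes only a few lines. The sketch below assumes the ollama and streamlit Python packages are installed (pip install ollama streamlit) and that llama3:8b is available locally; swap the model tag for any of the smaller models above.

# streamlit_llama3.py -- run with: streamlit run streamlit_llama3.py
import ollama
import streamlit as st

st.title("Chat with Llama 3 (local)")
prompt = st.text_area("Your prompt", "Explain Streamlit in two sentences.")

if st.button("Ask Llama 3"):
    with st.spinner("Thinking..."):
        reply = ollama.chat(
            model="llama3:8b",  # any locally pulled model tag works here
            messages=[{"role": "user", "content": prompt}],
        )
    st.write(reply["message"]["content"])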

Get Access to the Commercial LLMs

These LLMs are great, but are not F/OSS ❎

How to get an OpenAI API key
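
OpenAI keys are created in the OpenAI platform dashboard. Once you have one, a minimal sanity-check call could look like this sketch (the model name and the OPENAI_API_KEY environment variable are example choices, not fixed requirements):

import os
from openai import OpenAI  # pip install openai

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
resp = client.chat.completions.create(
    model="gpt-4o-mini",  # example model; use any model your key can access
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)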

How to get Google’s Gemini API key
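
Gemini API keys come from Google AI Studio. A minimal test with the google-generativeai package might look like this (the model name and the GOOGLE_API_KEY environment variable are example choices):

import os
import google.generativeai as genai  # pip install google-generativeai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-flash")  # example model name
resp = model.generate_content("Say hello in one sentence.")
print(resp.text)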

How to get Anthropic’s Claude API key
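
Claude keys are issued through the Anthropic console. A minimal test with the anthropic package could look like this (the model name, token limit, and ANTHROPIC_API_KEY environment variable are example choices):

import os
import anthropic  # pip install anthropic

client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
msg = client.messages.create(
    model="claude-3-haiku-20240307",  # example model name
    max_tokens=256,                   # Anthropic requires an explicit max_tokens
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(msg.content[0].text)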

How to get a Groq API key
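
Groq keys come from the Groq console, and the groq package mirrors the OpenAI-style chat interface, so the check looks almost identical (the model name and GROQ_API_KEY environment variable are example choices):

import os
from groq import Groq  # pip install groq

client = Groq(api_key=os.environ["GROQ_API_KEY"])
resp = client.chat.completions.create(
    model="llama3-8b-8192",  # example of a model hosted on Groq
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)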

How to use the Multi-Model Streamlit App
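
The core idea of the app is a sidebar selector that routes each prompt to the chosen backend. Below is a stripped-down sketch covering the local Ollama model plus two hosted providers; the model names, environment variables, and provider list are assumptions rather than the article's exact choices, and Gemini or Claude would plug in the same way with their own message formats. Run it with streamlit run multi_model_app.py and pick a provider from the sidebar.

# multi_model_app.py -- run with: streamlit run multi_model_app.py
import os
import streamlit as st

st.title("Multi-Model Chat")
provider = st.sidebar.selectbox("Provider", ["Ollama (local)", "OpenAI", "Groq"])

if "history" not in st.session_state:
    st.session_state.history = []  # list of {"role": ..., "content": ...} dicts

def ask(messages):
    # Route the conversation to the selected backend and return the reply text.
    if provider == "Ollama (local)":
        import ollama
        return ollama.chat(model="llama3:8b", messages=messages)["message"]["content"]
    if provider == "OpenAI":
        from openai import OpenAI
        client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
        resp = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
        return resp.choices[0].message.content
    from groq import Groq
    client = Groq(api_key=os.environ["GROQ_API_KEY"])
    resp = client.chat.completions.create(model="llama3-8b-8192", messages=messages)
    return resp.choices[0].message.content

# Replay the conversation so far, then handle the next prompt.
for msg in st.session_state.history:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if prompt := st.chat_input("Ask something..."):
    st.session_state.history.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)
    reply = ask(st.session_state.history)
    st.session_state.history.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.markdown(reply)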


Conclusions

FAQ