You have been using Python for a while and now you want to share some cool apps with your colleagues.

The moment to be an enabler is really close, but your colleagues might not be that comfortable with code.

Fortunately, there are a couple of Python libraries that build on our code and provide user-friendly UIs, so we can share our work as a web app.

Get end users into the loop fast with these Python UIs.

We will cover several of them:

  1. Streamlit

  2. Chainlit

  3. Gradio

  4. Others

  • Taipy - Data and AI into production web apps in Python
  • Mesop - Build AI Apps in Python

F/OSS Libraries to Create Quick AI Apps

Streamlit

| Project Details | References |
| --- | --- |
| Official Site | Streamlit Site |
| Code Base | Streamlit Source Code at Github |
| License | Apache v2 ✅ |

Streamlit Example
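As a minimal sketch of a Streamlit chat app (the `make_reply` echo function is a stand-in for a real LLM call, and the file name `app.py` is an assumption):

```python
# Minimal Streamlit chat sketch. Run with: streamlit run app.py
def make_reply(prompt: str) -> str:
    """Stand-in 'model': just echoes the user prompt. Swap in an LLM call."""
    return f"You said: {prompt}"


def main() -> None:
    import streamlit as st  # imported lazily so make_reply stays testable

    st.title("Minimal chat")
    if "history" not in st.session_state:
        st.session_state.history = []
    # Re-render the conversation so far on every rerun.
    for role, text in st.session_state.history:
        st.chat_message(role).write(text)
    if prompt := st.chat_input("Say something"):
        reply = make_reply(prompt)
        st.session_state.history.append(("user", prompt))
        st.session_state.history.append(("assistant", reply))
        st.chat_message("user").write(prompt)
        st.chat_message("assistant").write(reply)


if __name__ == "__main__":
    try:
        main()
    except ModuleNotFoundError:
        print("Streamlit is not installed; try: pip install streamlit")
```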

Awesome Streamlit Public Resources

You can use Streamlit together with PyGWalker - Example -> https://github.com/Jcharis/Streamlit_DataScience_Apps/blob/master/Using_PyGWalker_With_Streamlit/app.py

Make a better Streamlit UI with a navigation menu - https://github.com/Sven-Bo/streamlit-navigation-menu/blob/master/streamlit_menu_demo.py and https://www.youtube.com/watch?v=hEPoto5xp3k

What kind of AI-Apps can you do with Streamlit?

You can Create a ChatGPT Clone with Streamlit

Even an App to Summarize YouTube Videos

A very interesting analysis (this Streamlit app is not F/OSS, unfortunately) - https://state-of-llm.streamlit.app/

Chainlit

Chainlit is like Streamlit, but focused purely on AI apps:

| Project Details | References |
| --- | --- |
| Official Docs | The Chainlit Project Documentation |
| Code Base | The Chainlit Source Code at GitHub |
| License | Apache v2 ✅ |
  • Fast Development: Chainlit boasts seamless integration with existing codebases, allowing you to incorporate AI functionalities quickly. You can also start from scratch and build your conversational AI in minutes.
  • Multi-Platform Support: Write your conversational AI logic once and deploy it across various platforms effortlessly. This flexibility ensures your AI is accessible from wherever your users interact.
  • Data Persistence: Chainlit offers functionalities for data collection, monitoring, and analysis. This allows you to analyze user interactions, improve your AI’s performance over time, and gain valuable insights from user behavior.

Chainlit Example

Example ChainLit Apps
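Assuming Chainlit's standard decorator API, a minimal echo bot for `app.py` could look like this (`build_reply` is a placeholder for your LLM call):

```python
def build_reply(text: str) -> str:
    """Placeholder logic; swap in a real LLM call here."""
    return f"Echo: {text}"


try:
    import chainlit as cl

    @cl.on_message
    async def on_message(message: cl.Message) -> None:
        # Called once per user message; sends the reply back to the chat UI.
        await cl.Message(content=build_reply(message.content)).send()
except ModuleNotFoundError:
    pass  # chainlit not installed; build_reply is still importable/testable
```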

To run Chainlit, we need:

```sh
chainlit run app.py --port 8090  # defaults to port 8000
```

Chainlit is now ready at localhost:8090.

Gradio

Gradio is a fantastic open-source Python library that streamlines the process of building user interfaces (UIs) for various purposes.

It empowers you to create interactive demos, share your work with ease, or provide user-friendly interfaces for your machine learning models or Python functions.


| Project Details | References |
| --- | --- |
| Official Web | The Gradio Site |
| Code Base | The Gradio Source Code at Github |
| License | Apache v2 ✅ |


Example of Gradio Apps
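A minimal sketch of a Gradio interface, with a trivial `greet` function standing in for a real model:

```python
def greet(name: str) -> str:
    """Toy 'model': a real app would call an ML model here."""
    return f"Hello, {name}!"


if __name__ == "__main__":
    try:
        import gradio as gr

        # Wraps greet in a web UI with one text input and one text output.
        gr.Interface(fn=greet, inputs="text", outputs="text").launch()
    except ModuleNotFoundError:
        print("Gradio is not installed; try: pip install gradio")
```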

Other Quick UI for AI Apps in Python

Taipy

Taipy turns data and AI algorithms into production-ready web apps, with Python only.

| Project Details | References |
| --- | --- |
| Official Web | The Taipy Site |
| Code Base | The Taipy Source Code at Github |
| License | Apache v2 ✅ |


```sh
pip install taipy
```

A Sample of Taipy:
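A minimal sketch using Taipy's GUI builder (the page syntax below follows the typical Taipy binding pattern and is an assumption, not copied from the docs):

```python
# Two-way binding: the input widget updates `name`, which re-renders the title.
name = "Taipy"
page = """
# Hello *{name}*

Type your name: <|{name}|input|>
"""

if __name__ == "__main__":
    try:
        from taipy.gui import Gui

        Gui(page).run()
    except ModuleNotFoundError:
        print("Taipy is not installed; try: pip install taipy")
```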

Mesop

With Mesop, you can build delightful web apps quickly in Python.
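A minimal Mesop sketch, assuming the `@me.page` / `me.text` API (run it with `mesop app.py`; treat the details as an assumption to verify against the Mesop docs):

```python
def greeting() -> str:
    return "Hello from Mesop"


try:
    import mesop as me

    @me.page(path="/")
    def home():
        # Renders a single line of text on the root page.
        me.text(greeting())
except ModuleNotFoundError:
    pass  # mesop not installed; greeting() is still importable/testable
```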

How to Make the most of your AI-Apps Project

Learning how to effectively use and manage AI applications can significantly increase the efficiency and success of your projects.

Learn to use Docker

By using Docker, you can ensure that your AI application will run the same way, regardless of the environment it’s deployed in.

This significantly reduces the potential for bugs caused by differences in local development environments, and makes it easier to collaborate with others.
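For example, a hypothetical Dockerfile for a Streamlit app (assuming an `app.py` and a `requirements.txt` at the project root) could look like:

```Dockerfile
FROM python:3.11-slim
WORKDIR /app
# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8501
CMD ["streamlit", "run", "app.py", "--server.address=0.0.0.0"]
```

Build and run with `docker build -t myapp .` and `docker run -p 8501:8501 myapp`.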

You can consider Podman as an alternative.

Leverage CI/CD

You don't have to, but CI/CD will make your workflow faster.
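As an illustration, a hypothetical GitHub Actions workflow (`.github/workflows/ci.yml`) that runs your tests on every push might look like:

```yaml
# Hypothetical workflow: lint-free install and test run on every push
name: ci
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt pytest
      - run: pytest
```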

Vector Stores for AI

Vector stores play a crucial role in RAG systems by enabling efficient storage and retrieval of high-dimensional vectors representing text or other data.

They provide a foundation for similarity search and help in finding relevant information quickly. Here are some comments on vector stores and the provided links:

More about Vector DataBases ⏬

  1. The Vector Admin Project
    • Vector Admin is an open-source project that simplifies the management and visualization of vector databases.
    • It provides a user-friendly interface for interacting with vector stores and performing tasks like indexing, querying, and monitoring.
  2. FOSS Vector DBs for AI Projects

    • There are several open-source vector database options available for AI projects.
    • These vector stores offer scalability, fast similarity search, and integration with various programming languages and frameworks.
    • Some popular FOSS vector databases include Faiss, Milvus, Qdrant, and Weaviate (Pinecone, by contrast, is a managed closed-source service).
  3. ChromaDB
    • ChromaDB is an open-source vector database designed for easy integration and fast similarity search.
    • It provides a simple and intuitive API for storing and retrieving vectors.
    • ChromaDB supports various distance metrics and offers features like filtering and metadata handling.
    • It can be self-hosted using Docker, making it convenient for deployment and management.
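A short sketch of the ChromaDB client API using the in-memory client (collection name and documents are made up for illustration; the default embedding model is downloaded on first use):

```python
try:
    import chromadb  # pip install chromadb

    client = chromadb.Client()  # in-memory; use a persistent client in production
    docs = client.create_collection("demo_docs")
    docs.add(
        documents=["Streamlit builds data apps", "Chainlit builds chat UIs"],
        ids=["d1", "d2"],
    )
    # query returns the ids of the most similar stored documents.
    result_ids = docs.query(query_texts=["chat interface"], n_results=1)["ids"][0]
except Exception:  # chromadb missing, or the embedding model unavailable offline
    result_ids = None

print(result_ids)
```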

Vector stores are essential components in RAG systems as they enable efficient retrieval of relevant information based on the similarity of vectors.

They help in scaling the retrieval process and improving the overall performance of RAG applications.
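The idea behind similarity search can be shown with a tiny in-memory sketch (pure Python, no real vector database; the vectors and document names are invented):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Tiny in-memory "vector store": find the stored vector most similar to the query.
store = {"doc1": [1.0, 0.0], "doc2": [0.7, 0.7], "doc3": [0.0, 1.0]}
query = [0.9, 0.1]
best = max(store, key=lambda k: cosine_similarity(store[k], query))
print(best)  # doc1
```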

RAGs

RAG (Retrieval-Augmented Generation) is a technique that enhances the capabilities of large language models (LLMs) by allowing them to access and process information from external sources.

It follows a three-step approach:

  • Retrieve relevant information based on the user query (retrieve)
  • Process the retrieved information and formulate additional questions (ask)
  • Use the retrieved and processed information to generate a comprehensive response (generate)
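The steps above can be sketched end-to-end with a naive keyword retriever standing in for a real vector store (all names and documents here are illustrative):

```python
def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Step 1: naive keyword-overlap retrieval (a real system uses embeddings)."""
    q_words = set(query.lower().replace("?", "").split())
    scored = sorted(corpus,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]


def build_prompt(query: str, context: list[str]) -> str:
    """Steps 2-3: pack the retrieved context into a prompt for the LLM."""
    joined = "\n".join(f"- {doc}" for doc in context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {query}"


corpus = [
    "Streamlit turns scripts into web apps",
    "Chainlit targets conversational AI apps",
    "Docker packages apps with their dependencies",
]
query = "which library targets conversational AI apps?"
context = retrieve(query, corpus)
print(build_prompt(query, context))
```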

How to get HTTPS for your AI App

Feel free to use Cloudflare Tunnels or a reverse proxy like NGINX.
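For the NGINX route, a hypothetical reverse-proxy server block terminating TLS in front of an app on port 8501 might look like this (the domain and certificate paths are placeholders):

```nginx
server {
    listen 443 ssl;
    server_name myapp.example.com;

    ssl_certificate     /etc/letsencrypt/live/myapp.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/myapp.example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:8501;
        proxy_set_header Host $host;
        # WebSocket upgrade headers, needed by Streamlit/Chainlit live updates.
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```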


FAQ

How to Take Gen AI Apps one Step further?

  1. Master Prompt Engineering
  2. Understand RAGs and make the most of Vector Stores
  3. AgentOps / LLMOps

What are some bundled Gen AI Apps to use with Docker?

PrivateGPT

What are the best F/OSS LLMs right now?


  • Qwen2
  • Mistral
  • LLama3
  • Nemotron-4-340B-Instruct
  • Yi-1.5-34B-Chat
  • LLM For Coding

You can run LLMs locally with Ollama and Open Web UI

What is Hugging Face?

Hugging Face is a multifaceted platform that caters to the needs of developers and researchers working with artificial intelligence.

  • Model Hub: The Model Hub serves as a central repository for pre-trained LLMs in various domains like text generation, translation, and question answering. You can explore these models, download them for use in your projects, and even fine-tune them on your own data to improve their performance for specific tasks.

  • Datasets: Hugging Face provides access to a rich collection of datasets specifically designed for training and evaluating LLMs. This includes datasets for text summarization, sentiment analysis, dialogue systems, and more.

  • Build Your Own AI Applications: Use these pre-trained models as building blocks to create your own custom AI applications without starting from scratch.

  • Share and Collaborate: Share your own AI models and projects with the Hugging Face community, and learn from what others are building.

Useful F/OSS VSCode Extensions

If you are new to these Python frameworks for building cool UIs, you can get help from these extensions:

Continue Dev

⏩ Continue is the leading open-source AI code assistant. You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains

The easiest way to code with any LLM - An open-source autopilot in your IDE

https://continue.dev/docs/reference/Model%20Providers/ollama

TIP: ollama run starcoder2:3b (for code autocomplete) + ollama run llama3 (as chat)

AI-Genie

A Visual Studio Code - ChatGPT Integration.

Supports GPT-4o, GPT-4 Turbo, GPT-3.5 Turbo, GPT-3, and Codex models (not F/OSS models ❎)

Create new files, view diffs with one click; your copilot to learn code, add tests, find bugs and more.

Alita Code

How to use AI to Code

Specify what you want it to build; the AI asks for clarification and then builds it.