🦙 Local and online AI hub
Updated Aug 25, 2025 - Python
✨ AI interface for tinkerers (Ollama, Haystack RAG, Python)
A single-file tkinter-based Ollama GUI project with no external dependencies.
Your gateway to both Ollama & Apple MLX models
Desktop UI for Ollama built with PyQt
Lets users interact with an LLM via Ollama through a client-server architecture, with FastAPI as the server-side framework and Streamlit for the user interface.
Simple Ollama GUI app written in Python. Everything you need for your LLM.
SparkOllama is a Streamlit-powered web UI for Ollama, built for seamless AI-powered conversations. It uses LLMs served by Ollama to deliver smart, dynamic responses. Whether for work or fun, SparkOllama makes engaging with AI simple and intuitive.
Ollamate is an AI assistant that uses Ollama to run your favorite local LLMs.
A simple PDF AI chatbot that runs locally on your machine with a Streamlit UI
Spring break project for easier access to Ollama language models.
A simple GUI for locally hosted LLMs (via Ollama)
This desktop application, built with customtkinter, provides an interactive chat interface for local Large Language Models (LLMs) served via Ollama.
A complete LLM-based medical symptom classifier that processes user symptoms through a multi-stage pipeline: User Query → RoBERTa Classifier → Symptom Retriever (RAG) → Prompt Builder → Local LLM → Final Output
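The pipeline stages named above can be sketched as a chain of small functions. Every component here is a stub for illustration only: the real project uses a RoBERTa model for classification, a RAG retriever over a symptom knowledge base, and a local LLM served via Ollama.

```python
# Toy sketch of the pipeline: query -> classifier -> retriever ->
# prompt builder -> LLM. All stages are stubs standing in for the
# real components (RoBERTa, RAG retriever, Ollama-served LLM).

def classify(query: str) -> str:
    # Stub for the RoBERTa symptom classifier.
    return "respiratory" if "cough" in query.lower() else "general"


def retrieve(label: str) -> list[str]:
    # Stub for the RAG symptom retriever: fetch reference snippets
    # for the predicted category.
    corpus = {
        "respiratory": ["Persistent cough may indicate bronchitis."],
        "general": ["Insufficient data; ask follow-up questions."],
    }
    return corpus.get(label, [])


def build_prompt(query: str, context: list[str]) -> str:
    # Stub for the prompt builder: merge retrieved context and query.
    joined = "\n".join(context)
    return f"Context:\n{joined}\n\nQuestion: {query}"


def run_llm(prompt: str) -> str:
    # Stub for the local LLM call (e.g. an Ollama /api/generate request).
    n_lines = prompt.count("\n") + 1
    return f"[stub LLM answer over a {n_lines}-line prompt]"


def pipeline(query: str) -> str:
    return run_llm(build_prompt(query, retrieve(classify(query))))


print(pipeline("I have a dry cough and mild fever"))
```

The design point the pipeline illustrates is separation of concerns: the classifier narrows the retrieval space before the LLM is consulted, so the prompt carries only context relevant to the predicted symptom category.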
A Streamlit web application for interacting with Large Language Models (LLMs) and managing knowledge bases using Retrieval-Augmented Generation (RAG). It allows users to chat directly with local LLMs, query custom RAG memories, and interact with PDF documents.