Add CLI example for streaming Ollama responses live_ollama_cli.py #551
🚀 Add a Live Streaming CLI for Real-Time Interaction with Ollama
This PR introduces an interactive CLI tool at `examples/live_ollama_cli.py` that shows how to connect to Ollama's local REST API and stream replies character by character, much like chatting with a live AI assistant.

🎯 Highlights:
- Uses only `requests` (no heavy dependencies)

📍 Use Case:
Perfect for developers who want a fast, minimal, and engaging way to test Ollama locally without writing a full app or using Python notebooks.
🧪 Tested with:
- Model: `tinyllama`
- Endpoint: `http://localhost:11434`
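
For readers skimming the PR, here is a minimal sketch of the streaming approach described above, assuming Ollama's standard `/api/generate` endpoint on the default port and the `tinyllama` model from the test setup. The `stream_response` helper and the prompt loop are illustrative, not the exact contents of `examples/live_ollama_cli.py`:

```python
# Minimal sketch of character-by-character streaming from Ollama's REST API
# using only `requests`. Illustrative; not the exact contents of
# examples/live_ollama_cli.py.
import json

import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint
MODEL = "tinyllama"  # model named in the PR's test setup


def stream_response(prompt: str) -> None:
    """Send a prompt and print the reply character by character as it arrives."""
    payload = {"model": MODEL, "prompt": prompt, "stream": True}
    with requests.post(OLLAMA_URL, json=payload, stream=True, timeout=120) as resp:
        resp.raise_for_status()
        # Ollama streams newline-delimited JSON objects; each carries a
        # "response" text fragment, and the final object sets "done" to true.
        for line in resp.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)
            for ch in chunk.get("response", ""):
                print(ch, end="", flush=True)  # live, character-by-character feel
            if chunk.get("done"):
                print()
                break


if __name__ == "__main__":
    while True:
        try:
            prompt = input(">>> ")
        except (EOFError, KeyboardInterrupt):
            break
        if prompt.strip():
            stream_response(prompt)
```

With `ollama serve` running and the model pulled via `ollama pull tinyllama`, the loop prints each reply as it streams in.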
Thanks for checking this out! Hope this example helps more users start building cool interfaces with Ollama. 🙌