A wrapper over open-source LLM models like Qwen/LLaMA that makes them do func-ky stuff!
Note: This is just a proof of concept and may not function as a product. Make sure to install Ollama and pull the Qwen2.5:3b model before using the app.
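If you want to verify the prerequisites first, a minimal check like the one below can confirm that Ollama is running and the model has been pulled. It uses Ollama's standard local REST API on its default port (11434); the script itself is not part of this repo.

```python
# Quick sanity check that Ollama is running locally and the model is pulled.
# Assumes Ollama's default API port (11434); adjust if yours differs.
import requests

resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()
models = [m["name"] for m in resp.json().get("models", [])]
if any(name.startswith("qwen2.5:3b") for name in models):
    print("qwen2.5:3b is available")
else:
    print("Model not found - run: ollama pull qwen2.5:3b")
```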
Prologue: If you run a base open-source LLM, it has general knowledge but cannot answer simple questions like "What is the date today?" because it
- only has knowledge up to its training cut-off date.
- does not have access to the internet to retrieve the information needed to answer such questions.
This PoC attempts to provide the LLM with information at runtime by defining functions it can call after understanding the context of the user's request.
Demo:
1: The base model is unable to answer "What is the date today?"
2: We create a function that can get the system date (see the sketch below)
3: The model can now access this function and answer questions about dates
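For illustration, the date function from the demo could be as simple as the following. The function name and the registry shown here are hypothetical; the app's actual implementation may differ.

```python
# Hypothetical example of a function the model can be given access to.
# The actual function name and registry used by the app may differ.
from datetime import date

def get_current_date() -> str:
    """Return today's date in ISO format (YYYY-MM-DD)."""
    return date.today().isoformat()

# A simple registry the wrapper could consult when the model needs a tool.
FUNCTIONS = {
    "get_current_date": {
        "description": "Returns today's date in YYYY-MM-DD format.",
        "callable": get_current_date,
    },
}

print(FUNCTIONS["get_current_date"]["callable"]())  # e.g. 2025-01-01
```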
The above example should already give you an idea of the potential here. A few simple possibilities:
- Get weather details (see the sketch after this list).
- Get the latest news.
- Manage reminders.
- Search the internet.
- Query databases.
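As an example of the first idea, a weather helper could wrap a free service such as wttr.in. This is just one possible implementation, not necessarily what the app generates.

```python
# Hypothetical weather helper using the free wttr.in service.
# Functions generated by the app may use a different provider.
import requests

def get_weather(city: str) -> str:
    """Return a one-line weather summary for the given city."""
    resp = requests.get(f"https://wttr.in/{city}", params={"format": "3"}, timeout=10)
    resp.raise_for_status()
    return resp.text.strip()

print(get_weather("London"))  # e.g. "London: +8°C"
```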
The application will be available at http://localhost:8080
- Open the main chat interface
- Type your request for a function in natural language
- The AI will generate appropriate Python code
- Use the copy button to copy the generated code
- Toggle dark/light mode using the settings gear icon
- Click "Manage Functions" in the settings menu
- Add new functions (an illustrative record of these fields follows this list):
- Enter function name and description
- Use "Generate Function" to create function body using AI
- Add example questions
- Test the function before saving
- Edit existing functions:
- Click "Edit" on any function
- Modify details as needed
- Update or regenerate the function body
- Test functions:
- Use the "Test" button to try out functions
- Enter test parameters
- View results immediately
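To make the fields in the Manage Functions dialog concrete, a saved function might boil down to a record like the one below. The field names are illustrative only, not the app's actual storage schema.

```python
# Illustrative only: the kind of record the Manage Functions dialog collects.
# The real storage schema (SQLite) may use different field names.
new_function = {
    "name": "get_current_date",
    "description": "Returns today's date so the model can answer date questions.",
    "example_questions": [
        "What is the date today?",
        "What day is it?",
    ],
    "body": (
        "def get_current_date():\n"
        "    from datetime import date\n"
        "    return date.today().isoformat()\n"
    ),
}

print(new_function["name"], "-", new_function["description"])
```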
- Click "Generate Function" button
- Enter a detailed description of what you want the function to do
- The AI will generate appropriate Python code
- Review and modify the generated code if needed
- Test the function before saving (a minimal test sketch follows below)
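One simple way to test a stored function body before saving it is to load it into a scratch namespace and call it with sample parameters. The snippet below is a minimal sketch of that idea, not the app's built-in testing code.

```python
# Minimal sketch of testing a stored function body before saving it.
# This is not the app's built-in tester, just the underlying idea.
body = (
    "def days_until(date_str):\n"
    "    from datetime import date\n"
    "    target = date.fromisoformat(date_str)\n"
    "    return (target - date.today()).days\n"
)

namespace = {}
exec(body, namespace)                      # load the function definition
result = namespace["days_until"]("2030-01-01")
print("days until 2030-01-01:", result)
```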
- Natural language interaction
- Code syntax highlighting
- Copy code functionality
- Message history
- Dark/Light theme support
- Create, edit, and delete functions
- AI-powered function generation
- Built-in testing interface
- Example management
- Function organization
- Real-time function testing
- Parameter input
- Result visualization
- Error handling
- Test history
The application can be configured through config.py (a sketch follows the list below):
- Available AI models
- Default model selection
- Other application settings
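The exact settings depend on the repo, but a config.py along these lines would cover the options listed above. The variable names below are assumptions, not the actual configuration keys.

```python
# config.py - hypothetical sketch; the actual variable names in this repo may differ.

# Models the UI offers (must already be pulled in Ollama).
AVAILABLE_MODELS = ["qwen2.5:3b", "llama3.2:3b"]

# Model used when the user has not picked one.
DEFAULT_MODEL = "qwen2.5:3b"

# Other application settings.
OLLAMA_URL = "http://localhost:11434"
HOST = "0.0.0.0"
PORT = 8080
DEBUG = False
```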
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Ollama for the AI models
- Flask framework
- SQLite for database management