Transform how AI understands code through graph-native semantic intelligence.
```bash
# Clone this repository
git clone --recursive https://github.com/Consiliency/codegraph
cd codegraph

# Start the development environment
./scripts/start-dev.sh

# Run the demo
./scripts/demo.sh
```
- Docker 20.10+
- Docker Compose 2.0+
- Node.js 18+
- Python 3.11+
- 16GB RAM recommended
- 20GB free disk space
```
┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│   Browser   │────▶│ TypeScript  │────▶│   Python    │
│ (Dashboard) │     │     API     │     │  Analysis   │
└─────────────┘     └─────────────┘     └─────────────┘
                           │                   │
                           ▼                   ▼
                    ┌─────────────┐     ┌─────────────┐
                    │  Memgraph   │     │   Qdrant    │
                    │   (Graph)   │     │  (Vectors)  │
                    └─────────────┘     └─────────────┘
```
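To make the graph side of this diagram concrete, here is a minimal TypeScript sketch of how the API layer might read the code graph from Memgraph over Bolt using the standard `neo4j-driver` (Memgraph speaks the Bolt protocol). The port, node labels, relationship types, and property names are illustrative assumptions, not the project's actual schema.

```typescript
import neo4j from "neo4j-driver";

// Assumption: Memgraph's Bolt endpoint on its default port, no auth.
const driver = neo4j.driver("bolt://localhost:7687");

// Hypothetical schema: (:Function)-[:CALLS]->(:Function {name}).
async function callersOf(functionName: string): Promise<string[]> {
  const session = driver.session();
  try {
    const result = await session.run(
      "MATCH (caller:Function)-[:CALLS]->(callee:Function {name: $name}) " +
        "RETURN caller.name AS name",
      { name: functionName }
    );
    return result.records.map((r) => r.get("name") as string);
  } finally {
    await session.close();
  }
}

callersOf("parse_file")
  .then((names) => console.log("callers:", names))
  .finally(() => driver.close());
```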
- `codegraph-core`: High-performance C++ engine (Tree-sitter, BLAKE3)
- `codegraph-api`: TypeScript GraphQL/REST API server
- `codegraph-analysis`: Python ML workflows and analysis
- `codegraph-proto`: Protocol Buffer definitions
- `codegraph-deploy`: Docker Compose and infrastructure
- Clone and setup:

  ```bash
  git clone --recursive https://github.com/Consiliency/codegraph
  cd codegraph
  cp .env.example .env
  # Edit .env and add your OpenAI API key
  ```

- Start services:

  ```bash
  ./scripts/start-dev.sh
  ```

- Access the system:
  - Dashboard: http://localhost:4000/dashboard/
  - GraphQL: http://localhost:4000/graphql (example query below)
  - Memgraph: http://localhost:3000
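As a quick smoke test against the GraphQL endpoint, the hypothetical TypeScript snippet below posts a query using Node 18's built-in `fetch`. The operation and field names are invented placeholders; check the schema served at the GraphQL endpoint for the real operations.

```typescript
// Hypothetical operation and fields -- adjust to the actual schema.
const query = `
  query Search($term: String!) {
    searchCode(term: $term) {
      file
      symbol
      score
    }
  }
`;

async function searchCode(term: string) {
  const res = await fetch("http://localhost:4000/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, variables: { term } }),
  });
  const { data, errors } = await res.json();
  if (errors) throw new Error(JSON.stringify(errors));
  return data.searchCode;
}

searchCode("parse AST").then((hits) => console.log(hits));
```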
- ✅ Multi-language AST parsing (Python, JavaScript, TypeScript)
- ✅ Graph-based code representation
- ✅ Semantic search with OpenAI embeddings (see the sketch below)
- ✅ Real-time monitoring dashboard
- ✅ Protocol buffer communication
- ✅ 70% LLM token reduction (projected)
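The semantic-search feature pairs OpenAI embeddings with the Qdrant vector store; a minimal TypeScript sketch of that flow is shown below. The collection name, embedding model, and Qdrant port are assumptions for illustration and may differ from what the indexing pipeline actually uses.

```typescript
import OpenAI from "openai";
import { QdrantClient } from "@qdrant/js-client-rest";

// Assumptions: OPENAI_API_KEY is set in .env, Qdrant is on its default REST
// port, and code chunks were indexed into a collection named "code_chunks"
// with the same embedding model used here.
const openai = new OpenAI();
const qdrant = new QdrantClient({ url: "http://localhost:6333" });

async function semanticSearch(question: string) {
  // Embed the natural-language query with OpenAI.
  const embedding = await openai.embeddings.create({
    model: "text-embedding-3-small",
    input: question,
  });

  // Retrieve the nearest code chunks from Qdrant by vector similarity.
  const hits = await qdrant.search("code_chunks", {
    vector: embedding.data[0].embedding,
    limit: 5,
    with_payload: true,
  });

  return hits.map((h) => ({ score: h.score, payload: h.payload }));
}

semanticSearch("where are AST nodes hashed with BLAKE3?").then(console.log);
```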
See CONTRIBUTING.md for development guidelines.
Apache License 2.0