Give Your AI
Superpowers Over Your Codebase
One SQLite file. 170+ programming languages. Infinite AI possibilities. Enable semantic search, relationship mapping, and persistent memory for any AI coding agent—completely local and private.
pip install devscriptor
Everything You Need
for AI-Powered Development
A complete toolkit for turning your codebase into an AI-navigable knowledge graph
Single SQLite Database
All your codebase knowledge in one portable file. Query with SQL, perform semantic search with vectors, and maintain a persistent knowledge graph that any AI agent can access.
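Because everything lives in one SQLite file, you can inspect it with plain SQL. The sketch below is illustrative only: it builds an in-memory stand-in with a hypothetical `entities(name, kind, file_path)` table (the real Devscriptor schema may differ) just to show the query shape.

```python
import sqlite3

# Stand-in for the Devscriptor database; the real schema may differ.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entities (name TEXT, kind TEXT, file_path TEXT)")
conn.executemany(
    "INSERT INTO entities VALUES (?, ?, ?)",
    [
        ("validate_token", "function", "src/auth.py"),
        ("AuthMiddleware", "class", "src/auth.py"),
        ("process_data", "function", "src/etl.py"),
    ],
)

# Plain SQL: every function defined under src/auth
rows = conn.execute(
    "SELECT name FROM entities WHERE kind = 'function' AND file_path LIKE 'src/auth%'"
).fetchall()
print([r[0] for r in rows])  # → ['validate_token']
```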
170+ Languages
Universal code analysis powered by tree-sitter. From Python and JavaScript to Haskell, Rust, and Zig—if it has a grammar, Devscriptor understands it.
Context Graph Memory
AI agents that remember. Store architectural decisions, coding standards, and project knowledge across sessions. Your AI gains long-term memory that persists beyond a single conversation.
100% Local & Private
Your code never leaves your machine. Completely sandboxed, isolated environment with zero external dependencies. Perfect for proprietary codebases and security-conscious teams.
MCP Server Built-in
Native integration with Claude Desktop, Cline, and any MCP-compatible AI agent. 20+ tools exposed including semantic search, dead code detection, and complexity analysis.
Semantic Search
Find code by meaning, not just keywords. Vector embeddings powered by FastEmbed or SentenceTransformers enable natural language code discovery across your entire codebase.
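Under the hood, "search by meaning" amounts to ranking embedding vectors by cosine similarity. The toy sketch below shows only the ranking mechanics with hand-made 3-d vectors; in Devscriptor the real vectors come from FastEmbed or SentenceTransformers.

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product over the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Hand-made toy embeddings; a real embedding model produces these.
code_vectors = {
    "authenticate_user": [0.9, 0.1, 0.0],
    "render_chart":      [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # e.g. the embedding of "functions that handle login"

ranked = sorted(code_vectors, key=lambda n: cosine(query, code_vectors[n]), reverse=True)
print(ranked[0])  # → authenticate_user
```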
Relationship Mapping
Understand how your code connects. Automatic detection of inheritance, implementation, function calls, imports, and data flow relationships across files and modules.
Local LLM Ready
Works seamlessly with locally hosted LLMs via Ollama, LM Studio, or any OpenAI-compatible local server. No API keys, no rate limits, complete control.
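"OpenAI-compatible" means the request shape is the same for every local server. This sketch only constructs the payload for Ollama's default local endpoint without sending it; the URL and model name are assumptions, so adjust them for LM Studio or your own setup.

```python
import json

# Ollama's OpenAI-compatible endpoint (default port); adjust for your server.
url = "http://localhost:11434/v1/chat/completions"

payload = {
    "model": "llama3",  # whatever model you have pulled locally
    "messages": [
        {"role": "user", "content": "Summarize what src/auth.py does."}
    ],
}

# No API key required for a local server; send with any HTTP client,
# e.g. requests.post(url, json=payload)
print(json.dumps(payload, indent=2))
```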
How It Works
From codebase to AI superpowers in four simple steps
Analyze
Point Devscriptor at your codebase. It scans all files across 170+ languages, extracting functions, classes, relationships, and structure.
Store
All extracted knowledge is stored in a single SQLite file—entities, relationships, and vector embeddings for semantic search.
Query
Your AI agent connects via MCP and queries the knowledge graph. Find functions, understand relationships, search by meaning.
Intelligence
The AI responds with context-aware answers, referencing actual code relationships and maintaining knowledge across sessions.
[Diagram: your codebase (170+ languages) → Knowledge Graph → AI agents (Claude, Cline, etc.)]
Get Started in Minutes
Install with pip, configure MCP, and start querying your codebase
Install Devscriptor
Install the base package with pip
pip install devscriptor
Install with MCP Support
Includes Model Context Protocol server support
pip install devscriptor[mcp]
Quick Start
Start analyzing code in 3 lines
from devscriptor import Devscriptor
# Initialize
cg = Devscriptor()
# Analyze your codebase
result = cg.analyze_codebase("/path/to/your/code")
# Search for functions
functions = cg.search_functions("process_data")
Configure MCP (Optional)
Add to your Claude Desktop or Cline configuration for seamless AI integration
{
"mcpServers": {
"devscriptor": {
"command": "python",
"args": ["-m", "devscriptor.mcp"]
}
}
}
Works With Your Favorite AI Agents
Native MCP server with 20+ tools for seamless AI integration
Core Analysis
analyze_codebase - Analyze codebase structure
search_functions - Find functions by name
search_classes - Find classes by name
semantic_search - Find code by meaning
get_entity_details - Get full entity info
Advanced Analysis
find_dead_code - Detect unused code
analyze_complexity - Code complexity metrics
detect_clones - Find duplicate code
get_relationships - Entity relationships
Context Graph
add_context_note - Add knowledge notes
search_context - Search stored knowledge
describe_file - Create file description
get_file_context - Get file history
query_with_context - Context-first queries
Example Conversation
Find all functions that handle user authentication
I'll search for authentication-related functions using semantic search...
Found 5 functions: validate_token(), authenticate_user(), login(), verify_session(), check_permissions()
AI Agents That Remember
Context Graph is a long-term memory system that persists knowledge across AI sessions. Your AI assistant never forgets architectural decisions, coding standards, or project context.
File Context Workflow
Single-description-per-file pattern with modification chains. Continue exactly where you left off, even across days or weeks.
Code Linking
Link notes to specific functions, classes, or files. Automatically detects when linked code changes and flags stale context.
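A common way to detect that linked code has drifted from a note is to store a content hash at link time and compare it later. This is a conceptual sketch of that idea, not Devscriptor's actual mechanism; the `note` dict and `fingerprint` helper are illustrative.

```python
import hashlib

def fingerprint(source: str) -> str:
    # Content hash of the linked code at link time.
    return hashlib.sha256(source.encode()).hexdigest()

# At link time: remember what the linked function looked like.
linked_code = "def validate_token(token):\n    return decode(token)"
note = {"text": "Tokens expire after 15 minutes", "code_hash": fingerprint(linked_code)}

# Later: the function signature changed, so the note is flagged as stale.
current_code = "def validate_token(token, audience):\n    return decode(token, audience)"
note["stale"] = fingerprint(current_code) != note["code_hash"]
print(note["stale"])  # → True
```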
Context-First Queries
AI agents query Context Graph before Code Graph for efficiency. Fast semantic search across your project knowledge.
Structured Note Types
7 specialized note types for comprehensive knowledge capture
Architectural Decision Records
Document significant architectural decisions with context, consequences, and alternatives.
Fields: context, decision, consequences, alternatives
Coding Standards
Store coding conventions, style guides, and project standards.
Fields: scope, standard, rationale, enforcement
Performance Optimizations
Record performance improvements with before/after metrics.
Fields: problem, solution, before_metrics, after_metrics
Lessons Learned
Document failures and how to prevent them in the future.
Fields: error, cause, fix, prevention
Session Continuity in Action
WEEK 1
AI creates a file description for auth.py: “Handles JWT authentication with refresh token support”
WEEK 2
AI records modification: “Added OAuth2 support for Google login”
WEEK 3 - AI REMEMBERS
“I see this file handles JWT auth with refresh tokens and recently added OAuth2. The Google login flow is in the OAuth2 handler function...”
Your Code Never Leaves Your Machine
100% local processing. Completely isolated. Perfect for proprietary codebases and security-conscious teams.
Zero External Dependencies
Everything runs locally. No cloud services, no external APIs, no data leaves your machine.
SQLite Database
Your codebase knowledge stored in a single local file that you control completely.
Works Offline
No internet connection required. Analyze code and query your knowledge graph anywhere.
Sandboxed Environment
Completely isolated from system processes. Safe to run on any codebase.
Devscriptor vs Cloud-Based Solutions
| Feature | Devscriptor | Cloud Services |
|---|---|---|
| Code leaves your machine | ✗ | ✓ |
| Requires internet connection | ✗ | ✓ |
| Third-party data processing | ✗ | ✓ |
| Storage on external servers | ✗ | ✓ |
| Works with local LLMs | ✓ | ✗ |
| Zero API costs | ✓ | ✗ |
| Complete data control | ✓ | ✗ |
Whether you're working on proprietary software, handling sensitive data, or simply value your privacy—Devscriptor keeps your code where it belongs: on your machine.
See It In Action
From basic analysis to advanced AI memory management
Analyze Codebase
Build a complete knowledge graph of your codebase
from devscriptor import Devscriptor
# Initialize Devscriptor
cg = Devscriptor()
# Analyze a codebase
result = cg.analyze_codebase("/path/to/your/codebase")
print(f"Files processed: {result['files_processed']}")
print(f"Entities found: {result['total_entities']}")
Semantic Search
Find code by meaning, not just keywords
from devscriptor import Devscriptor
cg = Devscriptor()
# Find functions by semantic meaning
results = cg.find_similar(
"function that validates user input"
)
for result in results:
print(f"{result['name']}: {result['similarity_score']:.2f}")
Context Graph - ADR
Store architectural decisions for AI memory
from devscriptor import Devscriptor
from devscriptor.context_graph import NoteType
cg = Devscriptor()
# Add an Architectural Decision Record
result = cg.context_graph.add_note(
title="Use PostgreSQL for Main Database",
content="We decided to use PostgreSQL...",
note_type=NoteType.ADR,
tags=["database", "architecture"],
structured_data={
"context": "Need reliable data storage",
"decision": "Use PostgreSQL 15",
"consequences": "Requires migration"
}
)
print(f"Note created: {result['note_id']}")
File Context Workflow
Maintain session continuity across AI conversations
from devscriptor import Devscriptor
cg = Devscriptor()
# Create file description
desc = cg.context_graph.file_tracker.create_file_description(
file_path="src/auth.py",
purpose="Handles JWT authentication",
key_entities=["validate_token", "AuthMiddleware"],
dependencies=["fastapi", "jose"]
)
# Record modification after editing
mod = cg.context_graph.file_tracker.create_modification(
file_path="src/auth.py",
changes_summary="Added refresh token support",
rationale="Users were being logged out too frequently"
)
# Get complete context
context = cg.context_graph.file_tracker.get_file_context("src/auth.py")
print(f"Modifications: {len(context.modifications)}")
Advanced Analysis
Detect dead code and complexity issues
from devscriptor import Devscriptor
cg = Devscriptor()
cg.analyze_codebase("/path/to/code")
# Find dead code
dead_code = cg.detect_dead_code()
print(f"Unused functions: {len(dead_code['functions'])}")
print(f"Unused classes: {len(dead_code['classes'])}")
# Analyze complexity
complexity = cg.analyze_complexity()
for func in complexity['high_complexity']:
print(f"{func.name}: cyclomatic={func.cyclomatic}")
MCP Server Usage
Query via MCP tools from any AI agent
# Configure in Claude Desktop or Cline
{
"mcpServers": {
"devscriptor": {
"command": "python",
"args": ["-m", "devscriptor.mcp"]
}
}
}
# Then ask your AI:
# "Find all functions related to user authentication"
# "What's the complexity of the payment module?"
# "Show me dead code in the codebase"
Frequently Asked Questions
Everything you need to know about Devscriptor
What is Devscriptor?
Devscriptor is a code analysis tool that builds a knowledge graph of your codebase in a single SQLite file. It extracts entities (functions, classes, etc.), relationships (calls, inheritance), and enables semantic search—making your codebase queryable by AI agents.
How does the MCP integration work?
Devscriptor includes a native MCP (Model Context Protocol) server that exposes 20+ tools to AI agents like Claude Desktop and Cline. Once configured, your AI can query your codebase using natural language, find related code, detect dead code, and more—directly through the MCP protocol.
Is my code sent to any external service?
No. Devscriptor is 100% local. All analysis happens on your machine, all data is stored in a local SQLite file, and nothing is sent to external servers. This makes it perfect for proprietary codebases and sensitive projects.
Does it work with local LLMs?
Yes! Devscriptor works great with locally hosted LLMs via Ollama, LM Studio, or any OpenAI-compatible local server. Since all processing is local, you can use it with local models for complete privacy and zero API costs.
What languages are supported?
Devscriptor supports 170+ programming languages through tree-sitter. This includes Python, JavaScript, TypeScript, Java, C++, C#, Go, Rust, Ruby, Swift, Kotlin, and many more. If a language has a tree-sitter grammar, Devscriptor can analyze it.
What is the Context Graph?
Context Graph is an AI memory system that persists across sessions. It stores architectural decisions, coding standards, file descriptions, and modification history. When you return to a project days or weeks later, your AI remembers the context and can continue where you left off.
How does semantic search work?
Devscriptor generates vector embeddings for your code using FastEmbed or SentenceTransformers. These embeddings capture the semantic meaning of code, allowing you to search by concept (e.g., "functions that handle authentication") rather than just keywords.
Is it suitable for large codebases?
Yes. Devscriptor uses parallel processing, efficient SQLite storage, and streaming for large files. It can handle enterprise-scale codebases with millions of lines of code.
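The parallelism claim comes down to fanning file parsing out across workers. Here is a rough standard-library illustration, not Devscriptor's actual scanner; the `parse` stub just counts lines to stand in for real analysis.

```python
from concurrent.futures import ThreadPoolExecutor

def parse(source: str) -> dict:
    # Stand-in for real parsing: count lines as a trivial "analysis".
    return {"lines": len(source.splitlines())}

files = {
    "a.py": "def f():\n    pass\n",
    "b.py": "class C:\n    def m(self):\n        pass\n",
}

# Fan parsing out across worker threads, one task per file.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(zip(files, pool.map(parse, files.values())))

total = sum(r["lines"] for r in results.values())
print(total)  # → 5
```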
How do I get started?
Install with pip: pip install devscriptor. Then import it in Python and call analyze_codebase() on your code directory. For MCP integration, install with pip install devscriptor[mcp] and add the MCP config to your AI agent.
Is Devscriptor free?
Yes, Devscriptor is open source and free to use. Check the GitHub repository for the license and source code.
Still have questions?
Open an issue on GitHub