Deep Package Research
AI-powered comprehensive package documentation using LLM knowledge
How It Works
The deep research feature combines real-time documentation fetched from Context7's aggregated sources with LLM knowledge. For a bare package name it returns the fetched documentation directly; when you also provide a specific query, the results are enhanced with LLM synthesis for a comprehensive answer.
Documentation Sources:
- Context7 real-time docs aggregation
- Official package documentation
- Latest API references
- Real code examples
- LLM synthesis (when query provided)
- More sources coming soon
How It Works:
- Searches Context7's index
- Fetches real documentation
- Returns code examples & docs
- Optionally enhances with LLM
✨ Key Benefit: Real-time docs, always current
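The four steps above can be sketched in Python. The function and parameter names here are illustrative stubs, not the tool's actual internals; only the response shape follows the examples below:

```python
# Illustrative sketch of the research pipeline (names are hypothetical).
def deep_research(package_name, query=None, fetch_docs=None, synthesize=None):
    """Fetch real docs for a package; optionally add LLM synthesis for a query."""
    snippets = fetch_docs(package_name)  # e.g. search Context7's index
    result = {
        "package": package_name,
        "status": "success",
        "documentation": {"snippets": snippets, "total_snippets": len(snippets)},
        "source": "real_docs",
    }
    if query:
        # The LLM is only invoked when a specific question was asked.
        result["answer"] = synthesize(query, snippets)
        result["source"] = "real_docs+llm"
    return result

# Demonstration with stubbed-out dependencies:
docs = deep_research(
    "fastapi",
    query="how to add middleware",
    fetch_docs=lambda pkg: [{"title": f"{pkg} quickstart"}],
    synthesize=lambda q, s: f"Answer based on {len(s)} snippet(s).",
)
print(docs["source"])  # real_docs+llm
```

Note how `source` tells you whether the answer came from fetched docs alone or docs plus synthesis, mirroring the `real_docs` / `real_docs+llm` values in the examples below.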
Usage Examples
Basic Package Research
```python
# Research a package
deep_research_package({
    "package_name": "fastapi"
})

# Returns real-time documentation:
{
    "package": "fastapi",
    "status": "success",
    "documentation": {
        "snippets": [
            {
                "title": "FastAPI Quickstart",
                "description": "Basic FastAPI application setup",
                "code": "from fastapi import FastAPI..."
            }
        ],
        "total_snippets": 47
    },
    "source": "real_docs",
    "provider": "UpstashProvider",
    "version": "2.0.0"
}
```
Research Specific Topics
```python
# Research a specific aspect of a package
deep_research_package({
    "package_name": "fastapi",
    "query": "how to implement OAuth2 authentication"
})

# Returns real docs + LLM synthesis:
{
    "package": "fastapi",
    "query": "how to implement OAuth2 authentication",
    "status": "success",
    "documentation": {
        "snippets": [...]  # Real FastAPI OAuth2 code examples
    },
    "answer": "To implement OAuth2 in FastAPI, you can use the built-in security utilities...",
    "source": "real_docs+llm",
    "provider": "UpstashProvider"
}
```
Compare Packages
```python
# Compare similar packages
deep_research_package({
    "package_name": "fastapi vs flask vs django"
})

# Returns comparative analysis:
{
    "comparison": "FastAPI vs Flask vs Django",
    "summary_table": {
        "performance": {...},
        "features": {...},
        "learning_curve": {...},
        "use_cases": {...}
    },
    "detailed_comparison": [...],
    "migration_guides": [...],
    "recommendation": "Choose based on..."
}
```
Configuration
```bash
# Set up your LLM provider (choose one)

# Option 1: OpenAI
export OPENAI_API_KEY="your-openai-key"

# Option 2: Anthropic
export ANTHROPIC_API_KEY="your-anthropic-key"

# Option 3: Ollama (free, local)
# Install: curl -fsSL https://ollama.com/install.sh | sh
# Pull model: ollama pull qwen2.5-coder:latest
# No API key needed!
```

```python
# Option 4: Pass API key directly
deep_research_package({
    "package_name": "numpy",
    "api_key": "your-api-key",  # Optional
    "model": "gpt-4"            # Optional, auto-detects by default
})
```
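The options above imply a selection order: an explicitly passed key, then environment variables, then a local Ollama install. A minimal sketch of that precedence logic; the exact order and provider names are assumptions, not the tool's documented behavior:

```python
import os

def pick_provider(api_key=None, env=None, ollama_installed=False):
    """Choose an LLM provider: explicit key > env vars > local Ollama."""
    env = env if env is not None else os.environ
    if api_key:
        return "explicit"          # Option 4: key passed in the call
    if env.get("OPENAI_API_KEY"):
        return "openai"            # Option 1
    if env.get("ANTHROPIC_API_KEY"):
        return "anthropic"         # Option 2
    if ollama_installed:
        return "ollama"            # Option 3: free, local
    raise RuntimeError("No LLM provider configured")

print(pick_provider(env={"ANTHROPIC_API_KEY": "sk-..."}))  # anthropic
```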
Key Benefits
Always Up-to-Date
Documentation is fetched in real time from Context7's index, so results reflect current docs rather than a model's training cutoff, even for new packages.
Comprehensive Coverage
Get documentation for any package, even obscure ones: real docs where indexed, with broad LLM training filling the gaps.
Contextual Understanding
Ask about specific use cases, integrations, or comparisons; the LLM understands context.
- API Key Required: You need an LLM API key (OpenAI, Anthropic, etc.) or use Ollama locally for free.
- Cost Considerations: Each research query uses LLM tokens. Monitor usage to control costs, or use Ollama for free local inference.
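One way to monitor spend is to estimate per-query cost from token counts. The prices in this sketch are placeholders; check your provider's current pricing:

```python
def estimate_cost(input_tokens, output_tokens,
                  price_in_per_1k=0.0, price_out_per_1k=0.0):
    """Rough query cost in dollars from token counts and per-1K-token prices."""
    return (input_tokens / 1000) * price_in_per_1k \
         + (output_tokens / 1000) * price_out_per_1k

# e.g. a research query with placeholder prices:
cost = estimate_cost(3000, 800, price_in_per_1k=0.005, price_out_per_1k=0.015)
print(f"${cost:.4f}")  # $0.0270
```

With Ollama the marginal cost is zero, which is why the quick start below recommends it for experimentation.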
🚀 Quick Start with Ollama (Free)
Get started with deep research using free, local AI:
```bash
# 1. Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# 2. Pull a coding model
ollama pull qwen2.5-coder:latest
```

```python
# 3. Use with kit-dev-mcp (no API key needed!)
deep_research_package({
    "package_name": "requests"
})
```
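Before relying on the local option, it can help to verify the Ollama server is actually running; by default it listens on `http://localhost:11434`. A small stdlib-only check that returns `False` instead of raising when the server is down:

```python
import urllib.error
import urllib.request

def ollama_available(base_url="http://localhost:11434", timeout=2):
    """Return True if an Ollama server answers at base_url, else False."""
    try:
        # /api/tags lists locally pulled models and is a cheap liveness probe.
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if ollama_available():
    print("Ollama is running; deep research can use local models.")
else:
    print("Ollama not reachable; falling back to a hosted provider.")
```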