Core Feature

Deep Package Research

AI-powered package documentation that combines real-time sources with LLM synthesis

How It Works

The deep research feature combines real-time documentation sources with LLM synthesis to provide comprehensive documentation about any package, library, or framework. It searches package source code and aggregated docs, then optionally asks an LLM to synthesize the results into a direct answer.

Multi-Source Documentation Research
Context7
Chroma Package Search

Combines multiple documentation sources for comprehensive package research. Uses Chroma Package Search for source code exploration and Context7 for documentation aggregation. When you provide a specific query, results are enhanced with LLM synthesis for comprehensive answers.

Documentation Sources:

  • Chroma Package Search: Source code exploration
  • Context7: Real-time docs aggregation
  • Official package documentation
  • Latest API references
  • Real code examples from repositories
  • LLM synthesis (when query provided)

How It Works:

  • Searches package source code via Chroma
  • Fetches documentation from Context7
  • Combines both sources intelligently
  • Returns code examples & docs
  • Optionally enhances with LLM

✨ Key Benefit: Both source code and docs in one query
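The flow above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the provider functions are hypothetical stubs standing in for the real Chroma Package Search and Context7 API calls.

```python
# Hypothetical provider stubs so the sketch runs standalone; the real tool
# would call the Chroma Package Search and Context7 APIs here.
def search_source(package, query):
    return [{"file_path": f"{package}/__init__.py", "content": "..."}]

def fetch_docs(package):
    return {"snippets": [], "total_snippets": 0}

def synthesize(query, context):
    return f"Synthesized answer for: {query}"

def deep_research_package(package_name, query=None):
    """Combine source-code search and docs aggregation, with an optional LLM pass."""
    result = {
        "package": package_name,
        "status": "success",
        # Source-code hits from the package's repository
        "chroma_results": search_source(package_name, query or package_name),
        # Aggregated documentation snippets
        "documentation": fetch_docs(package_name),
        "source": "multi_source",
    }
    if query:
        # Only a specific query triggers the LLM synthesis step
        result["answer"] = synthesize(query, result)
        result["source"] = "multi_source+llm"
    return result
```

Note how the `source` field distinguishes plain multi-source results from LLM-enhanced ones, matching the responses shown in the usage examples below.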

Usage Examples

Basic Package Research

# Research a package
deep_research_package({
  "package_name": "fastapi"
})

# Returns real-time documentation + source code:
{
  "package": "fastapi",
  "status": "success",
  "documentation": {
    "snippets": [
      {
        "title": "FastAPI Quickstart",
        "description": "Basic FastAPI application setup",
        "code": "from fastapi import FastAPI..."
      }
    ],
    "total_snippets": 47
  },
  "chroma_results": [
    {
      "file_path": "fastapi/applications.py",
      "line_number": 45,
      "content": "class FastAPI(Starlette):"
    }
  ],
  "source": "multi_source",
  "providers": ["ChromaPackageSearch", "UpstashProvider"],
  "version": "2.0.0"
}
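A caller might pull the useful fields out of this response like so. The field names come from the example above; the helper itself is illustrative, not part of the tool's API.

```python
def summarize_research(result):
    """Extract snippet titles and source files from a deep_research_package response."""
    docs = result.get("documentation", {})
    return {
        "package": result["package"],
        "snippet_titles": [s["title"] for s in docs.get("snippets", [])],
        "source_files": [hit["file_path"] for hit in result.get("chroma_results", [])],
    }

# Trimmed version of the response shown above
example = {
    "package": "fastapi",
    "documentation": {"snippets": [{"title": "FastAPI Quickstart"}]},
    "chroma_results": [{"file_path": "fastapi/applications.py"}],
}
print(summarize_research(example))
```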

Research Specific Topics

# Research specific aspect of a package
deep_research_package({
  "package_name": "fastapi",
  "query": "how to implement OAuth2 authentication"
})

# Returns real docs + LLM synthesis:
{
  "package": "fastapi",
  "query": "how to implement OAuth2 authentication",
  "status": "success",
  "documentation": {
    "snippets": [...] // Real FastAPI OAuth2 code examples
  },
  "chroma_results": [
    {
      "file_path": "fastapi/security/oauth2.py",
      "snippet": "OAuth2 implementation details..."
    }
  ],
  "answer": "To implement OAuth2 in FastAPI, you can use the built-in security utilities...",
  "source": "multi_source+llm",
  "providers": ["ChromaPackageSearch", "UpstashProvider"]
}

Compare Packages

# Compare similar packages
deep_research_package({
  "package_name": "fastapi vs flask vs django"
})

# Returns comparative analysis:
{
  "comparison": "FastAPI vs Flask vs Django",
  "summary_table": {
    "performance": {...},
    "features": {...},
    "learning_curve": {...},
    "use_cases": {...}
  },
  "detailed_comparison": [...],
  "migration_guides": [...],
  "recommendation": "Choose based on..."
}
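One plausible way to detect a comparison request like the one above is to split `package_name` on " vs "; the actual detection logic isn't documented, so this is purely an assumption sketched for illustration.

```python
def parse_comparison(package_name):
    """Return the individual package names if the input encodes an X-vs-Y request,
    or None for a single-package request. Hypothetical sketch, not the tool's code."""
    parts = [p.strip() for p in package_name.split(" vs ") if p.strip()]
    return parts if len(parts) > 1 else None

print(parse_comparison("fastapi vs flask vs django"))  # → ['fastapi', 'flask', 'django']
print(parse_comparison("numpy"))                       # → None
```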

Configuration

# Set up your providers (optional but recommended)

# For source code search (Chroma Package Search)
export CHROMA_PACKAGE_SEARCH_API_KEY="your-chroma-key"
# Get from: https://cloud.trychroma.com

# For LLM synthesis (choose one)
# Option 1: OpenAI
export OPENAI_API_KEY="your-openai-key"

# Option 2: Anthropic
export ANTHROPIC_API_KEY="your-anthropic-key"

# Option 3: Ollama (free, local)
# Install: curl -fsSL https://ollama.com/install.sh | sh
# Pull model: ollama pull qwen2.5-coder:latest
# No API key needed!

# The tool will use all available providers automatically
deep_research_package({
  "package_name": "numpy",
  "query": "FFT implementation"  # Optional
})
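The automatic provider selection described above might look like this sketch. The environment-variable names come from the config block; the selection order and the always-available Ollama fallback are assumptions.

```python
import os

def available_providers(env=None):
    """List the research providers usable with the current environment (sketch)."""
    env = os.environ if env is None else env
    providers = []
    if env.get("CHROMA_PACKAGE_SEARCH_API_KEY"):
        providers.append("ChromaPackageSearch")
    if env.get("OPENAI_API_KEY"):
        providers.append("OpenAI")
    if env.get("ANTHROPIC_API_KEY"):
        providers.append("Anthropic")
    providers.append("Ollama")  # local fallback, no key required
    return providers

print(available_providers({"OPENAI_API_KEY": "sk-test"}))  # → ['OpenAI', 'Ollama']
```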

Key Benefits

Always Up-to-Date

Documentation is fetched in real time from Context7 and live package source via Chroma, so results reflect the latest releases rather than a training-data cutoff.

Comprehensive Coverage

Get documentation for any package, even obscure ones, by combining indexed sources with broad LLM knowledge.

Contextual Understanding

Ask about specific use cases, integrations, or comparisons - the LLM understands context.

Important Notes
  • API Key Required: LLM synthesis needs an API key (OpenAI, Anthropic, etc.) unless you run Ollama locally for free.
  • Cost Considerations: Each research query uses LLM tokens. Monitor usage to control costs, or use Ollama for free local inference.

🚀 Quick Start with Ollama (Free)

Get started with deep research using free, local AI:

# 1. Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# 2. Pull a coding model
ollama pull qwen2.5-coder:latest

# 3. Use with kit-dev-mcp (no API key needed!)
deep_research_package({
  "package_name": "requests"
})