Lynkr

Lynkr – Claude Code-Compatible Proxy for Databricks

Lynkr is an open-source Claude Code-compatible proxy that lets the Claude Code CLI run directly against any LLM without losing the features offered by the Anthropic backend. It supports MCP servers, Git workflows, repo intelligence, workspace tools, prompt caching, and many other features for LLM-powered development.



MCP • Git Tools • Repo Intelligence • Prompt Caching • Workspace Automation

⭐ Star on GitHub ·
📘 Documentation ·
🐙 Source Code


🚀 What is Lynkr?

Lynkr is an open-source Claude Code-compatible backend proxy that lets you run the Claude Code CLI and Claude-style tools directly against Databricks-hosted models, Azure-hosted Anthropic or OpenAI models, or OpenRouter, instead of the default Anthropic cloud.

It enables full repo-aware LLM workflows, making Databricks a first-class environment for AI-assisted software development, LLM agents, automated refactoring, debugging, and ML/ETL workflow exploration.


🌟 Key Features (SEO Summary)

✔ Claude Code-compatible API (/v1/messages)

Emulates Anthropic’s backend so the Claude Code CLI works without modification.

✔ Works with Databricks LLM Serving

Supports Databricks-hosted Claude Sonnet / Haiku models, or any LLM served from Databricks.

✔ Supports Azure Anthropic models

Route Claude Code requests into Azure’s /anthropic/v1/messages endpoint.

✔ Supports Azure OpenAI models

Connect to Azure OpenAI deployments (GPT-4o, etc.) with full tool calling support.

✔ Supports OpenRouter (100+ models)

Access GPT-4o, Claude, Gemini, Llama, and more through a single unified API with full tool calling support.

✔ Full Model Context Protocol (MCP) integration

Auto-discovers MCP manifests and exposes them as tools for smart workflows.

✔ Repo Intelligence: CLAUDE.md, Symbol Index, Cross-file analysis

Lynkr builds a repo index using SQLite + Tree-sitter for rich context.

✔ Git Tools and Workflow Automation

Commit, push, diff, stage, generate release notes, etc.

✔ Prompt Caching (LRU + TTL)

Reuses responses for identical prompts to reduce cost and latency.

✔ Workspace Tools

Task tracker, file I/O, test runner, index rebuild, etc.

✔ Client-Side Tool Execution (Passthrough Mode)

Tools can execute on the Claude Code CLI side instead of the server, enabling local file operations and commands.

✔ Fully extensible Node.js architecture

Add custom tools, policies, or backend adapters.


📚 Table of Contents


🧩 What Lynkr Solves

The Problem

Claude Code is exceptionally useful—but it only communicates with Anthropic’s hosted backend.

This means:

❌ You can’t point Claude Code at Databricks LLMs
❌ You can’t run Claude workflows locally, offline, or in secure contexts
❌ MCP tools must be managed manually
❌ You don’t control caching, policies, logs, or backend behavior

The Solution: Lynkr

Lynkr is a Claude Code-compatible backend that sits between the CLI and your actual model provider.


Claude Code CLI
↓
Lynkr Proxy
↓
Databricks / Azure Anthropic / MCP / Tools

This puts caching, policies, logging, and backend behavior back under your control.


🏗 Architecture Overview


Claude Code CLI
↓  (HTTP POST /v1/messages)
Lynkr Proxy (Node.js + Express)
↓
────────────────────────────────────────
│  Orchestrator (Agent Loop)          │
│  ├─ Tool Execution Pipeline         │
│  ├─ MCP Registry + Sandbox          │
│  ├─ Prompt Cache (LRU + TTL)        │
│  ├─ Session Store (SQLite)          │
│  ├─ Repo Indexer (Tree-sitter)      │
│  ├─ Policy Engine                   │
────────────────────────────────────────
↓
Databricks / Azure Anthropic / Other Providers
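
At its core, the proxy is an HTTP server that accepts Anthropic-style POST /v1/messages requests and hands them to the orchestrator. The sketch below is purely conceptual; the orchestrate entry point is a placeholder, not Lynkr's actual source:

// Conceptual sketch of the request flow (not Lynkr's real implementation).
const express = require("express");
const app = express();
app.use(express.json({ limit: "10mb" }));

app.post("/v1/messages", async (req, res) => {
  // 1. Check the prompt cache and load session / repo context.
  // 2. Forward the (possibly augmented) messages to the configured provider.
  // 3. Execute any requested tools and loop until the model finishes.
  const result = await orchestrate(req.body); // hypothetical orchestrator entry point
  res.json(result); // Anthropic-style message response
});

app.listen(process.env.PORT || 8080);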

Key directories:


⚙ Installation

npm

npm install -g lynkr
lynkr start

Homebrew

brew tap vishalveerareddy123/lynkr
brew install vishalveerareddy123/lynkr/lynkr

From source

git clone https://github.com/vishalveerareddy123/Lynkr.git
cd Lynkr
npm install
npm start

🔧 Configuring Providers

Databricks Setup

MODEL_PROVIDER=databricks
DATABRICKS_API_BASE=https://<workspace>.cloud.databricks.com
DATABRICKS_API_KEY=<personal-access-token>
DATABRICKS_ENDPOINT_PATH=/serving-endpoints/databricks-claude-sonnet-4-5/invocations
WORKSPACE_ROOT=/path/to/your/repo
PORT=8080
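
A request accepted by Lynkr is ultimately forwarded to the serving endpoint assembled from DATABRICKS_API_BASE and DATABRICKS_ENDPOINT_PATH. The sketch below shows roughly what that forwarding call looks like; it assumes Node 18+ (global fetch, run as an ES module) and a chat-style payload, and is illustrative rather than Lynkr's actual code:

// Sketch: call a Databricks serving endpoint directly.
const url = process.env.DATABRICKS_API_BASE + process.env.DATABRICKS_ENDPOINT_PATH;

const response = await fetch(url, {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.DATABRICKS_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    // Payload shape depends on the served model; chat models accept messages.
    messages: [{ role: "user", content: "Say hello." }],
    max_tokens: 256,
  }),
});
console.log(await response.json());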

Azure Anthropic Setup

MODEL_PROVIDER=azure-anthropic
AZURE_ANTHROPIC_ENDPOINT=https://<resource>.services.ai.azure.com/anthropic/v1/messages
AZURE_ANTHROPIC_API_KEY=<api-key>
AZURE_ANTHROPIC_VERSION=2023-06-01
WORKSPACE_ROOT=/path/to/repo
PORT=8080

Azure OpenAI Setup

MODEL_PROVIDER=azure-openai
AZURE_OPENAI_ENDPOINT=https://<resource>.openai.azure.com
AZURE_OPENAI_API_KEY=<api-key>
AZURE_OPENAI_DEPLOYMENT=gpt-4o
PORT=8080
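
For reference, Azure OpenAI scopes requests to a deployment and authenticates with an api-key header. A minimal sketch of a direct call using the values above (Node 18+ as an ES module; the api-version shown is an example and may need to match your resource):

// Sketch: call an Azure OpenAI chat deployment directly.
const url =
  `${process.env.AZURE_OPENAI_ENDPOINT}/openai/deployments/` +
  `${process.env.AZURE_OPENAI_DEPLOYMENT}/chat/completions?api-version=2024-02-01`;

const response = await fetch(url, {
  method: "POST",
  headers: {
    "api-key": process.env.AZURE_OPENAI_API_KEY,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    messages: [{ role: "user", content: "Hello from Lynkr." }],
  }),
});
console.log(await response.json());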

OpenRouter Setup

What is OpenRouter?

OpenRouter provides unified access to 100+ AI models (GPT-4o, Claude, Gemini, Llama, etc.) through a single, OpenAI-compatible API.

Configuration:

MODEL_PROVIDER=openrouter
OPENROUTER_API_KEY=sk-or-v1-...                                    # Get from https://openrouter.ai/keys
OPENROUTER_MODEL=openai/gpt-4o-mini                                # See https://openrouter.ai/models
OPENROUTER_ENDPOINT=https://openrouter.ai/api/v1/chat/completions
PORT=8080
WORKSPACE_ROOT=/path/to/your/repo

See https://openrouter.ai/models for the complete list of available models.

Getting Started:

  1. Visit https://openrouter.ai
  2. Sign in with GitHub/Google/email
  3. Create API key at https://openrouter.ai/keys
  4. Add credits (minimum $5)
  5. Configure Lynkr as shown above (see the verification sketch below)
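
Before pointing Lynkr at OpenRouter, you can sanity-check the key and model against the same chat/completions endpoint Lynkr uses. A minimal verification sketch (Node 18+ as an ES module; OpenRouter's API is OpenAI-compatible):

// Sketch: verify an OpenRouter key and model outside of Lynkr.
const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: process.env.OPENROUTER_MODEL || "openai/gpt-4o-mini",
    messages: [{ role: "user", content: "Reply with OK if you can read this." }],
  }),
});

const data = await response.json();
console.log(data.choices[0].message.content);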

💬 Using Lynkr With Claude Code CLI

export ANTHROPIC_BASE_URL=http://localhost:8080
export ANTHROPIC_API_KEY=dummy

Then:

claude chat
claude diff
claude review
claude apply

Every request routes through your configured Databricks, Azure, or OpenRouter model.


🧠 Repo Intelligence & Indexing

Lynkr uses Tree-sitter and SQLite to analyze your workspace, building a symbol index and cross-file relationships for rich context.

It generates a structured CLAUDE.md so the model always has context.
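
The indexing idea can be pictured as parsing each source file with Tree-sitter and writing the discovered symbols into SQLite. The sketch below uses the tree-sitter, tree-sitter-javascript, and better-sqlite3 npm packages as stand-ins; Lynkr's actual schema and parser bindings may differ:

// Sketch: index top-level function names from one JavaScript file into SQLite.
const Parser = require("tree-sitter");
const JavaScript = require("tree-sitter-javascript");
const Database = require("better-sqlite3");
const fs = require("fs");

const parser = new Parser();
parser.setLanguage(JavaScript);

const db = new Database("repo-index.db");
db.exec("CREATE TABLE IF NOT EXISTS symbols (file TEXT, name TEXT, line INTEGER)");
const insert = db.prepare("INSERT INTO symbols VALUES (?, ?, ?)");

const file = "src/example.js"; // hypothetical path inside WORKSPACE_ROOT
const tree = parser.parse(fs.readFileSync(file, "utf8"));

for (const node of tree.rootNode.children) {
  if (node.type === "function_declaration") {
    insert.run(file, node.childForFieldName("name").text, node.startPosition.row + 1);
  }
}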


⚡ Prompt Caching

Lynkr includes an LRU + TTL prompt cache: responses to identical prompts are reused, cutting both cost and latency.

Configure:

PROMPT_CACHE_ENABLED=true
PROMPT_CACHE_TTL_MS=300000
PROMPT_CACHE_MAX_ENTRIES=64
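
Conceptually, the cache keys a hash of the prompt payload and drops entries that are older than the TTL or beyond the entry limit. A minimal sketch of that idea (not Lynkr's actual cache implementation):

// Sketch: LRU + TTL cache keyed by a hash of the prompt payload.
const crypto = require("crypto");

const TTL_MS = Number(process.env.PROMPT_CACHE_TTL_MS || 300000);
const MAX_ENTRIES = Number(process.env.PROMPT_CACHE_MAX_ENTRIES || 64);
const cache = new Map(); // insertion order doubles as recency order

const keyFor = (payload) =>
  crypto.createHash("sha256").update(JSON.stringify(payload)).digest("hex");

function get(payload) {
  const key = keyFor(payload);
  const entry = cache.get(key);
  if (!entry || Date.now() - entry.at > TTL_MS) return undefined;
  cache.delete(key); // re-insert to mark as most recently used
  cache.set(key, entry);
  return entry.value;
}

function set(payload, value) {
  const key = keyFor(payload);
  cache.delete(key);
  cache.set(key, { value, at: Date.now() });
  if (cache.size > MAX_ENTRIES) cache.delete(cache.keys().next().value); // evict oldest
}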

🧩 Model Context Protocol (MCP)

Lynkr automatically discovers MCP manifests from ~/.claude/mcp, or from any directories defined via MCP_MANIFEST_DIRS.

The discovered MCP tools become available inside the Claude Code environment.

Optional sandboxing uses Docker or OCI runtimes.


🔧 Git Tools

Lynkr includes a full suite of Git operations: commit, push, diff, stage, release-note generation, and more.

Policies let you gate these operations. For example, to disallow pushes unless tests pass, set POLICY_GIT_REQUIRE_TESTS=true.
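
How such a gate might look in practice, as a hypothetical sketch (the test command and function name are placeholders, not Lynkr's implementation):

// Sketch: run the test suite before allowing a git push when the policy is enabled.
const { execSync } = require("child_process");

function gitPush(remote = "origin", branch = "HEAD") {
  if (process.env.POLICY_GIT_REQUIRE_TESTS === "true") {
    try {
      execSync("npm test", { stdio: "inherit" }); // throws on a non-zero exit code
    } catch {
      throw new Error("Push blocked by policy: tests must pass first.");
    }
  }
  execSync(`git push ${remote} ${branch}`, { stdio: "inherit" });
}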


🔄 Client-Side Tool Execution (Passthrough Mode)

Lynkr supports client-side tool execution, enabling tools to execute on the Claude Code CLI machine instead of the proxy server.

Enable passthrough mode:

export TOOL_EXECUTION_MODE=client
npm start

How it works:

  1. Model generates tool calls (from Databricks/OpenRouter/Ollama)
  2. Proxy converts to Anthropic format with tool_use blocks
  3. Claude Code CLI receives tool_use blocks and executes locally
  4. CLI sends tool_result blocks back in the next request
  5. Proxy forwards the complete conversation back to the model (see the block-exchange sketch below)
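
The exchange uses Anthropic-style content blocks. One round trip looks roughly like the following; the tool name, id, and arguments are illustrative:

// Sketch: one passthrough round trip in Anthropic message format.
// Steps 2-3: the proxy returns a tool_use block, which the CLI executes locally.
const assistantTurn = {
  role: "assistant",
  content: [
    { type: "tool_use", id: "toolu_01", name: "read_file", input: { path: "README.md" } },
  ],
};

// Step 4: the CLI replies with a tool_result block in the next request.
const userTurn = {
  role: "user",
  content: [
    { type: "tool_result", tool_use_id: "toolu_01", content: "…file contents…" },
  ],
};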

Benefits:

Use cases:

Configuration:


🧪 API Example (Index Rebuild)

curl http://localhost:8080/v1/messages \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "claude-proxy",
    "messages": [{ "role": "user", "content": "Rebuild the index." }],
    "tool_choice": {
      "type": "function",
      "function": { "name": "workspace_index_rebuild" }
    }
  }'

🤖 ACE-Inspired Agentic Architecture

Lynkr’s agentic architecture is inspired by the Autonomous Cognitive Entity (ACE) Framework, specifically implementing the Reflector pattern to enable self-improving capabilities.

The Agentic Loop

  1. Input Processing: The Orchestrator receives natural language intent from the user.
  2. Execution (Agent Model): The system executes tools (Git, Search, File Ops) to achieve the goal.
  3. Reflection (Reflector Role): After execution completes, the Reflector agent analyzes the transcript to extract “skills” and optimize future performance.

The Reflector

The Reflector (src/agents/reflector.js) is an introspective component that analyzes each execution transcript after a run completes.

This reflective loop allows Lynkr not just to execute commands, but to learn from each interaction, continuously refining its internal heuristics for tool selection and planning.
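
In code terms, the Reflector can be thought of as a post-run pass over the transcript that records which tool sequences worked. The sketch below is purely illustrative; the real src/agents/reflector.js may look quite different:

// Sketch: distill a completed transcript into a reusable "skill" record.
function reflect(transcript) {
  const toolCalls = transcript.filter((step) => step.type === "tool_use");
  return {
    goal: transcript[0]?.text,                  // the original user intent
    toolSequence: toolCalls.map((t) => t.name), // which tools ran, in order
    succeeded: transcript.at(-1)?.type !== "error",
    recordedAt: new Date().toISOString(),
  };
}
// Stored skills can later bias tool selection and planning for similar goals.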


🛣 Roadmap

✅ Recently Completed (December 2025)

🔮 Future Features


🔗 Links

If you use Databricks, Azure, or OpenRouter and want rich Claude Code workflows, Lynkr gives you the control and extensibility you need.

Feel free to open issues, contribute tools, or integrate with MCP servers!