v1.2.0 · Published: Mar 3, 2026 · License: MIT

Arkestone MCP Servers


A suite of Model Context Protocol (MCP) servers for GitHub Copilot and other AI coding assistants. Each server dynamically serves a different type of context — custom instructions, skills, prompts, ADRs, persistent memory, or knowledge graphs — from local directories and GitHub repositories.

Built with the Go MCP SDK. Browse the MCP Registry for the full ecosystem of MCP servers.

Available Servers

| Server | Description | Port |
|--------|-------------|------|
| mcp-instructions | Serves Copilot custom instruction files (`.github/copilot-instructions.md`, `.github/instructions/**`) from local dirs and GitHub repos | :8080 |
| mcp-skills | Serves Copilot skills (`skills/*/SKILL.md`) with frontmatter metadata and reference bundles | :8081 |
| mcp-prompts | Serves VS Code Copilot prompt files (`.github/prompts/*.prompt.md`) and chat mode files | :8082 |
| mcp-adr | Serves Architecture Decision Records from `docs/adr/`, `docs/decisions/`, or `doc/adr/` | :8083 |
| mcp-memory | Persistent memory store — remember, recall, and forget information across sessions | :8084 |
| mcp-graph | Knowledge graph — store entities and relationships, query neighbors and shortest paths | :8085 |

Each server has its own README and CHANGELOG.
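The shortest-path queries that mcp-graph exposes can be pictured as a plain breadth-first search over the stored entities and relationships. A minimal, illustrative sketch (not the server's actual implementation; the adjacency-map shape is an assumption):

```go
package main

import "fmt"

// shortestPath runs BFS over a directed adjacency map and returns one
// shortest path from start to goal, or nil if the goal is unreachable.
func shortestPath(adj map[string][]string, start, goal string) []string {
	prev := map[string]string{start: ""}
	queue := []string{start}
	for len(queue) > 0 {
		node := queue[0]
		queue = queue[1:]
		if node == goal {
			// walk predecessors back to the start
			var path []string
			for n := goal; n != ""; n = prev[n] {
				path = append([]string{n}, path...)
			}
			return path
		}
		for _, next := range adj[node] {
			if _, seen := prev[next]; !seen {
				prev[next] = node
				queue = append(queue, next)
			}
		}
	}
	return nil
}

func main() {
	// Hypothetical entity graph: services and the resources they touch.
	adj := map[string][]string{
		"service-a": {"db", "queue"},
		"queue":     {"service-b"},
	}
	fmt.Println(shortestPath(adj, "service-a", "service-b")) // [service-a queue service-b]
}
```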

Installation

go install

The fastest way to install. Requires Go 1.24+.

# Install the latest version
go install github.com/Arkestone/mcp/servers/mcp-instructions/cmd/mcp-instructions@latest
go install github.com/Arkestone/mcp/servers/mcp-skills/cmd/mcp-skills@latest
go install github.com/Arkestone/mcp/servers/mcp-prompts/cmd/mcp-prompts@latest
go install github.com/Arkestone/mcp/servers/mcp-adr/cmd/mcp-adr@latest
go install github.com/Arkestone/mcp/servers/mcp-memory/cmd/mcp-memory@latest
go install github.com/Arkestone/mcp/servers/mcp-graph/cmd/mcp-graph@latest

# Install a pinned version
go install github.com/Arkestone/mcp/servers/mcp-instructions/cmd/[email protected]
Docker

Pre-built images are published to the GitHub Container Registry after each release:

# Pull the latest version
docker pull ghcr.io/arkestone/mcp-instructions:latest
docker pull ghcr.io/arkestone/mcp-skills:latest
docker pull ghcr.io/arkestone/mcp-prompts:latest
docker pull ghcr.io/arkestone/mcp-adr:latest
docker pull ghcr.io/arkestone/mcp-memory:latest
docker pull ghcr.io/arkestone/mcp-graph:latest

# Pull a specific version
docker pull ghcr.io/arkestone/mcp-instructions:v0.0.1
Pre-built Binaries

Download pre-built binaries for Linux, macOS, and Windows from GitHub Releases:

# Example: Linux amd64
curl -L https://github.com/Arkestone/mcp/releases/latest/download/mcp-instructions_linux_amd64.tar.gz | tar xz
mv mcp-instructions /usr/local/bin/
Build from Source
git clone https://github.com/Arkestone/mcp.git
cd mcp
make build    # → ./bin/mcp-instructions, ./bin/mcp-skills, ...

Quick Start

# stdio — used by VS Code, Claude Desktop, Cursor, etc.
mcp-instructions -dirs /path/to/repo

# HTTP — for shared team deployments
mcp-instructions -transport http -addr :8080 -repos github/awesome-copilot

# With LLM optimization
mcp-instructions -transport http -llm.enabled -llm.endpoint https://api.openai.com/v1

See each server's README for the full configuration reference.

Transport Mechanisms

| Transport | When to Use |
|-----------|-------------|
| stdio | Local clients (VS Code, Claude Desktop, Cursor, etc.) — the client spawns the server process |
| Streamable HTTP | Remote/shared deployments — the server runs independently, clients connect over HTTP |

All servers default to stdio. Pass -transport http to switch to HTTP.

Quick Install in VS Code

Click to install any server directly into VS Code or VS Code Insiders. stdio requires the binary installed locally (go install); HTTP connects to a running local server; Docker runs the server in a container (no local install needed). File-scanning servers use ${workspaceFolder} — open a folder first.

(Badge table: one-click VS Code and VS Code Insiders install links for each server — mcp-instructions, mcp-skills, mcp-prompts, mcp-adr, mcp-memory, mcp-graph — over stdio, HTTP, and Docker.)

MCP Client Configuration

VS Code

Quick install: Use the badges in Quick Install in VS Code above, or open the Command Palette (Ctrl+Shift+P / Cmd+Shift+P) and run MCP: Open User Configuration to add a server to your global mcp.json. See the VS Code MCP documentation for full details.

.vscode/mcp.json (project-level)
{
  "servers": {
    "instructions": { "command": "mcp-instructions", "args": ["-dirs", "${workspaceFolder}"] },
    "skills":       { "command": "mcp-skills",       "args": ["-dirs", "${workspaceFolder}"] },
    "prompts":      { "command": "mcp-prompts",      "args": ["-dirs", "${workspaceFolder}"] },
    "adrs":         { "command": "mcp-adr",          "args": ["-dirs", "${workspaceFolder}"] },
    "memory":       { "command": "mcp-memory" },
    "graph":        { "command": "mcp-graph" }
  }
}
Global user settings (settings.json)
{
  "github.copilot.chat.mcp.servers": {
    "instructions": {
      "command": "mcp-instructions",
      "args": ["-dirs", "/path/to/repo"]
    }
  }
}
Claude Desktop

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "instructions": {
      "command": "mcp-instructions",
      "args": ["-dirs", "/path/to/repo"]
    },
    "skills": {
      "command": "mcp-skills",
      "args": ["-dirs", "/path/to/repo"]
    },
    "memory": {
      "command": "mcp-memory",
      "env": { "MEMORY_DIR": "~/.local/share/mcp-memory" }
    },
    "graph": {
      "command": "mcp-graph"
    }
  }
}
Cursor
.cursor/mcp.json (project-level)
{
  "mcpServers": {
    "instructions": {
      "command": "mcp-instructions",
      "args": ["-dirs", "."]
    },
    "skills": {
      "command": "mcp-skills",
      "args": ["-dirs", "."]
    },
    "prompts": {
      "command": "mcp-prompts",
      "args": ["-dirs", "."]
    },
    "memory": {
      "command": "mcp-memory"
    }
  }
}
Global (~/.cursor/mcp.json)
{
  "mcpServers": {
    "memory": {
      "command": "mcp-memory",
      "env": { "MEMORY_DIR": "~/.local/share/mcp-memory" }
    }
  }
}
Windsurf

macOS/Linux: ~/.codeium/windsurf/mcp_config.json Windows: %USERPROFILE%\.codeium\windsurf\mcp_config.json

{
  "mcpServers": {
    "instructions": {
      "command": "mcp-instructions",
      "args": ["-dirs", "/path/to/repo"]
    },
    "memory": {
      "command": "mcp-memory",
      "env": { "MEMORY_DIR": "~/.local/share/mcp-memory" }
    }
  }
}
Claude Code

~/.mcp.json (global) or .mcp.json (project-level)

{
  "mcpServers": {
    "instructions": {
      "command": "mcp-instructions",
      "args": ["-dirs", "."]
    },
    "skills": {
      "command": "mcp-skills",
      "args": ["-dirs", "."]
    },
    "memory": {
      "command": "mcp-memory"
    },
    "graph": {
      "command": "mcp-graph"
    }
  }
}
JetBrains

In JetBrains IDEs (IntelliJ IDEA, GoLand, PyCharm, etc.), go to Settings → Tools → AI Assistant → Model Context Protocol (MCP):

{
  "mcpServers": {
    "instructions": {
      "command": "mcp-instructions",
      "args": ["-dirs", "$PROJECT_DIR$"]
    },
    "memory": {
      "command": "mcp-memory"
    }
  }
}
Zed

Add to ~/.config/zed/settings.json:

{
  "context_servers": {
    "mcp-instructions": {
      "command": {
        "path": "mcp-instructions",
        "args": ["-dirs", "/path/to/repo"]
      }
    },
    "mcp-memory": {
      "command": {
        "path": "mcp-memory",
        "args": []
      }
    }
  }
}
Docker Compose — all servers
services:
  mcp-instructions:
    image: ghcr.io/arkestone/mcp-instructions:latest
    ports: ["8080:8080"]
    environment:
      INSTRUCTIONS_TRANSPORT: http
      INSTRUCTIONS_DIRS: /data
    volumes: ["./instructions:/data:ro"]

  mcp-skills:
    image: ghcr.io/arkestone/mcp-skills:latest
    ports: ["8081:8081"]
    environment:
      SKILLS_TRANSPORT: http
      SKILLS_DIRS: /data
    volumes: ["./skills:/data:ro"]

  mcp-prompts:
    image: ghcr.io/arkestone/mcp-prompts:latest
    ports: ["8082:8082"]
    environment:
      PROMPTS_TRANSPORT: http
      PROMPTS_SOURCES_DIRS: /data
    volumes: ["./prompts:/data:ro"]

  mcp-adr:
    image: ghcr.io/arkestone/mcp-adr:latest
    ports: ["8083:8083"]
    environment:
      ADR_TRANSPORT: http
      ADR_DIRS: /data
    volumes: ["./docs:/data:ro"]

  mcp-memory:
    image: ghcr.io/arkestone/mcp-memory:latest
    ports: ["8084:8084"]
    environment:
      MEMORY_TRANSPORT: http
      MEMORY_DIR: /data
    volumes: ["memory-data:/data"]

  mcp-graph:
    image: ghcr.io/arkestone/mcp-graph:latest
    ports: ["8085:8085"]
    environment:
      GRAPH_TRANSPORT: http
      GRAPH_DATA_FILE: /data/graph.json
    volumes: ["graph-data:/data"]

volumes:
  memory-data:
  graph-data:

Then connect your client over HTTP:

{
  "mcpServers": {
    "instructions": {
      "type": "http",
      "url": "http://localhost:8080/mcp"
    }
  }
}

LLM Optimization

All content servers (instructions, skills, prompts, ADRs) optionally consolidate multiple sources using an OpenAI-compatible LLM endpoint. This deduplicates and logically organizes content from multiple repositories.

mcp-instructions \
  -dirs . \
  -llm.enabled \
  -llm.endpoint https://api.openai.com/v1 \
  -llm.api-key $LLM_API_KEY \
  -llm.model gpt-4o-mini

Supported providers: OpenAI, Azure OpenAI, Ollama, LM Studio, and any OpenAI-compatible endpoint.

See config.llm.example.yaml for a complete configuration example.
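The same settings can also live in a YAML file. A sketch of what that might contain (the key names below simply mirror the CLI flags and are assumptions; treat config.llm.example.yaml in the repository as the authoritative schema):

```yaml
# Hypothetical YAML equivalent of the flags above; key names are
# assumptions -- see config.llm.example.yaml for the real schema.
llm:
  enabled: true
  endpoint: https://api.openai.com/v1
  api_key: ${LLM_API_KEY}   # or supply via environment
  model: gpt-4o-mini
```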

GitHub Authentication (Optional)

A GitHub token is optional. Public repositories work without authentication. For private repositories, provide a token (highest priority first):

| Method | Example |
|--------|---------|
| CLI flag | `-github-token ghp_xxx` |
| Prefixed env var | `INSTRUCTIONS_GITHUB_TOKEN=ghp_xxx` |
| Global env var | `GITHUB_TOKEN=ghp_xxx` |
| YAML config | `github_token: ghp_xxx` |

Network & Proxy

All servers work on-premise, in private/public cloud, with direct internet or through HTTP/HTTPS proxies. See the Network & Proxy Guide for firewall rules, proxy configuration, and custom CA certificates.

Architecture

.
├── servers/
│   ├── mcp-instructions/   # custom instructions server  (:8080)
│   ├── mcp-skills/         # skills server               (:8081)
│   ├── mcp-prompts/        # prompt files server         (:8082)
│   ├── mcp-adr/            # ADR server                  (:8083)
│   ├── mcp-memory/         # persistent memory server    (:8084)
│   └── mcp-graph/          # knowledge graph server      (:8085)
├── pkg/
│   ├── config/             # shared configuration loading (YAML → env → flags)
│   ├── github/             # GitHub Contents API client
│   ├── httputil/           # proxy, TLS, header propagation
│   ├── optimizer/          # shared LLM optimization layer (OpenAI-compatible)
│   ├── server/             # MCP server bootstrap helpers
│   └── syncer/             # background repo sync
├── docs/
│   └── network.md          # network / proxy / firewall guide
├── examples/               # client configuration examples
└── AGENTS.md               # AI coding assistant guide

Each content server follows the same layered design:

  1. Config — YAML → environment variables → CLI flags (each layer overrides the previous)
  2. Loader / Scanner — discovers content from local directories and GitHub repositories
  3. Optimizer — optional LLM-based consolidation via pkg/optimizer
  4. MCP Server — exposes content as Resources, Prompts, and Tools over stdio or Streamable HTTP
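The layering in step 1 behaves like successive map overrides, where each later layer replaces keys from the one before. A minimal sketch of that precedence (illustrative only, not the actual pkg/config code):

```go
package main

import "fmt"

// merge applies layers in order; later layers override earlier ones,
// matching the YAML -> env vars -> CLI flags precedence.
func merge(layers ...map[string]string) map[string]string {
	out := map[string]string{}
	for _, layer := range layers {
		for k, v := range layer {
			out[k] = v
		}
	}
	return out
}

func main() {
	yaml := map[string]string{"transport": "stdio", "addr": ":8080"}
	env := map[string]string{"transport": "http"}
	flags := map[string]string{"addr": ":9090"}

	cfg := merge(yaml, env, flags)
	fmt.Println(cfg["transport"], cfg["addr"]) // http :9090
}
```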

Shared Packages

| Package | Description |
|---------|-------------|
| pkg/config | Unified config loading: YAML → env vars → CLI flags |
| pkg/github | GitHub Contents API client with proxy and header pass-through |
| pkg/httputil | Proxy support, custom TLS/CA certificates, header propagation |
| pkg/optimizer | OpenAI-compatible LLM client for optional content consolidation |
| pkg/server | MCP server bootstrap and /healthz endpoint helpers |
| pkg/syncer | Background periodic sync for remote GitHub repositories |

Development

make build              # build all servers into ./bin/
make test               # run unit tests
make test-integration   # run integration tests (requires LLM_ENDPOINT)
make docker             # build Docker images for all servers
make lint               # run golangci-lint
make cover              # generate coverage report

Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.

License

This project is licensed under the MIT License — see the LICENSE file for details.

Directories

| Path | Synopsis |
|------|----------|
| pkg/cache | Package cache provides a generic in-memory list cache with TTL-based invalidation, designed to protect loaders from repeated expensive disk scans within a single agent turn. |
| pkg/config | Package config provides unified configuration loading for all MCP servers. |
| pkg/filter | Package filter provides keyword relevance scoring for MCP item filtering. |
| pkg/github | Package github provides a client for the GitHub Contents API. |
| pkg/glob | Package glob provides glob pattern matching with `**` support and brace expansion. |
| pkg/httputil | Package httputil provides shared HTTP infrastructure for proxy support, custom TLS, and header propagation across all MCP servers. |
| pkg/optimizer | Package optimizer uses an OpenAI-compatible LLM endpoint to merge, deduplicate, and consolidate multiple instruction files into a single coherent output. |
| pkg/server | Package server provides shared MCP server utilities. |
| pkg/syncer | Package syncer provides a reusable background sync goroutine. |
| pkg/testutil | Package testutil provides shared test helpers across the mcp monorepo. |
| servers/mcp-adr/internal/scanner | Package scanner provides on-demand access to Architecture Decision Records from local directories and GitHub repositories. |
| servers/mcp-graph/internal/graph | Package graph provides a thread-safe in-memory knowledge graph backed by a JSON file for persistence. |
| servers/mcp-instructions/internal/loader | Package loader provides on-demand access to Copilot custom instruction files from local directories and GitHub repositories. |
| servers/mcp-memory/internal/store | Package store provides a file-based persistent memory store. |
| servers/mcp-prompts/internal/loader | Package loader provides on-demand access to Copilot prompt files from local directories and GitHub repositories. |
| servers/mcp-skills/internal/scanner | Package scanner provides on-demand access to Copilot skill definitions from local directories and GitHub repositories. |
