README.md

@@ -65,204 +65,209 @@ The coordinator automatically manages external MCP servers based on configuration
}
```

## Setup

Choose one of these installation methods:

<details>
<summary>Docker</summary>

### 1. Start NATS Server

First, start a NATS server that the Agent Coordinator can connect to:

```bash
# Create the network first if it doesn't exist
docker network create agent-coordinator-net

# Start NATS server with persistent storage
docker run -d \
  --name nats-server \
  --network agent-coordinator-net \
  -p 4222:4222 \
  -p 8222:8222 \
  -v nats_data:/data \
  nats:2.10-alpine \
  --jetstream \
  --store_dir=/data \
  --max_mem_store=1Gb \
  --max_file_store=10Gb
```
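
Before configuring any clients, it is worth confirming the server actually came up. A quick check, assuming you kept the `-p 8222:8222` monitoring port mapping from the command above:

```bash
# The container should be up and logging
docker logs --tail 5 nats-server

# NATS serves HTTP monitoring endpoints on port 8222
curl -s http://localhost:8222/healthz
curl -s http://localhost:8222/varz | head -c 300
```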

### 2. Configure Your AI Tools

**For STDIO Mode (Recommended - Direct MCP Integration):**

Then add this configuration to your VS Code `mcp.json` configuration file via `ctrl + shift + p` → `MCP: Open User Configuration`, or `MCP: Open Remote User Configuration` if running on a remote server:

```json
{
  "servers": {
    "agent-coordinator": {
      "command": "docker",
      "args": [
        "run",
        "--network=agent-coordinator-net",
        "-v=./mcp_servers.json:/app/mcp_servers.json:ro",
        "-v=/path/to/your/workspace:/workspace:rw",
        "-e=NATS_HOST=nats-server",
        "-e=NATS_PORT=4222",
        "-i",
        "--rm",
        "ghcr.io/rooba/agentcoordinator:latest"
      ],
      "type": "stdio"
    }
  }
}
```
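
Because the stdio entry above starts a throwaway container (`--rm`) each time the client launches it, pre-pulling the image keeps the first session from stalling on a download:

```bash
docker pull ghcr.io/rooba/agentcoordinator:latest
```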

**Important Notes for File System Access:**

If you're using MCP filesystem servers, mount the directories they need access to:

```json
{
  "args": [
    "run",
    "--network=agent-coordinator-net",
    "-v=./mcp_servers.json:/app/mcp_servers.json:ro",
    "-v=/home/user/projects:/home/user/projects:rw",
    "-v=/path/to/workspace:/workspace:rw",
    "-e=NATS_HOST=nats-server",
    "-e=NATS_PORT=4222",
    "-i",
    "--rm",
    "ghcr.io/rooba/agentcoordinator:latest"
  ]
}
```

**For HTTP/WebSocket Mode (Alternative - Web API Access):**

If you prefer to run as a web service instead of stdio:

```bash
# Create network first
docker network create agent-coordinator-net

# Start NATS server
docker run -d \
  --name nats-server \
  --network agent-coordinator-net \
  -p 4222:4222 \
  -v nats_data:/data \
  nats:2.10-alpine \
  --jetstream \
  --store_dir=/data \
  --max_mem_store=1Gb \
  --max_file_store=10Gb

# Run Agent Coordinator in HTTP mode
docker run -d \
  --name agent-coordinator \
  --network agent-coordinator-net \
  -p 8080:4000 \
  -v ./mcp_servers.json:/app/mcp_servers.json:ro \
  -v /path/to/workspace:/workspace:rw \
  -e NATS_HOST=nats-server \
  -e NATS_PORT=4222 \
  -e MCP_INTERFACE_MODE=http \
  -e MCP_HTTP_PORT=4000 \
  ghcr.io/rooba/agentcoordinator:latest
```

Then access the HTTP API at `http://localhost:8080/mcp`, or configure your MCP client to use the HTTP endpoint.
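
As a quick smoke test, you can POST a JSON-RPC request directly. A minimal sketch, assuming the endpoint accepts standard MCP JSON-RPC 2.0 bodies and that `tools/list` needs no registered agent:

```bash
curl -s http://localhost:8080/mcp \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
```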

Create or edit `mcp_servers.json` in your project directory to configure external MCP servers:

```json
{
  "servers": {
    "mcp_filesystem": {
      "type": "stdio",
      "command": "bunx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/workspace"],
      "auto_restart": true
    }
  }
}
```
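
The coordinator reads this file from `/app/mcp_servers.json` inside the container (see the read-only mount above), so a quick local syntax check before starting can save a confusing failure. Either standard tool works:

```bash
jq . mcp_servers.json
# or, without jq installed:
python3 -m json.tool mcp_servers.json
```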

</details>

<details>
<summary>Manual Setup</summary>

### Prerequisites

- **Elixir**: 1.16+ with OTP 26+
- **Node.js**: 18+ (for some MCP servers)
- **uv**: If using Python MCP servers

### Clone the Repository

It is suggested to install Elixir (and Erlang) via [asdf](https://asdf-vm.com/) for easy version management.

NATS can be found at [nats.io](https://github.com/nats-io/nats-server/releases/latest), or via Docker.

```bash
git clone https://github.com/rooba/agentcoordinator.git
cd agentcoordinator
mix deps.get
mix compile
```

### Start the MCP Server directly

```bash
export MCP_INTERFACE_MODE=stdio   # or http / websocket
# export MCP_HTTP_PORT=4000       # if using http mode

./scripts/mcp_launcher.sh

# Or in development mode
mix run --no-halt
```
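
The same launcher can serve the HTTP interface instead of stdio; a short sketch using the two environment variables shown above:

```bash
# Run the coordinator as an HTTP service on port 4000
export MCP_INTERFACE_MODE=http
export MCP_HTTP_PORT=4000
./scripts/mcp_launcher.sh
```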

### Run via VS Code or similar tools

Add this to your `mcp.json` or `mcp_servers.json`, depending on your tool:

```json
{
  "servers": {
    "agent-coordinator": {
      "command": "/path/to/agent_coordinator/scripts/mcp_launcher.sh",
      "args": [],
      "env": {
        "MIX_ENV": "prod",
        "NATS_HOST": "localhost",
        "NATS_PORT": "4222"
      }
    }
  }
}
```

</details>

@@ -40,8 +40,10 @@ defmodule AgentCoordinator.ActivityTracker do
      "move_file" ->
        source = Map.get(args, "source")
        dest = Map.get(args, "destination")
        files = [source, dest] |> Enum.filter(& &1)

        {"Moving #{Path.basename(source || "file")} to #{Path.basename(dest || "destination")}",
         files}

      # VS Code operations
      "vscode_read_file" ->

@@ -54,6 +56,7 @@ defmodule AgentCoordinator.ActivityTracker do
      "vscode_set_editor_content" ->
        file_path = Map.get(args, "file_path")

        if file_path do
          {"Editing #{Path.basename(file_path)} in VS Code", [file_path]}
        else

@@ -114,6 +117,7 @@ defmodule AgentCoordinator.ActivityTracker do
      # Test operations
      "runTests" ->
        files = Map.get(args, "files", [])

        if files != [] do
          file_names = Enum.map(files, &Path.basename/1)
          {"Running tests in #{Enum.join(file_names, ", ")}", files}

@@ -153,6 +157,7 @@ defmodule AgentCoordinator.ActivityTracker do
      # HTTP/Web operations
      "fetch_webpage" ->
        urls = Map.get(args, "urls", [])

        if urls != [] do
          {"Fetching #{length(urls)} webpages", []}
        else

@@ -162,6 +167,7 @@ defmodule AgentCoordinator.ActivityTracker do
      # Development operations
      "get_errors" ->
        files = Map.get(args, "filePaths", [])

        if files != [] do
          file_names = Enum.map(files, &Path.basename/1)
          {"Checking errors in #{Enum.join(file_names, ", ")}", files}

@@ -180,6 +186,7 @@ defmodule AgentCoordinator.ActivityTracker do
      "elixir-docs" ->
        modules = Map.get(args, "modules", [])

        if modules != [] do
          {"Getting docs for #{Enum.join(modules, ", ")}", []}
        else

@@ -196,6 +203,7 @@ defmodule AgentCoordinator.ActivityTracker do
      "pylanceFileSyntaxErrors" ->
        file_uri = Map.get(args, "fileUri")

        if file_uri do
          file_path = uri_to_path(file_uri)
          {"Checking syntax errors in #{Path.basename(file_path)}", [file_path]}

@@ -268,10 +276,11 @@ defmodule AgentCoordinator.ActivityTracker do
  defp extract_file_path(args) do
    # Try various common parameter names for file paths
    args["path"] || args["filePath"] || args["file_path"] ||
      args["source"] || args["destination"] || args["fileUri"] |> uri_to_path()
  end

  defp uri_to_path(nil), do: nil

  defp uri_to_path(uri) when is_binary(uri) do
    if String.starts_with?(uri, "file://") do
      String.replace_prefix(uri, "file://", "")

@@ -55,21 +55,25 @@ defmodule AgentCoordinator.Agent do
    workspace_path = Keyword.get(opts, :workspace_path)

    # Use smart codebase identification
    codebase_id =
      case Keyword.get(opts, :codebase_id) do
        nil when workspace_path ->
          # Auto-detect from workspace
          case AgentCoordinator.CodebaseIdentifier.identify_codebase(workspace_path) do
            %{canonical_id: canonical_id} -> canonical_id
            _ -> Path.basename(workspace_path || "default")
          end

        nil ->
          "default"

        explicit_id ->
          # Normalize the provided ID
          AgentCoordinator.CodebaseIdentifier.normalize_codebase_reference(
            explicit_id,
            workspace_path
          )
      end

    %__MODULE__{
      id: UUID.uuid4(),

@@ -99,23 +103,21 @@ defmodule AgentCoordinator.Agent do
      timestamp: DateTime.utc_now()
    }

    new_history =
      [activity_entry | agent.activity_history]
      |> Enum.take(10)

    %{
      agent
      | current_activity: activity,
        current_files: files,
        activity_history: new_history,
        last_heartbeat: DateTime.utc_now()
    }
  end

  def clear_activity(agent) do
    %{agent | current_activity: nil, current_files: [], last_heartbeat: DateTime.utc_now()}
  end

  def assign_task(agent, task_id) do

@@ -12,15 +12,15 @@ defmodule AgentCoordinator.CodebaseIdentifier do
  require Logger

  @type codebase_info :: %{
          canonical_id: String.t(),
          display_name: String.t(),
          workspace_path: String.t(),
          repository_url: String.t() | nil,
          git_remote: String.t() | nil,
          branch: String.t() | nil,
          commit_hash: String.t() | nil,
          identification_method: :git_remote | :git_local | :folder_name | :custom
        }

  @doc """
  Identify a codebase from a workspace path, generating a canonical ID.

@@ -56,6 +56,7 @@ defmodule AgentCoordinator.CodebaseIdentifier do
  }
  """
  def identify_codebase(workspace_path, opts \\ [])

  def identify_codebase(nil, opts) do
    custom_id = Keyword.get(opts, :custom_id, "default")
    build_custom_codebase_info(nil, custom_id)

@@ -128,15 +129,16 @@ defmodule AgentCoordinator.CodebaseIdentifier do
  defp identify_git_codebase(workspace_path) do
    with {:ok, git_info} <- get_git_info(workspace_path) do
      canonical_id =
        case git_info.remote_url do
          nil ->
            # Local git repo without remote
            "git-local:#{git_info.repo_name}"

          remote_url ->
            # Extract canonical identifier from remote URL
            extract_canonical_from_remote(remote_url)
        end

      %{
        canonical_id: canonical_id,

@@ -183,6 +185,7 @@ defmodule AgentCoordinator.CodebaseIdentifier do
  end

  defp git_repository?(workspace_path) when is_nil(workspace_path), do: false

  defp git_repository?(workspace_path) do
    File.exists?(Path.join(workspace_path, ".git"))
  end

@@ -201,26 +204,34 @@ defmodule AgentCoordinator.CodebaseIdentifier do
    commit_hash = String.trim(commit_hash)

    # Try to get remote URL
    {remote_info, _remote_result_use_me?} =
      case System.cmd("git", ["remote", "-v"], cd: workspace_path) do
        {output, 0} when output != "" ->
          # Parse remote output to extract origin URL
          lines = String.split(String.trim(output), "\n")

          origin_line =
            Enum.find(lines, fn line ->
              String.starts_with?(line, "origin") and String.contains?(line, "(fetch)")
            end)

          case origin_line do
            nil ->
              {nil, :no_origin}

            line ->
              # Extract URL from "origin <url> (fetch)"
              url =
                line
                |> String.split()
                |> Enum.at(1)

              {url, :ok}
          end

        _ ->
          {nil, :no_remotes}
      end

    git_info = %{
      repo_name: repo_name,

@@ -267,6 +278,7 @@ defmodule AgentCoordinator.CodebaseIdentifier do
    case Regex.run(regex, url) do
      [_, owner, repo] ->
        "github.com/#{owner}/#{repo}"

      _ ->
        "github.com/unknown"
    end

@@ -279,6 +291,7 @@ defmodule AgentCoordinator.CodebaseIdentifier do
    case Regex.run(regex, url) do
      [_, owner, repo] ->
        "gitlab.com/#{owner}/#{repo}"

      _ ->
        "gitlab.com/unknown"
    end

@@ -14,11 +14,11 @@ defmodule AgentCoordinator.HttpInterface do
  require Logger
  alias AgentCoordinator.{MCPServer, ToolFilter, SessionManager}

  plug(Plug.Logger)
  plug(:match)
  plug(Plug.Parsers, parsers: [:json], json_decoder: Jason)
  plug(:put_cors_headers)
  plug(:dispatch)

  @doc """
  Start the HTTP server on the specified port.

@@ -109,9 +109,10 @@ defmodule AgentCoordinator.HttpInterface do
    all_tools = MCPServer.get_tools()
    filtered_tools = ToolFilter.filter_tools(all_tools, context)

    tool_allowed =
      Enum.any?(filtered_tools, fn tool ->
        Map.get(tool, "name") == tool_name
      end)

    if not tool_allowed do
      send_json_response(conn, 403, %{

@@ -159,6 +160,7 @@ defmodule AgentCoordinator.HttpInterface do
      unexpected ->
        IO.puts(:stderr, "Unexpected MCP response: #{inspect(unexpected)}")

        send_json_response(conn, 500, %{
          error: %{
            code: -32603,

@@ -187,6 +189,7 @@ defmodule AgentCoordinator.HttpInterface do
    case method do
      "tools/call" ->
        tool_name = get_in(enhanced_request, ["params", "name"])

        if tool_allowed_for_context?(tool_name, context) do
          execute_mcp_request(conn, enhanced_request, context)
        else

@@ -275,20 +278,25 @@ defmodule AgentCoordinator.HttpInterface do
    case validate_session_for_method("stream/subscribe", conn, context) do
      {:ok, session_info} ->
        # Set up SSE headers
        conn =
          conn
          |> put_resp_content_type("text/event-stream")
          |> put_mcp_headers()
          |> put_resp_header("cache-control", "no-cache")
          |> put_resp_header("connection", "keep-alive")
          |> put_resp_header("access-control-allow-credentials", "true")
          |> send_chunked(200)

        # Send initial connection event
        {:ok, conn} =
          chunk(
            conn,
            format_sse_event("connected", %{
              session_id: Map.get(session_info, :agent_id, "anonymous"),
              protocol_version: "2025-06-18",
              timestamp: DateTime.utc_now() |> DateTime.to_iso8601()
            })
          )

        # Start streaming loop
        stream_mcp_events(conn, session_info, context)

@@ -307,10 +315,15 @@ defmodule AgentCoordinator.HttpInterface do
    # Send periodic heartbeat for now
    try do
      :timer.sleep(1000)

      {:ok, conn} =
        chunk(
          conn,
          format_sse_event("heartbeat", %{
            timestamp: DateTime.utc_now() |> DateTime.to_iso8601(),
            session_id: Map.get(session_info, :agent_id, "anonymous")
          })
        )

      # Continue streaming (this would be event-driven in production)
      stream_mcp_events(conn, session_info, context)

@@ -347,10 +360,11 @@ defmodule AgentCoordinator.HttpInterface do
  defp cowboy_dispatch do
    [
      {:_,
       [
         {"/mcp/ws", AgentCoordinator.WebSocketHandler, []},
         {:_, Plug.Cowboy.Handler, {__MODULE__, []}}
       ]}
    ]
  end

@@ -379,8 +393,10 @@ defmodule AgentCoordinator.HttpInterface do
    cond do
      forwarded_for ->
        forwarded_for |> String.split(",") |> List.first() |> String.trim()

      real_ip ->
        real_ip

      true ->
        conn.remote_ip |> :inet.ntoa() |> to_string()
    end

@@ -394,27 +410,37 @@ defmodule AgentCoordinator.HttpInterface do
    conn
    |> put_resp_header("access-control-allow-origin", allowed_origin)
    |> put_resp_header("access-control-allow-methods", "GET, POST, OPTIONS")
    |> put_resp_header(
      "access-control-allow-headers",
      "content-type, authorization, mcp-session-id, mcp-protocol-version, x-session-id"
    )
    |> put_resp_header("access-control-expose-headers", "mcp-protocol-version, server")
    |> put_resp_header("access-control-max-age", "86400")
  end

  # No origin header (direct API calls)
  defp validate_origin(nil), do: "*"

  defp validate_origin(origin) do
    # Allow localhost and development origins
    case URI.parse(origin) do
      %URI{host: host} when host in ["localhost", "127.0.0.1", "::1"] ->
        origin

      %URI{host: host} when is_binary(host) ->
        # Allow HTTPS origins and known development domains
        if String.starts_with?(origin, "https://") or
             String.contains?(host, ["localhost", "127.0.0.1", "dev", "local"]) do
          origin
        else
          # For production, be more restrictive
          IO.puts(:stderr, "Potentially unsafe origin: #{origin}")
          # Fallback for now, could be more restrictive
          "*"
        end

      _ ->
        "*"
    end
  end
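
To see the origin validation above in action from the outside, send any request with an `Origin` header and inspect the response headers. A sketch, assuming the HTTP interface from the README is listening on port 8080 (the CORS plug runs before dispatch, so every response should carry these headers):

```bash
curl -si http://localhost:8080/mcp \
  -H 'Origin: http://localhost:3000' \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}' \
  | grep -i '^access-control'
```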

@@ -434,9 +460,10 @@ defmodule AgentCoordinator.HttpInterface do
  defp validate_mcp_request(params) when is_map(params) do
    required_fields = ["jsonrpc", "method"]

    missing_fields =
      Enum.filter(required_fields, fn field ->
        not Map.has_key?(params, field)
      end)

    cond do
      not Enum.empty?(missing_fields) ->

@@ -460,15 +487,16 @@ defmodule AgentCoordinator.HttpInterface do
    {session_id, session_info} = get_session_info(conn)

    # Add context metadata to request params
    enhanced_params =
      Map.get(mcp_request, "params", %{})
      |> Map.put("_session_id", session_id)
      |> Map.put("_session_info", session_info)
      |> Map.put("_client_context", %{
        connection_type: context.connection_type,
        security_level: context.security_level,
        remote_ip: get_remote_ip(conn),
        user_agent: context.user_agent
      })

    Map.put(mcp_request, "params", enhanced_params)
  end

@@ -479,17 +507,21 @@ defmodule AgentCoordinator.HttpInterface do
      [session_token] when byte_size(session_token) > 0 ->
        case SessionManager.validate_session(session_token) do
          {:ok, session_info} ->
            {session_info.agent_id,
             %{
               token: session_token,
               agent_id: session_info.agent_id,
               capabilities: session_info.capabilities,
               expires_at: session_info.expires_at,
               validated: true
             }}

          {:error, reason} ->
            IO.puts(:stderr, "Invalid MCP session token: #{reason}")
            # Fall back to generating anonymous session
            anonymous_id =
              "http_anonymous_" <> (:crypto.strong_rand_bytes(8) |> Base.encode16(case: :lower))

            {anonymous_id, %{validated: false, reason: reason}}
        end

@@ -498,9 +530,12 @@ defmodule AgentCoordinator.HttpInterface do
    case get_req_header(conn, "x-session-id") do
      [session_id] when byte_size(session_id) > 0 ->
        {session_id, %{validated: false, legacy: true}}

      _ ->
        # No session header, generate anonymous session
        anonymous_id =
          "http_anonymous_" <> (:crypto.strong_rand_bytes(8) |> Base.encode16(case: :lower))

        {anonymous_id, %{validated: false, anonymous: true}}
    end
  end

@@ -512,27 +547,31 @@ defmodule AgentCoordinator.HttpInterface do
    case Map.get(session_info, :validated, false) do
      true ->
        {:ok, session_info}

      false ->
        reason = Map.get(session_info, :reason, "Session not authenticated")

        {:error,
         %{
           code: -32001,
           message: "Authentication required",
           data: %{reason: reason}
         }}
    end
  end

  defp validate_session_for_method(method, conn, context) do
    # Define which methods require authenticated sessions
    authenticated_methods =
      MapSet.new([
        "agents/register",
        "agents/unregister",
        "agents/heartbeat",
        "tasks/create",
        "tasks/complete",
        "codebase/register",
        "stream/subscribe"
      ])

    if MapSet.member?(authenticated_methods, method) do
      require_authenticated_session(conn, context)

@@ -560,6 +599,7 @@ defmodule AgentCoordinator.HttpInterface do
      unexpected ->
        IO.puts(:stderr, "Unexpected MCP response: #{inspect(unexpected)}")

        send_json_response(conn, 500, %{
          jsonrpc: "2.0",
          id: Map.get(mcp_request, "id"),

@@ -102,7 +102,10 @@ defmodule AgentCoordinator.InterfaceManager do
      metrics: initialize_metrics()
    }

    IO.puts(
      :stderr,
      "Interface Manager starting with config: #{inspect(config.enabled_interfaces)}"
    )

    # Start enabled interfaces
    {:ok, state, {:continue, :start_interfaces}}

@@ -111,17 +114,18 @@ defmodule AgentCoordinator.InterfaceManager do
  @impl GenServer
  def handle_continue(:start_interfaces, state) do
    # Start each enabled interface
    updated_state =
      Enum.reduce(state.config.enabled_interfaces, state, fn interface_type, acc ->
        case start_interface_server(interface_type, state.config, acc) do
          {:ok, interface_info} ->
            IO.puts(:stderr, "Started #{interface_type} interface")
            %{acc | interfaces: Map.put(acc.interfaces, interface_type, interface_info)}

          {:error, reason} ->
            IO.puts(:stderr, "Failed to start #{interface_type} interface: #{reason}")
            acc
        end
      end)

    {:noreply, updated_state}
  end

@@ -224,9 +228,11 @@ defmodule AgentCoordinator.InterfaceManager do
  @impl GenServer
  def handle_call(:get_metrics, _from, state) do
    # Collect metrics from all running interfaces
    interface_metrics =
      Enum.map(state.interfaces, fn {interface_type, interface_info} ->
        {interface_type, get_interface_metrics(interface_type, interface_info)}
      end)
      |> Enum.into(%{})

    metrics = %{
      interfaces: interface_metrics,

@@ -369,11 +375,21 @@ defmodule AgentCoordinator.InterfaceManager do
    interface_mode = System.get_env("MCP_INTERFACE_MODE", "stdio")

    case interface_mode do
      "stdio" ->
        [:stdio]

      "http" ->
        [:http]

      "websocket" ->
        [:websocket]

      "all" ->
        [:stdio, :http, :websocket]

      "remote" ->
        [:http, :websocket]

      _ ->
        # Check for comma-separated list
        if String.contains?(interface_mode, ",") do
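
For reference, these clauses map directly onto the `MCP_INTERFACE_MODE` values used in the README; a sketch of the less obvious combinations, assuming the same launcher script:

```bash
# Serve stdio, HTTP and WebSocket from one process
MCP_INTERFACE_MODE=all ./scripts/mcp_launcher.sh

# Or pick a subset via the comma-separated form handled in the fallback clause
MCP_INTERFACE_MODE=http,websocket ./scripts/mcp_launcher.sh
```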

@@ -400,14 +416,17 @@ defmodule AgentCoordinator.InterfaceManager do
  defp update_http_config_from_env(config) do
    config =
      case System.get_env("MCP_HTTP_PORT") do
        nil ->
          config

        port_str ->
          case Integer.parse(port_str) do
            {port, ""} -> put_in(config, [:http, :port], port)
            _ -> config
          end
      end

    case System.get_env("MCP_HTTP_HOST") do
      nil -> config

@@ -472,7 +491,8 @@ defmodule AgentCoordinator.InterfaceManager do
    # WebSocket is handled by the HTTP server, so just mark it as enabled
    interface_info = %{
      type: :websocket,
      # Embedded in HTTP server
      pid: :embedded,
      started_at: DateTime.utc_now(),
      config: config.websocket
    }

@@ -583,16 +603,18 @@ defmodule AgentCoordinator.InterfaceManager do
            "message" => "Parse error: #{Exception.message(e)}"
          }
        }

        IO.puts(Jason.encode!(error_response))

      e ->
        # Try to get the ID from the malformed request
        id =
          try do
            partial = Jason.decode!(json_line)
            Map.get(partial, "id")
          rescue
            _ -> nil
          end

        error_response = %{
          "jsonrpc" => "2.0",

@@ -602,6 +624,7 @@ defmodule AgentCoordinator.InterfaceManager do
            "message" => "Internal error: #{Exception.message(e)}"
          }
        }

        IO.puts(Jason.encode!(error_response))
    end
  end

@@ -628,7 +651,8 @@ defmodule AgentCoordinator.InterfaceManager do
  defp get_interface_metrics(:websocket, interface_info) do
    %{
      type: :websocket,
      # Embedded in HTTP server
      status: :running,
      uptime: DateTime.diff(DateTime.utc_now(), interface_info.started_at, :second),
      embedded: true
    }

@@ -678,17 +702,17 @@ defmodule AgentCoordinator.InterfaceManager do
  # Check if running in Docker environment
  defp docker_environment? do
    # Check common Docker environment indicators
    System.get_env("DOCKER_CONTAINER") != nil or
      System.get_env("container") != nil or
      System.get_env("DOCKERIZED") != nil or
      File.exists?("/.dockerenv") or
      (File.exists?("/proc/1/cgroup") and
         File.read!("/proc/1/cgroup") |> String.contains?("docker")) or
      String.contains?(to_string(System.get_env("PATH", "")), "/app/") or
      case File.read("/proc/1/comm") do
        {:ok, comm} -> String.trim(comm) in ["bash", "sh", "docker-init", "tini"]
        _ -> false
      end
  end
end

@@ -11,7 +11,18 @@ defmodule AgentCoordinator.MCPServer do
  use GenServer
  require Logger

  alias AgentCoordinator.{
    TaskRegistry,
    Inbox,
    Agent,
    Task,
    CodebaseRegistry,
    VSCodeToolProvider,
    ToolFilter,
    SessionManager,
    ActivityTracker
  }

  # State for tracking external servers and agent sessions
  defstruct [

@@ -26,7 +37,8 @@ defmodule AgentCoordinator.MCPServer do
  @mcp_tools [
    %{
      "name" => "register_agent",
      "description" =>
        "Register a new agent with the coordination system. Each agent must choose a unique identifier (e.g., 'Green Platypus', 'Blue Koala') and include their agent_id in all subsequent tool calls to identify themselves.",
      "inputSchema" => %{
        "type" => "object",
        "properties" => %{

@@ -38,7 +50,11 @@ defmodule AgentCoordinator.MCPServer do
            "enum" => ["coding", "testing", "documentation", "analysis", "review"]
          }
        },
        "codebase_id" => %{
          "type" => "string",
          "description" =>
            "If the project is found locally on the machine, use the name of the directory in which you are currently at (.). If it is remote, use the git registered codebase ID, if it is a multicodebase project, and there is no apparently folder to base as the rootmost -- ask."
        },
        "workspace_path" => %{"type" => "string"},
        "cross_codebase_capable" => %{"type" => "boolean"}
      },
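
For orientation, this schema is what a client fills in when it registers; a rough example over the HTTP interface from the README, assuming the usual MCP `tools/call` envelope (the agent name here is made up):

```bash
curl -s http://localhost:8080/mcp \
  -H 'Content-Type: application/json' \
  -d '{
        "jsonrpc": "2.0",
        "id": 2,
        "method": "tools/call",
        "params": {
          "name": "register_agent",
          "arguments": {
            "name": "Green Platypus",
            "capabilities": ["coding", "testing"],
            "workspace_path": "/workspace"
          }
        }
      }'
```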

@@ -71,7 +87,11 @@ defmodule AgentCoordinator.MCPServer do
          "title" => %{"type" => "string"},
          "description" => %{"type" => "string"},
          "priority" => %{"type" => "string", "enum" => ["low", "normal", "high", "urgent"]},
          "codebase_id" => %{
            "type" => "string",
            "description" =>
              "If the project is found locally on the machine, use the name of the directory in which you are currently at (.). If it is remote, use the git registered codebase ID, if it is a multicodebase project, and there is no apparently folder to base as the rootmost -- ask."
          },
          "file_paths" => %{"type" => "array", "items" => %{"type" => "string"}},
          "required_capabilities" => %{
            "type" => "array",

@@ -110,7 +130,13 @@ defmodule AgentCoordinator.MCPServer do
            "enum" => ["sequential", "parallel", "leader_follower"]
          }
        },
        "required" => [
          "agent_id",
          "title",
          "description",
          "primary_codebase_id",
          "affected_codebases"
        ]
      }
    },
    %{

@@ -334,7 +360,8 @@ defmodule AgentCoordinator.MCPServer do
    },
    %{
      "name" => "discover_codebase_info",
      "description" =>
        "Intelligently discover codebase information from workspace path, including git repository details, canonical ID generation, and project identification.",
      "inputSchema" => %{
        "type" => "object",
        "properties" => %{

@@ -422,8 +449,9 @@ defmodule AgentCoordinator.MCPServer do
    tool_name = Map.get(request, "params", %{}) |> Map.get("name")

    # Allow certain MCP system calls and register_agent to proceed without agent_id
    allowed_without_agent =
      method in ["initialize", "tools/list", "notifications/initialized"] or
        (method == "tools/call" and tool_name == "register_agent")

    IO.puts(:stderr, "#{method} #{inspect(request)} #{tool_name}")

@@ -434,6 +462,7 @@ defmodule AgentCoordinator.MCPServer do
    else
      # Log the rejected call for debugging
      IO.puts(:stderr, "Rejected call without agent_id: method=#{method}, tool=#{tool_name}")

      error_response = %{
        "jsonrpc" => "2.0",
        "id" => Map.get(request, "id"),

@@ -442,6 +471,7 @@ defmodule AgentCoordinator.MCPServer do
          "message" => error_message
        }
      }

      {:reply, error_response, state}
    end

@@ -466,15 +496,17 @@ defmodule AgentCoordinator.MCPServer do
      update_session_activity(agent_context[:agent_id])

      # Add heartbeat metadata to successful responses
      enhanced_response =
        case response do
          %{"result" => _} = success ->
            Map.put(success, "_heartbeat_metadata", %{
              agent_id: agent_context[:agent_id],
              timestamp: DateTime.utc_now()
            })

          error_result ->
            error_result
        end

      {:reply, enhanced_response, state}
    else

@@ -487,10 +519,12 @@ defmodule AgentCoordinator.MCPServer do
    all_tools = get_all_unified_tools_from_state(state)

    # Apply tool filtering if client context is provided
    filtered_tools =
      case client_context do
        # No filtering for nil context (backward compatibility)
        nil -> all_tools
        context -> ToolFilter.filter_tools(all_tools, context)
      end

    {:reply, filtered_tools, state}
  end

@@ -624,8 +658,12 @@ defmodule AgentCoordinator.MCPServer do
    # Start inbox for the agent (handle already started case)
    case Inbox.start_link(agent.id) do
      {:ok, _pid} ->
        :ok

      {:error, {:already_started, _pid}} ->
        :ok

      {:error, reason} ->
        IO.puts(:stderr, "Failed to start inbox for agent #{agent.id}: #{inspect(reason)}")
        :ok

@@ -645,13 +683,14 @@ defmodule AgentCoordinator.MCPServer do
        # Track the session if we have caller info
        track_agent_session(agent.id, name, capabilities)

        {:ok,
         %{
           agent_id: agent.id,
           codebase_id: agent.codebase_id,
           status: "registered",
           session_token: session_token,
           expires_at: DateTime.add(DateTime.utc_now(), 60, :minute) |> DateTime.to_iso8601()
         }}

      {:error, reason} ->
        IO.puts(:stderr, "Failed to create session for agent #{agent.id}: #{inspect(reason)}")

@@ -1136,7 +1175,9 @@ defmodule AgentCoordinator.MCPServer do
  # NEW: Codebase discovery function

  defp discover_codebase_info(
         %{"agent_id" => agent_id, "workspace_path" => workspace_path} = args
       ) do
    custom_id = Map.get(args, "custom_id")

    # Use the CodebaseIdentifier to analyze the workspace

@@ -1145,32 +1186,37 @@ defmodule AgentCoordinator.MCPServer do
    case AgentCoordinator.CodebaseIdentifier.identify_codebase(workspace_path, opts) do
      codebase_info ->
        # Also check if this codebase is already registered
        existing_codebase =
          case CodebaseRegistry.get_codebase(codebase_info.canonical_id) do
            {:ok, codebase} -> codebase
            {:error, :not_found} -> nil
          end

        # Check for other agents working on same codebase
        agents = TaskRegistry.list_agents()

        related_agents =
          Enum.filter(agents, fn agent ->
            agent.codebase_id == codebase_info.canonical_id and agent.id != agent_id
          end)

        response = %{
          codebase_info: codebase_info,
          already_registered: existing_codebase != nil,
          existing_codebase: existing_codebase,
          related_agents:
            Enum.map(related_agents, fn agent ->
              %{
                agent_id: agent.id,
                name: agent.name,
                capabilities: agent.capabilities,
                status: agent.status,
                workspace_path: agent.workspace_path,
                online: Agent.is_online?(agent)
              }
            end),
          recommendations:
            generate_codebase_recommendations(codebase_info, existing_codebase, related_agents)
        }

        {:ok, response}

@@ -1181,26 +1227,39 @@ defmodule AgentCoordinator.MCPServer do
    recommendations = []

    # Recommend registration if not already registered
    recommendations =
      if existing_codebase == nil do
        [
          "Consider registering this codebase with register_codebase for better coordination"
          | recommendations
        ]
      else
        recommendations
      end

    # Recommend coordination if other agents are working on same codebase
    recommendations =
      if length(related_agents) > 0 do
        agent_names = Enum.map(related_agents, & &1.name) |> Enum.join(", ")

        [
          "Other agents working on this codebase: #{agent_names}. Consider coordination."
          | recommendations
        ]
      else
        recommendations
      end

    # Recommend git setup if local folder without git
    recommendations =
      if codebase_info.identification_method == :folder_name do
        [
          "Consider initializing git repository for better distributed coordination"
          | recommendations
        ]
      else
        recommendations
      end

    Enum.reverse(recommendations)
  end

@@ -1343,7 +1402,11 @@ defmodule AgentCoordinator.MCPServer do
        {:ok, tools}

      {:ok, unexpected} ->
        IO.puts(
          :stderr,
          "Unexpected tools response from #{server_info.name}: #{inspect(unexpected)}"
        )

        {:ok, []}

      {:error, reason} ->

@@ -1369,7 +1432,11 @@ defmodule AgentCoordinator.MCPServer do
        {:ok, response}

      {:error, %Jason.DecodeError{} = error} ->
        IO.puts(
          :stderr,
          "JSON decode error for server #{server_info.name}: #{Exception.message(error)}"
        )

        {:error, "JSON decode error: #{Exception.message(error)}"}
    end
  end

@@ -1379,9 +1446,11 @@ defmodule AgentCoordinator.MCPServer do
    receive do
      {^port, {:data, data}} ->
        new_acc = acc <> data

        case extract_json_from_data(new_acc) do
          {json_message, _remaining} when json_message != nil ->
            json_message

          {nil, remaining} ->
            collect_external_response(port, remaining, timeout)
        end

@@ -1403,6 +1472,7 @@ defmodule AgentCoordinator.MCPServer do
    case json_lines do
      [] ->
        last_line = List.last(lines) || ""

        if String.trim(last_line) != "" and not String.ends_with?(data, "\n") do
          {nil, last_line}
        else

@@ -1411,6 +1481,7 @@ defmodule AgentCoordinator.MCPServer do
      _ ->
        json_data = Enum.join(json_lines, "\n")

        case Jason.decode(json_data) do
          {:ok, _} -> {json_data, ""}
          {:error, _} -> {nil, data}

@@ -1461,36 +1532,45 @@ defmodule AgentCoordinator.MCPServer do
      required = Map.get(input_schema, "required", [])

      # Add agent_id to properties if not already present
      updated_properties =
        Map.put_new(properties, "agent_id", %{
          "type" => "string",
          "description" => "ID of the agent making the tool call"
        })

      # Add agent_id to required fields if not already present
      updated_required =
        if "agent_id" in required do
          required
        else
          ["agent_id" | required]
        end

      # Update the input schema
      updated_schema =
        Map.merge(input_schema, %{
          "properties" => updated_properties,
          "required" => updated_required
        })

      # Return the tool with updated schema
      Map.put(tool, "inputSchema", updated_schema)
    rescue
      error ->
        IO.puts(
          :stderr,
          "Failed to transform tool schema for #{inspect(tool)}: #{inspect(error)}"
        )

        # Return original tool if transformation fails
        tool
    end
  end

  defp transform_external_tool_schema(tool) do
    IO.puts(:stderr, "Received non-map tool: #{inspect(tool)}")
    # Return as-is if not a map
    tool
  end

  defp create_external_pid_file(server_name, os_pid) do

@@ -1540,20 +1620,21 @@ defmodule AgentCoordinator.MCPServer do
    # Check if it's a coordinator tool first
    coordinator_tool_names = Enum.map(@mcp_tools, & &1["name"])

    result =
      cond do
        tool_name in coordinator_tool_names ->
          handle_coordinator_tool(tool_name, args)

        # Check if it's a VS Code tool
        String.starts_with?(tool_name, "vscode_") ->
          # Route to VS Code Tool Provider with agent context
          context = if agent_id, do: %{agent_id: agent_id}, else: %{}
          VSCodeToolProvider.handle_tool_call(tool_name, args, context)

        true ->
          # Try to route to external server
          route_to_external_server(tool_name, args, state)
      end

    # Clear agent activity after tool call completes (optional - could keep until next call)
    # if agent_id do

@@ -1616,9 +1697,10 @@ defmodule AgentCoordinator.MCPServer do
    case result do
      %{"content" => content} when is_list(content) ->
        # Return the first text content for simplicity
        text_content =
          Enum.find(content, fn item ->
            Map.get(item, "type") == "text"
          end)

        if text_content do
          case Jason.decode(Map.get(text_content, "text", "{}")) do

@@ -1672,7 +1754,9 @@ defmodule AgentCoordinator.MCPServer do
  defp update_session_activity(agent_id) do
    case Process.get({:agent_session, agent_id}) do
      nil ->
        :ok

      session_info ->
        updated_session = %{session_info | last_activity: DateTime.utc_now()}
        Process.put({:agent_session, agent_id}, updated_session)

@@ -1684,7 +1768,8 @@ defmodule AgentCoordinator.MCPServer do
    # For system calls, don't require agent_id
    if method in ["initialize", "tools/list", "notifications/initialized"] do
      # System call, no agent context needed
      %{agent_id: nil}
    else
      # Try to get agent_id from various sources for non-system calls
      cond do

@@ -1697,7 +1782,8 @@ defmodule AgentCoordinator.MCPServer do
        # If no explicit agent_id, return error - agents must register first
        true ->
          {:error,
           "Missing agent_id. Agents must register themselves using register_agent before calling other tools."}
      end
    end
  end

@@ -1709,11 +1795,14 @@ defmodule AgentCoordinator.MCPServer do
    try do
      case Jason.decode!(File.read!(config_file)) do
        %{"servers" => servers} ->
          normalized_servers =
            Enum.into(servers, %{}, fn {name, config} ->
              normalized_config = normalize_server_config(config)
              {name, normalized_config}
            end)

          %{servers: normalized_servers}

        _ ->
          get_default_server_config()
      end
@@ -136,7 +136,12 @@ defmodule AgentCoordinator.SessionManager do
|
||||
session_data ->
|
||||
new_sessions = Map.delete(state.sessions, session_token)
|
||||
new_state = %{state | sessions: new_sessions}
|
||||
IO.puts(:stderr, "Invalidated session #{session_token} for agent #{session_data.agent_id}")
|
||||
|
||||
IO.puts(
|
||||
:stderr,
|
||||
"Invalidated session #{session_token} for agent #{session_data.agent_id}"
|
||||
)
|
||||
|
||||
{:reply, :ok, new_state}
|
||||
end
|
||||
end
|
||||
|
||||
@@ -20,22 +20,28 @@ defmodule AgentCoordinator.ToolFilter do
|
||||
Context information about the client connection.
|
||||
"""
|
||||
defstruct [
|
||||
:connection_type, # :local, :remote, :web
|
||||
:client_info, # Client identification
|
||||
:capabilities, # Client declared capabilities
|
||||
:security_level, # :trusted, :sandboxed, :restricted
|
||||
:origin, # For web clients, the origin domain
|
||||
:user_agent # Client user agent string
|
||||
# :local, :remote, :web
|
||||
:connection_type,
|
||||
# Client identification
|
||||
:client_info,
|
||||
# Client declared capabilities
|
||||
:capabilities,
|
||||
# :trusted, :sandboxed, :restricted
|
||||
:security_level,
|
||||
# For web clients, the origin domain
|
||||
:origin,
|
||||
# Client user agent string
|
||||
:user_agent
|
||||
]
|
||||
|
||||
@type client_context :: %__MODULE__{
|
||||
connection_type: :local | :remote | :web,
|
||||
client_info: map(),
|
||||
capabilities: [String.t()],
|
||||
security_level: :trusted | :sandboxed | :restricted,
|
||||
origin: String.t() | nil,
|
||||
user_agent: String.t() | nil
|
||||
}
|
||||
connection_type: :local | :remote | :web,
|
||||
client_info: map(),
|
||||
capabilities: [String.t()],
|
||||
security_level: :trusted | :sandboxed | :restricted,
|
||||
origin: String.t() | nil,
|
||||
user_agent: String.t() | nil
|
||||
}
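For reference, the struct and typespec above describe the client context this filter works with. A minimal construction might look like the following (a sketch only: the alias and all field values are assumptions for illustration, not code from this commit):

```elixir
# Hypothetical client context; only the field names and types come from the diff.
alias AgentCoordinator.ToolFilter

context = %ToolFilter{
  connection_type: :web,
  client_info: %{"name" => "browser-client"},
  capabilities: ["tools/call"],
  security_level: :sandboxed,
  origin: "https://example.com",
  user_agent: "Mozilla/5.0"
}
```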

# Tool name patterns that indicate local-only functionality (defined as function to avoid compilation issues)
defp local_only_patterns do
@@ -198,12 +204,16 @@ defmodule AgentCoordinator.ToolFilter do
description = Map.get(tool, "description", "")

# Check against known local-only tool names
name_is_local = tool_name in get_local_only_tool_names() or
Enum.any?(local_only_patterns(), &Regex.match?(&1, tool_name))
name_is_local =
tool_name in get_local_only_tool_names() or
Enum.any?(local_only_patterns(), &Regex.match?(&1, tool_name))

# Check description for local-only indicators
description_is_local = String.contains?(String.downcase(description),
["filesystem", "file system", "vscode", "terminal", "local file", "directory"])
description_is_local =
String.contains?(
String.downcase(description),
["filesystem", "file system", "vscode", "terminal", "local file", "directory"]
)

# Check tool schema for local-only parameters
schema_is_local = has_local_only_parameters?(tool)
@@ -214,19 +224,39 @@ defmodule AgentCoordinator.ToolFilter do
defp get_local_only_tool_names do
[
# Filesystem tools
"read_file", "write_file", "create_file", "delete_file",
"list_directory", "search_files", "move_file", "get_file_info",
"list_allowed_directories", "directory_tree", "edit_file",
"read_text_file", "read_multiple_files", "read_media_file",
"read_file",
"write_file",
"create_file",
"delete_file",
"list_directory",
"search_files",
"move_file",
"get_file_info",
"list_allowed_directories",
"directory_tree",
"edit_file",
"read_text_file",
"read_multiple_files",
"read_media_file",

# VSCode tools
"vscode_create_file", "vscode_write_file", "vscode_read_file",
"vscode_delete_file", "vscode_list_directory", "vscode_get_active_editor",
"vscode_set_editor_content", "vscode_get_selection", "vscode_set_selection",
"vscode_show_message", "vscode_run_command", "vscode_get_workspace_folders",
"vscode_create_file",
"vscode_write_file",
"vscode_read_file",
"vscode_delete_file",
"vscode_list_directory",
"vscode_get_active_editor",
"vscode_set_editor_content",
"vscode_get_selection",
"vscode_set_selection",
"vscode_show_message",
"vscode_run_command",
"vscode_get_workspace_folders",

# Terminal/process tools
"run_in_terminal", "get_terminal_output", "terminal_last_command",
"run_in_terminal",
"get_terminal_output",
"terminal_last_command",
"terminal_selection"
]
end
@@ -238,8 +268,10 @@ defmodule AgentCoordinator.ToolFilter do
# Look for file path parameters or other local indicators
Enum.any?(properties, fn {param_name, param_schema} ->
param_name in ["path", "filePath", "file_path", "directory", "workspace_path"] or
String.contains?(Map.get(param_schema, "description", ""),
["file path", "directory", "workspace", "local"])
String.contains?(
Map.get(param_schema, "description", ""),
["file path", "directory", "workspace", "local"]
)
end)
end

@@ -251,20 +283,25 @@ defmodule AgentCoordinator.ToolFilter do
Map.get(connection_info, :remote_ip) == "127.0.0.1" -> :local
Map.get(connection_info, :remote_ip) == "::1" -> :local
Map.has_key?(connection_info, :remote_ip) -> :remote
true -> :local # Default to local for stdio
# Default to local for stdio
true -> :local
end
end

defp determine_security_level(connection_type, connection_info) do
case connection_type do
:local -> :trusted
:local ->
:trusted

:remote ->
if Map.get(connection_info, :secure, false) do
:sandboxed
else
:restricted
end
:web -> :sandboxed

:web ->
:sandboxed
end
end

@@ -278,5 +315,4 @@ defmodule AgentCoordinator.ToolFilter do
tools
end
end

end

@@ -21,7 +21,8 @@ defmodule AgentCoordinator.WebSocketHandler do
:connection_info
]

@heartbeat_interval 30_000 # 30 seconds
# 30 seconds
@heartbeat_interval 30_000

@impl WebSock
def init(opts) do
@@ -108,7 +109,11 @@ defmodule AgentCoordinator.WebSocketHandler do

@impl WebSock
def terminate(reason, state) do
IO.puts(:stderr, "WebSocket connection terminated: #{state.session_id}, reason: #{inspect(reason)}")
IO.puts(
:stderr,
"WebSocket connection terminated: #{state.session_id}, reason: #{inspect(reason)}"
)

cleanup_session(state)
:ok
end
@@ -183,10 +188,7 @@ defmodule AgentCoordinator.WebSocketHandler do
}
}

updated_state = %{state |
client_context: client_context,
connection_info: connection_info
}
updated_state = %{state | client_context: client_context, connection_info: connection_info}

{:reply, {:text, Jason.encode!(response)}, updated_state}
end
@@ -246,6 +248,7 @@ defmodule AgentCoordinator.WebSocketHandler do

unexpected ->
IO.puts(:stderr, "Unexpected MCP response: #{inspect(unexpected)}")

error_response = %{
"jsonrpc" => "2.0",
"id" => Map.get(message, "id"),
@@ -264,7 +267,8 @@ defmodule AgentCoordinator.WebSocketHandler do
"id" => Map.get(message, "id"),
"error" => %{
"code" => -32601,
"message" => "Tool not available for #{state.client_context.connection_type} clients: #{tool_name}"
"message" =>
"Tool not available for #{state.client_context.connection_type} clients: #{tool_name}"
}
}

@@ -325,14 +329,15 @@ defmodule AgentCoordinator.WebSocketHandler do
# Add session tracking info to the message
params = Map.get(message, "params", %{})

enhanced_params = params
|> Map.put("_session_id", state.session_id)
|> Map.put("_transport", "websocket")
|> Map.put("_client_context", %{
connection_type: state.client_context.connection_type,
security_level: state.client_context.security_level,
session_id: state.session_id
})
enhanced_params =
params
|> Map.put("_session_id", state.session_id)
|> Map.put("_transport", "websocket")
|> Map.put("_client_context", %{
connection_type: state.client_context.connection_type,
security_level: state.client_context.security_level,
session_id: state.session_id
})

Map.put(message, "params", enhanced_params)
end

@@ -14,16 +14,19 @@ IO.puts("=" |> String.duplicate(60))
try do
TaskRegistry.start_link()
rescue
_ -> :ok # Already started
# Already started
_ -> :ok
end

try do
MCPServer.start_link()
rescue
_ -> :ok # Already started
# Already started
_ -> :ok
end

Process.sleep(1000) # Give services time to start
# Give services time to start
Process.sleep(1000)

# Test 1: Register two agents
IO.puts("\n1️⃣ Registering two test agents...")
@@ -58,23 +61,27 @@ resp1 = MCPServer.handle_mcp_request(agent1_req)
resp2 = MCPServer.handle_mcp_request(agent2_req)

# Extract agent IDs
agent1_id = case resp1 do
%{"result" => %{"content" => [%{"text" => text}]}} ->
data = Jason.decode!(text)
data["agent_id"]
_ ->
IO.puts("❌ Failed to register agent 1: #{inspect(resp1)}")
System.halt(1)
end
agent1_id =
case resp1 do
%{"result" => %{"content" => [%{"text" => text}]}} ->
data = Jason.decode!(text)
data["agent_id"]

agent2_id = case resp2 do
%{"result" => %{"content" => [%{"text" => text}]}} ->
data = Jason.decode!(text)
data["agent_id"]
_ ->
IO.puts("❌ Failed to register agent 2: #{inspect(resp2)}")
System.halt(1)
end
_ ->
IO.puts("❌ Failed to register agent 1: #{inspect(resp1)}")
System.halt(1)
end

agent2_id =
case resp2 do
%{"result" => %{"content" => [%{"text" => text}]}} ->
data = Jason.decode!(text)
data["agent_id"]

_ ->
IO.puts("❌ Failed to register agent 2: #{inspect(resp2)}")
System.halt(1)
end

IO.puts("✅ Agent 1 (Alpha Wolf): #{agent1_id}")
IO.puts("✅ Agent 2 (Beta Tiger): #{agent2_id}")
@@ -219,7 +226,7 @@ history_req1 = %{
history_resp1 = MCPServer.handle_mcp_request(history_req1)
IO.puts("Agent 1 history: #{inspect(history_resp1)}")

IO.puts("\n" <> "=" |> String.duplicate(60))
IO.puts(("\n" <> "=") |> String.duplicate(60))
IO.puts("🎉 AGENT-SPECIFIC TASK POOLS TEST COMPLETE!")
IO.puts("✅ Each agent now has their own task pool")
IO.puts("✅ No more task chaos or cross-contamination")

@@ -202,6 +202,7 @@ defmodule AgentTaskPoolTest do
%{"result" => %{"content" => [%{"text" => text}]}} ->
data = Jason.decode!(text)
data["agent_id"]

_ ->
"unknown"
end

@@ -30,24 +30,27 @@ Process.sleep(1000)
IO.puts("\n2️⃣ Creating agent-specific tasks...")

# Tasks for Agent 1
task1_agent1 = Task.new("Fix auth bug", "Debug authentication issue", %{
priority: :high,
assigned_agent: agent1.id,
metadata: %{agent_created: true}
})
task1_agent1 =
Task.new("Fix auth bug", "Debug authentication issue", %{
priority: :high,
assigned_agent: agent1.id,
metadata: %{agent_created: true}
})

task2_agent1 = Task.new("Add auth tests", "Write auth tests", %{
priority: :normal,
assigned_agent: agent1.id,
metadata: %{agent_created: true}
})
task2_agent1 =
Task.new("Add auth tests", "Write auth tests", %{
priority: :normal,
assigned_agent: agent1.id,
metadata: %{agent_created: true}
})

# Tasks for Agent 2
task1_agent2 = Task.new("Write API docs", "Document endpoints", %{
priority: :normal,
assigned_agent: agent2.id,
metadata: %{agent_created: true}
})
task1_agent2 =
Task.new("Write API docs", "Document endpoints", %{
priority: :normal,
assigned_agent: agent2.id,
metadata: %{agent_created: true}
})

# Add tasks to respective inboxes
Inbox.add_task(agent1.id, task1_agent1)
@@ -76,7 +79,12 @@ IO.puts("\n4️⃣ Checking remaining tasks...")
status1 = Inbox.get_status(agent1.id)
status2 = Inbox.get_status(agent2.id)

IO.puts("Agent 1: #{status1.pending_count} pending, current: #{if status1.current_task, do: status1.current_task.title, else: "none"}")
IO.puts("Agent 2: #{status2.pending_count} pending, current: #{if status2.current_task, do: status2.current_task.title, else: "none"}")
IO.puts(
"Agent 1: #{status1.pending_count} pending, current: #{if status1.current_task, do: status1.current_task.title, else: "none"}"
)

IO.puts(
"Agent 2: #{status2.pending_count} pending, current: #{if status2.current_task, do: status2.current_task.title, else: "none"}"
)

IO.puts("\n🎉 SUCCESS! Agent-specific task pools working!")

@@ -90,14 +90,17 @@ defmodule SessionManagementTest do
case Jason.decode(body) do
{:ok, %{"result" => _result}} ->
IO.puts(" ✅ Valid MCP response received")

{:ok, %{"error" => error}} ->
IO.puts(" ⚠️ MCP error: #{inspect(error)}")

_ ->
IO.puts(" ❌ Invalid response format")
end

{:ok, %HTTPoison.Response{status_code: status_code, body: body}} ->
IO.puts("❌ Request failed with status #{status_code}")

case Jason.decode(body) do
{:ok, parsed} -> IO.puts(" Error: #{inspect(parsed)}")
_ -> IO.puts(" Body: #{body}")

@@ -10,6 +10,7 @@ Process.sleep(1000)

# Test 1: Initialize call (system call, should work without agent_id)
IO.puts("Testing initialize call...")

init_request = %{
"jsonrpc" => "2.0",
"id" => 1,
@@ -31,6 +32,7 @@ IO.puts("Initialize response: #{inspect(init_response)}")

# Test 2: Tools/list call (system call, should work without agent_id)
IO.puts("\nTesting tools/list call...")

tools_request = %{
"jsonrpc" => "2.0",
"id" => 2,
@@ -42,6 +44,7 @@ IO.puts("Tools/list response: #{inspect(tools_response)}")

# Test 3: Register agent call (should work)
IO.puts("\nTesting register_agent call...")

register_request = %{
"jsonrpc" => "2.0",
"id" => 3,
@@ -59,7 +62,8 @@ register_response = GenServer.call(AgentCoordinator.MCPServer, {:mcp_request, re
IO.puts("Register agent response: #{inspect(register_response)}")

# Test 4: Try a call that requires agent_id (should fail without agent_id)
IO.puts("\nTesting call that requires agent_id (should fail)...")
IO.puts("Testing call that requires agent_id (should fail)...")

task_request = %{
"jsonrpc" => "2.0",
"id" => 4,
@@ -76,4 +80,4 @@ task_request = %{
task_response = GenServer.call(AgentCoordinator.MCPServer, {:mcp_request, task_request})
IO.puts("Task creation response: #{inspect(task_response)}")

IO.puts("\n✅ All tests completed!")"
IO.puts("All tests completed!")

@@ -11,14 +11,17 @@ IO.puts("Testing VS Code tool integration...")

# Check if VS Code tools are available
tools = AgentCoordinator.MCPServer.get_tools()
vscode_tools = Enum.filter(tools, fn tool ->
case Map.get(tool, "name") do
"vscode_" <> _ -> true
_ -> false
end
end)

vscode_tools =
Enum.filter(tools, fn tool ->
case Map.get(tool, "name") do
"vscode_" <> _ -> true
_ -> false
end
end)

IO.puts("Found #{length(vscode_tools)} VS Code tools:")

Enum.each(vscode_tools, fn tool ->
IO.puts(" - #{tool["name"]}")
end)