# Archia Server

A runtime engine that powers secure MCP agent deployment at scale.
## Overview
Archia Server is a high-performance runtime written in Rust that orchestrates MCP agents in production environments. It handles the complex infrastructure requirements of running AI agents with access to sensitive systems and data.
**Key Responsibilities:**
- Process lifecycle management for MCP servers
- Security isolation and credential injection
- Multi-agent orchestration and routing
- Real-time streaming and response handling
- Resource management and monitoring
## Architecture

### Security Model
Each layer provides defense in depth:
- API Layer: Authentication, rate limiting, CORS
- Agent Layer: Scoped permissions, prompt validation
- MCP Layer: Process isolation, resource limits
- System Layer: Sandboxing, audit logging
## Configuration System

Archia uses a modular configuration approach:

- **Server settings**: `config.toml` (network, local inference)
- **Agents**: individual TOML files in `~/.archia/agents/`
- **Tools**: individual TOML files in `~/.archia/tools/`
- **Prompts**: Markdown files in `~/.archia/prompts/`
## Agent Definition

Agents are configured as individual TOML files combining models, prompts, and MCP tool access:

```toml
# ~/.archia/agents/researcher.toml
name = "researcher"
model_name = "claude-sonnet-4-5-20250929"
enabled = true
description = "Expert researcher with tool access"

system_prompt = """
You are an expert researcher with access to:
- Web search for current information
- Academic papers via arxiv
- Document storage for notes
"""

# Fine-grained tool access
[mcp_tools]
web_search = null  # All tools
arxiv = ["search", "get_paper"]
filesystem = ["read_file", "write_file"]
```
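The `[mcp_tools]` semantics shown above reduce to a simple rule: a server mapped to `null` exposes all of its tools, a list exposes only the named subset, and a server not listed at all is denied. A minimal sketch of that check (a hypothetical helper, not part of Archia's API):

```python
def tool_allowed(mcp_tools: dict, server: str, tool: str) -> bool:
    """Check whether an agent may call `tool` on MCP server `server`.

    Convention from the agent file above: None (TOML null) means every
    tool on that server is allowed, a list is an explicit allow-list,
    and unlisted servers are denied entirely.
    """
    if server not in mcp_tools:
        return False            # server not granted at all
    allowed = mcp_tools[server]
    if allowed is None:
        return True             # all tools on this server
    return tool in allowed      # explicit allow-list
```

Denying unlisted servers by default is what makes the least-privilege pattern below work: granting nothing is the safe default.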
## Tool Configuration

Tools (MCP servers) are configured as TOML files:

### Local STDIO Tools

```toml
# ~/.archia/tools/user/database/tool.toml
identifier = "database"
name = "Database Tool"
version = "1.0.0"
type = "mcp"

[local]
cmd = "mcp-sqlite"
args = ["--database", "/data/production.db"]
timeout_secs = 30

[local.env]
LOG_LEVEL = "info"
MAX_CONNECTIONS = "10"
```
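A `[local]` table like this describes a child process: a command, its arguments, and extra environment variables layered over the parent environment, with stdin/stdout used as the MCP STDIO transport. A sketch of how such a spawn works in principle (illustrative only; Archia's own process manager also handles timeouts, restarts, and isolation):

```python
import os
import subprocess

def spawn_local_tool(cfg: dict) -> subprocess.Popen:
    """Spawn a local STDIO MCP server from a parsed [local] table.

    Merges the tool's [local.env] entries over the parent environment
    and wires stdin/stdout as pipes for the STDIO transport.
    """
    local = cfg["local"]
    env = {**os.environ, **local.get("env", {})}
    return subprocess.Popen(
        [local["cmd"], *local.get("args", [])],
        env=env,
        stdin=subprocess.PIPE,   # requests flow in via stdin
        stdout=subprocess.PIPE,  # responses flow out via stdout
    )
```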
### Remote HTTP Tools

```toml
# ~/.archia/tools/user/cloud-api/tool.toml
identifier = "cloud-api"
name = "Cloud API"
version = "1.0.0"
type = "mcp"

[remote]
url = "https://api.example.com/mcp"
transport = "streaming_http"
auth_type = "bearer"
auth_token = "${API_TOKEN}"
timeout_secs = 60
```
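The `"${API_TOKEN}"` value indicates environment-variable interpolation, which keeps secrets out of the config file itself. A sketch of how such expansion typically works (illustrative; the exact placeholder syntax Archia accepts may differ):

```python
import os
import re

# Matches ${VAR_NAME} placeholders in config string values.
_PLACEHOLDER = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")

def expand_env(value: str) -> str:
    """Expand ${VAR} placeholders from the process environment,
    raising KeyError if a referenced variable is unset (failing
    fast beats silently sending an empty auth token)."""
    def repl(match):
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"environment variable {name} is not set")
        return os.environ[name]
    return _PLACEHOLDER.sub(repl, value)
```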
## Dynamic MCP Management

MCP servers are started on-demand and managed throughout their lifecycle:

```
Idle → Starting → Ready → Active → Shutting Down → Terminated
          ↑                  ↓
          └────── Error ◀────┘
```
**Features:**
- Lazy initialization to save resources
- Automatic restart on failure
- Graceful shutdown on idle timeout
- Health checking and recovery
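The lifecycle above can be expressed as a small state machine. This sketch encodes one plausible transition table consistent with the diagram and feature list (Error returning to Starting models the automatic-restart behavior); the exact set of legal transitions is an assumption:

```python
# Allowed lifecycle transitions for a managed MCP server process.
TRANSITIONS = {
    "Idle": {"Starting"},
    "Starting": {"Ready", "Error"},
    "Ready": {"Active", "Shutting Down", "Error"},
    "Active": {"Ready", "Shutting Down", "Error"},
    "Error": {"Starting", "Terminated"},     # restart or give up
    "Shutting Down": {"Terminated"},
    "Terminated": set(),                     # terminal state
}

def transition(state: str, new_state: str) -> str:
    """Validate and perform a single lifecycle transition."""
    if new_state not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {new_state}")
    return new_state
```

Keeping the table explicit makes invalid moves (say, `Idle` straight to `Active`) a hard error instead of a silent inconsistency.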
## Security Best Practices

### 1. Principle of Least Privilege

Give agents only the tools they require:

```toml
# ~/.archia/agents/readonly-analyst.toml
name = "readonly-analyst"
model_name = "claude-sonnet-4-5-20250929"
enabled = true

[mcp_tools]
database = ["query", "list_tables"]  # Read-only operations only
```
### 2. Secure Secrets Management

Use environment variables or a secret manager rather than hard-coding credentials:

```bash
export DATABASE_PASSWORD=$(vault read secret/db/password)
archiad config.toml
```
### 3. Network Isolation

Bind to localhost for internal-only deployments:

```toml
[network]
host = "127.0.0.1"  # Not 0.0.0.0
port = 8080
```
## Advanced Topics

### Multi-Region Deployment

Deploy globally with regional routing:

```
        Global Load Balancer
                │
        ┌───────┴───────┐
        ▼               ▼
     US-East         EU-West
      Archia          Archia
        │               │
       MCPs            MCPs
```
## Troubleshooting

### Common Issues

**MCP Server Won’t Start**
- Check executable path and permissions
- Verify environment variables
- Review logs
**High Memory Usage**

- Set resource limits per MCP server
- Enable idle timeout for cleanup
- Monitor memory usage over time
**Slow Response Times**
- Check model latency
- Optimize MCP server performance
- Enable response streaming
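Streamed responses are commonly delivered as server-sent events, where each chunk arrives on a `data:` line. As a hedged sketch of consuming such a stream (the SSE framing and `[DONE]` sentinel are common conventions; Archia's actual wire format may differ):

```python
def iter_sse_data(lines):
    """Yield the payload of each `data:` line from a server-sent-event
    stream, skipping blank keep-alive lines and stopping payloads at a
    `[DONE]` end-of-stream sentinel."""
    for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            payload = line[len("data:"):].strip()
            if payload != "[DONE]":
                yield payload
```

Consuming chunks as they arrive lets a client render partial agent output instead of waiting for the full response.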
## Migration Guide

Coming soon.
## What’s Next?

- **Interactive API Docs (Swagger)**: interactive OpenAPI documentation
- **Configuration**: server and agent configuration
- **Agent Configuration**: complete agent setup guide
- **Tool Configuration**: MCP tool setup guide
- **API Reference**: complete REST API documentation
Ready to deploy? Archia Server provides the production foundation your AI agents need.