MCP Protocol Integration
Nefesh provides a native Model Context Protocol (MCP) server that gives AI agents direct access to human state data. Your AI agent can check a user's stress level, send biometric signals, and query psychological trigger profiles — all through standard MCP tool calls.
Server URL
https://mcp.nefesh.ai/mcp
Transport
Streamable HTTP — no local installation, no Docker, no stdio. Just add the URL to your agent's config and it works.
Authentication
Option A: Let your agent get a key (fastest). Connect without a key first; the two self-provisioning tools, request_api_key and check_api_key_status, work without authentication:
{
  "mcpServers": {
    "nefesh": {
      "url": "https://mcp.nefesh.ai/mcp"
    }
  }
}
Then ask your agent: "Get me a Nefesh API key for [email protected]" — the agent calls request_api_key, you click one email link, done. After that, add the key to your config for future sessions.
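Under the hood, this self-provisioning flow is a standard MCP tools/call request over the streamable HTTP transport. A minimal sketch of the JSON-RPC body the agent sends, assuming the tool takes an email argument (the exact parameter name may differ; check the MCP Tools Reference):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "request_api_key",
    "arguments": { "email": "[email protected]" }
  }
}
```

The agent then polls check_api_key_status until the email link has been clicked.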
Option B: Sign up first. Get a key at nefesh.ai/signup, then add it to the config:
{
  "mcpServers": {
    "nefesh": {
      "url": "https://mcp.nefesh.ai/mcp",
      "headers": {
        "X-Nefesh-Key": "YOUR_API_KEY"
      }
    }
  }
}
Supported Agents
Works with Claude Desktop, Cursor, Windsurf, VS Code (Copilot), Cline, Kiro, OpenClaw, JetBrains IDEs, Zed, ChatGPT Desktop, Gemini CLI, Goose CLI, Augment, Replit, Continue.dev, LibreChat, Roo Code, and any other MCP-compatible client.
Available Tools
| Tool | Auth | Description |
|---|---|---|
| request_api_key | No | Request a free API key by email — your agent handles the flow |
| check_api_key_status | No | Poll for API key activation after email verification |
| get_human_state | Yes | Get current stress state, score (0-100), confidence, and LLM behavior recommendation |
| ingest | Yes | Send biometric signals — heart rate, voice, face, text, and 50+ more fields |
| get_trigger_memory | Yes | Retrieve psychological trigger profile — active vs. resolved stress topics |
| get_session_history | Yes | Get chronological state history for trend analysis |
See MCP Tools Reference for detailed parameters and examples.
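For illustration, an authenticated call such as get_human_state uses the same tools/call envelope. The session_id argument name below is an assumption, not the documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "get_human_state",
    "arguments": { "session_id": "test-123" }
  }
}
```

A result would carry fields matching the table above (state, score, confidence, recommendation); for example, a hypothetical payload:

```json
{
  "state": "stressed",
  "score": 78,
  "confidence": 0.91,
  "recommendation": "Use a calm tone and keep responses short"
}
```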
CLI (@nefesh/cli)
Official command-line interface. Install with npm install -g @nefesh/cli and interact with the API directly from your terminal.
nefesh auth login --key YOUR_KEY
nefesh ingest --session test --heart-rate 72 --tone calm
nefesh state test --json
nefesh simulate --scenario stressed
nefesh setup cursor
Every command supports --json for machine-readable output. AI agents in the terminal can use the CLI with 10-32x lower token cost than MCP.
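The exact --json output schema is not shown here; as an illustrative sketch only (all field names assumed), nefesh state test --json might print something like:

```json
{
  "session": "test",
  "state": "calm",
  "score": 22,
  "confidence": 0.87,
  "recommendation": "No adaptation needed"
}
```

Structured output like this pipes cleanly into jq or any other JSON-aware tool in agent scripts.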
GitHub: nefesh-ai/nefesh-cli
A2A Integration (Agent-to-Agent Protocol v1.0)
Nefesh is also available as an A2A-compatible agent. While MCP handles tool calling (your agent calls Nefesh directly), A2A enables agent collaboration: other AI agents can communicate with Nefesh as a peer.
Agent Card:
https://mcp.nefesh.ai/.well-known/agent-card.json
A2A Endpoint:
POST https://mcp.nefesh.ai/a2a
Same authentication as MCP (X-Nefesh-Key header). Supports message/send, message/sendStream (SSE), tasks/get, and tasks/cancel.
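As a sketch, a message/send call is a JSON-RPC POST to the A2A endpoint with the X-Nefesh-Key header set. The body below follows the general A2A message shape; exact field names may vary with your A2A client library:

```json
{
  "jsonrpc": "2.0",
  "id": "1",
  "method": "message/send",
  "params": {
    "message": {
      "role": "user",
      "messageId": "msg-001",
      "parts": [
        { "kind": "text", "text": "What is the current stress state for session test-123?" }
      ]
    }
  }
}
```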
Quick Test
After adding the config, ask your AI agent:
"What tools do you have from Nefesh?"
It should list the 6 tools above. Then try:
"Check the human state for session test-123"
Cognitive Compute Router (Gateway)
Adapts any LLM based on real-time biometric state. Three integration modes: OpenAI-compatible (/v1/chat/completions), Anthropic native passthrough (/v1/messages), and Unified Anthropic format for any backend. 11 LLM providers supported.
Change your LLM base URL to gateway.nefesh.ai; no other code changes are needed. See Gateway Documentation for full details.
curl https://gateway.nefesh.ai/v1/chat/completions \
-H "Content-Type: application/json" \
-H "X-Nefesh-Key: YOUR_KEY" \
-H "X-Nefesh-Subject: usr_demo" \
-H "X-LLM-Key: your-llm-key" \
-d '{"model":"gpt-4o","messages":[{"role":"user","content":"Hello"}]}'
Response includes X-Nefesh-Adapted: true and X-Nefesh-State headers.
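For example, the headers of an adapted response might look like the following (illustrative only; the value format of X-Nefesh-State is an assumption):

```http
HTTP/1.1 200 OK
Content-Type: application/json
X-Nefesh-Adapted: true
X-Nefesh-State: stressed
```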
Open source on GitHub
Browse the source, file issues, or contribute to the project.
github.com/nefesh-ai/nefesh-mcp-server →