EY.ai Cost Oversight MCP Server
Model Context Protocol (MCP) Server built with Python & FastAPI
This is a Model Context Protocol (MCP) server that provides Fabric workspace cost oversight capabilities for AI agents and LLM applications.
Available MCP Tools (1)
This MCP server exposes the following tools through the Model Context Protocol. These tools can be accessed by any MCP-compatible client, such as Claude Desktop, or tested interactively through the MCP Inspector.
get_workspace_cost
Get cost data for a specific workspace.

Args:
    workspace_id: The workspace ID (e.g., '68b9c0d7e5650d15879d6be7')
    granularity: Cost granularity, either 'yearly' for annual totals or 'monthly-01' for a single month
    bearer_token: Optional bearer token for authentication. Falls back to the FABRIC_BEARER_TOKEN env var.

Returns:
    JSON string with workspace cost data
# Example parameters
{
"workspace_id": "68b9c0d7e5650d15879d6be7",
"granularity": "yearly"
}
Parameters:
• workspace_id (string, required)
• granularity (string, optional)
• bearer_token (string, optional)
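Under the Model Context Protocol, a client invokes this tool with a standard JSON-RPC tools/call request to the /mcp endpoint. A representative request body, reusing the example values above, looks like this:
# Example JSON-RPC tools/call request sent by an MCP client
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_workspace_cost",
    "arguments": {
      "workspace_id": "68b9c0d7e5650d15879d6be7",
      "granularity": "yearly"
    }
  }
}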
MCP Inspector Testing
Use the MCP Inspector to connect to the streamable MCP endpoint and test tools interactively. The /mcp endpoint is live and ready to accept connections!
Launch MCP Inspector
The MCP Inspector provides a visual interface to test tool calls and inspect server responses. Simply run this command to connect to the streamable HTTP endpoint.
# Launch MCP Inspector (connects to http://localhost:/mcp)
npx @modelcontextprotocol/inspector \
http://localhost:/mcp
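The Inspector also offers a CLI mode for scripted, one-shot tool calls. The command below is a sketch: the local port is a placeholder, and the exact flags may vary between Inspector versions.
# One-shot tool call via the Inspector CLI (port and flags are assumptions)
npx @modelcontextprotocol/inspector --cli http://localhost:<port>/mcp \
  --method tools/call \
  --tool-name get_workspace_cost \
  --tool-arg workspace_id=68b9c0d7e5650d15879d6be7 \
  --tool-arg granularity=yearly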
Quick Start
Get started with this Model Context Protocol (MCP) server locally or with Docker. The server provides both a web app and streamable MCP endpoint.
Local Development (Python)
# Install dependencies
make install
# Start server (web app + MCP endpoint on port )
make run
# Open web app
open http://python-mcp-pablo-poc-dev-hvagf.matts-capability-lab.002.eastus2.containers.sbp.eyclienthub.com/
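The bearer_token parameter of get_workspace_cost falls back to the FABRIC_BEARER_TOKEN environment variable, so exporting it before starting the server is a reasonable setup step. This sketch assumes the server reads the variable from its environment; the token value is a placeholder.
# Optional: provide the fallback token used by get_workspace_cost
export FABRIC_BEARER_TOKEN="<your-fabric-bearer-token>"
make run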
Docker Deployment
# Build Docker image
make docker-build
# Run container (maps port :8080)
make docker-run
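For environments without make, a rough equivalent of these targets is sketched below. The image name and host port are assumptions; the container port 8080 follows the mapping noted above, and FABRIC_BEARER_TOKEN is passed through as the optional authentication fallback.
# Rough equivalent of the make targets (image name and host port are assumptions)
docker build -t eyai-cost-oversight-mcp .
docker run -p 8080:8080 \
  -e FABRIC_BEARER_TOKEN="<your-fabric-bearer-token>" \
  eyai-cost-oversight-mcp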