Custom SDK and MCP for Undocumented API
A critical enterprise API had fragmented documentation, no official SDK, and complex authentication — making it impossible for AI tools to interact with the system reliably.
The Challenge
The organization’s facilities management platform was central to daily operations — thousands of work orders, hundreds of locations, complex vendor relationships. But its API was a mess:
- Documentation was scattered across three separate API approaches (base, query, and command) with no unified guide
- Tenant-specific endpoint discovery was completely undocumented — each customer’s API lived at a dynamically resolved URL
- OAuth token management and refresh logic were implicit, never specified
- The command payload structures for work order lifecycle operations weren’t clearly defined
- Concurrency handling (required for safe updates) was buried in implementation details
No one could build reliable integrations against this API without weeks of reverse engineering. And without reliable integrations, AI tools couldn’t access or operate on facilities data.
The Approach
I solved this in two layers: a Python SDK that tamed the API, and an MCP server that exposed it to AI agents.
Layer 1: Python SDK
The SDK provides what the API documentation should have:
- Automatic endpoint discovery — queries the API locator service to resolve tenant-specific URLs, with region-based fallbacks
- Transparent OAuth token management — handles token refresh with a 60-second buffer before expiry, so callers never deal with auth state
- Fluent query builder — chainable `.select()`, `.where()`, `.order_by()`, `.limit()` methods that hide the complex query expression JSON the API actually expects
- High-level resource managers — `client.work_orders.create()`, `client.customers.update()`, `client.locations.get_with_attributes()` instead of raw HTTP calls
- Automatic concurrency handling — retries on optimistic locking conflicts, which the API uses for all update operations
- Equipment attribute resolution — a convenience method that merges location and asset attribute data, performing the join that the API doesn’t expose natively
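The fluent builder pattern described above can be sketched roughly as follows. Every class, method, and field name here is an illustrative stand-in, not the SDK's actual API:

```python
# Hypothetical sketch of a chainable query builder that compiles fluent
# calls into the nested query-expression JSON the API expects.
class QueryBuilder:
    def __init__(self, entity):
        self._entity = entity
        self._select = []
        self._filters = []
        self._order = []
        self._limit = None

    def select(self, *fields):
        self._select.extend(fields)
        return self  # returning self is what makes the calls chainable

    def where(self, field, op, value):
        self._filters.append({"field": field, "op": op, "value": value})
        return self

    def order_by(self, field, desc=False):
        self._order.append({"field": field, "direction": "desc" if desc else "asc"})
        return self

    def limit(self, n):
        self._limit = n
        return self

    def to_payload(self):
        # Collapse the fluent calls into the JSON payload sent to the API.
        return {
            "entity": self._entity,
            "select": self._select,
            "where": {"and": self._filters},
            "orderBy": self._order,
            "limit": self._limit,
        }

payload = (
    QueryBuilder("WorkOrder")
    .select("Id", "Status")
    .where("Status", "eq", "Open")
    .order_by("CreatedDate", desc=True)
    .limit(50)
    .to_payload()
)
```

The caller never sees the nested expression JSON; it only composes method calls, which is the whole point of the builder layer.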
The SDK also includes a CLI tool with multi-profile config support, so ops teams can query the system directly from the terminal.
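The refresh-with-buffer rule mentioned above can be sketched like this: treat a token as expired 60 seconds early, so a request never goes out with a token that dies mid-flight. `TokenManager` and `fetch_token` are hypothetical stand-ins, not the SDK's real identifiers:

```python
import time

REFRESH_BUFFER_SECONDS = 60  # refresh this many seconds before actual expiry

class TokenManager:
    """Illustrative sketch, assuming fetch_token returns (access_token, expires_in)."""

    def __init__(self, fetch_token, clock=time.monotonic):
        self._fetch_token = fetch_token  # stand-in for the real OAuth call
        self._clock = clock              # injectable clock for testing
        self._token = None
        self._expires_at = 0.0

    def get_token(self):
        # Refresh when missing or within the 60-second buffer of expiry.
        if self._token is None or self._clock() >= self._expires_at - REFRESH_BUFFER_SECONDS:
            token, expires_in = self._fetch_token()
            self._token = token
            self._expires_at = self._clock() + expires_in
        return self._token
```

Callers just ask for `get_token()` before each request and never touch auth state directly.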
Layer 2: MCP Server
Built with FastMCP on top of the SDK, the MCP server exposes 23 tools covering:
- Equipment lookup — find equipment by store, get detailed specs (make, model, serial), or scan NFC/barcode tags
- Full work order lifecycle — create, assign, pick up, start, pause, complete, hold, reopen, cancel — with the state machine logic encoded in the tool definitions
- Filtered queries — list work orders by status, customer, brand, or assignee
- Entity management — create and update customers, contacts, employees, and locations
- Generic query interface — for ad-hoc queries across any entity type with filtering, sorting, and pagination
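The lifecycle state machine encoded in the tool definitions can be sketched as a transition table. The states and transitions below are assumptions inferred from the tool names, not the platform's actual model:

```python
# Hypothetical work-order state machine: action -> {current_state: next_state}.
TRANSITIONS = {
    "assign":   {"new": "assigned"},
    "pick_up":  {"assigned": "picked_up"},
    "start":    {"picked_up": "in_progress", "paused": "in_progress"},
    "pause":    {"in_progress": "paused"},
    "hold":     {"in_progress": "on_hold", "assigned": "on_hold"},
    "complete": {"in_progress": "completed"},
    "reopen":   {"completed": "in_progress"},
    "cancel":   {"new": "cancelled", "assigned": "cancelled", "on_hold": "cancelled"},
}

def apply_action(action, state):
    """Return the next state, or raise if the action is illegal from this state."""
    allowed = TRANSITIONS.get(action, {})
    if state not in allowed:
        raise ValueError(f"cannot {action} a work order in state {state!r}")
    return allowed[state]
```

Encoding the table in the tool layer means an agent physically cannot request an illegal transition, rather than relying on the API to reject it.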
The MCP server also provides 6 prompt templates (triage, diagnosis, status updates, troubleshooting) and 8 resource URIs for direct lookups.
A critical design constraint shaped the entire architecture: the MCP server needed to work with a voice agent platform that enforces OpenAI’s strict mode — meaning every tool parameter must be required, with no optional fields. This required a custom compatibility transform layer that adjusts tool schemas at registration time without changing the underlying implementation.
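A minimal sketch of such a transform, assuming tool parameter schemas are plain JSON Schema dicts: since strict mode requires every property to appear in `required`, formerly optional parameters are rewritten as required-but-nullable instead of being dropped. The function name and exact rewrite rules are illustrative:

```python
# Hypothetical strict-mode compatibility transform for a tool's JSON Schema.
def to_strict(schema):
    props = dict(schema.get("properties", {}))
    required = set(schema.get("required", []))
    for name, prop in props.items():
        if name not in required and isinstance(prop.get("type"), str):
            # Formerly optional: keep the field required, but allow null
            # so callers can still express "not provided".
            props[name] = {**prop, "type": [prop["type"], "null"]}
    return {
        **schema,
        "properties": props,
        "required": sorted(props),      # strict mode: every property is required
        "additionalProperties": False,  # strict mode also demands this
    }
```

Because the transform runs at registration time, the underlying tool implementations keep their natural optional parameters; only the advertised schema changes.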
The Results
The SDK eliminated weeks of integration work for every new project that needs to talk to the facilities platform. What used to require reverse-engineering the API from scratch is now a pip install and a few lines of configuration.
The MCP server powers multiple downstream integrations: a voice agent for inbound facilities calls, a chat interface embedded in the organization’s BI platform, and internal automation workflows. Each consumer uses the same 23 tools through the same authentication layer, with per-client bearer tokens for access control.
The two-layer architecture — SDK then MCP — turned out to be the right pattern. The SDK handles the API’s quirks (endpoint discovery, token refresh, concurrency). The MCP server handles the AI integration concerns (tool schemas, prompt templates, authentication). Neither layer knows about the other’s problems.
Services: MCP Server & Integration Development · AI Agent Development