Remember when every phone had a different charger? Proprietary connectors, adapters everywhere, nothing worked with anything else. Then USB-C showed up and the entire industry standardized overnight.
That's what Model Context Protocol (MCP) is doing for AI — and ServiceNow just plugged in.
If you're a ServiceNow practitioner who's been watching the AI agent explosion from the sidelines, this is your on-ramp. MCP is the single most important integration standard to hit enterprise AI this year, and it's about to change how every tool, platform, and LLM talks to your ServiceNow instance.
What Is MCP (And Why Should You Care)?
Model Context Protocol is an open standard created by Anthropic (the company behind Claude) that defines how AI models connect to external tools and data sources. Think of it as a universal adapter between any AI agent and any enterprise system.
Before MCP, every AI integration was bespoke. Want Claude to query your CMDB? Custom API code. Want ChatGPT to create incidents? Different custom API code. Want Gemini to pull KB articles? Yet another integration. Every combination of AI model × enterprise tool required its own plumbing.
MCP eliminates that. Build one MCP server for ServiceNow, and every MCP-compatible AI client can use it — Claude Desktop, VS Code Copilot, Cursor, custom agents, whatever comes next. One integration. Universal access.
That's the USB-C moment.
How MCP Actually Works
The architecture is dead simple, which is why it's winning:
AI Client              MCP Server                 ServiceNow
(Claude,     <-->     (translates AI    <-->     Instance
 Copilot,              requests into
 Cursor)               SN API calls)

Protocol: JSON-RPC over stdio or HTTP
Backend:  Table API, Stats API, Attachments
The MCP server exposes tools — discrete actions that an AI model can invoke. Each tool has a name, a description, and a typed input schema. The AI model reads the tool descriptions, decides which ones to call based on the user's request, and sends structured JSON-RPC messages.
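To make the name/description/schema trio concrete, here is a minimal sketch of what a list_records tool definition could look like. The overall shape (name, description, inputSchema) follows the MCP tool format; the specific parameters shown are illustrative, not the exact schema any particular server publishes.

```python
import json

# Hypothetical MCP tool definition for a ServiceNow list_records tool.
# The name/description/inputSchema structure mirrors the MCP tool format;
# the individual parameters here are illustrative.
list_records_tool = {
    "name": "list_records",
    "description": "Query any ServiceNow table with an encoded query string",
    "inputSchema": {
        "type": "object",
        "properties": {
            "table": {"type": "string", "description": "Table name, e.g. incident"},
            "query": {"type": "string", "description": "ServiceNow encoded query"},
            "limit": {"type": "integer", "description": "Max records to return"},
        },
        "required": ["table"],
    },
}

print(json.dumps(list_records_tool, indent=2))
```

The AI model never sees ServiceNow at all — it only sees this description and schema, which is why well-written tool descriptions matter so much.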
For ServiceNow, that means tools like:
- list_records — Query any table with filters, pagination, field selection
- create_record — Create records on any table
- update_record — Update existing records by sys_id
- get_record — Fetch a single record with full field data
- aggregate_stats — Run COUNT, AVG, MIN, MAX, SUM across any table
- get_schema — Introspect table schemas, field types, and choices
The AI doesn't need to know ServiceNow's REST API syntax. It just calls list_records with {"table": "incident", "query": "priority=1^state=2"} and gets back structured data.
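On the wire, that call is a plain JSON-RPC 2.0 request. A rough sketch of what the client sends (tools/call is the MCP method for invoking a server-side tool; the argument values are the ones from the example above):

```python
import json

# JSON-RPC 2.0 envelope for an MCP tool invocation.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "list_records",
        "arguments": {"table": "incident", "query": "priority=1^state=2"},
    },
}

# Over the stdio transport, this is written to the server process's
# stdin as a single line of JSON; the response comes back on stdout.
wire = json.dumps(request)
print(wire)
```

The server translates this into a Table API call, and the model gets back a structured result it can reason over.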
ServiceNow's Native MCP: sn_mcp_server
ServiceNow shipped native MCP support through the sn_mcp_server plugin, available on Zurich Patch 4+ and Yokohama Patch 11+. Here's what you need to know:
What It Does
- Exposes Now Assist Skills and Custom Skills (via NASK) as MCP tools
- Supports Streamable HTTP and SSE transport (remote server model)
- OAuth authentication required
- Endpoint:
https://<instance>/sncapps/mcp-server/mcp/<name>
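Since the native server uses a remote (Streamable HTTP) transport with OAuth, a client talks to it over plain HTTPS. A hedged sketch of what such a request might look like — the URL shape comes from the endpoint above, but the server name ("my_server") and token are placeholders, and the exact headers your MCP client sends may differ:

```python
import json
import urllib.request

# Illustrative request to a native ServiceNow MCP endpoint.
instance = "yourinstance.service-now.com"
endpoint = f"https://{instance}/sncapps/mcp-server/mcp/my_server"

body = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",  # ask the server which tools it exposes
}).encode()

req = urllib.request.Request(
    endpoint,
    data=body,
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",  # Streamable HTTP
        "Authorization": "Bearer <oauth-access-token>",   # OAuth is required
    },
    method="POST",
)
# urllib.request.urlopen(req) would actually send it; not executed here.
print(req.full_url)
```

In practice your MCP client handles all of this for you once OAuth is configured — the point is that there's no magic, just JSON over HTTP.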
What It Requires
This is where it gets real:
- Now Assist SKU — Pro Plus or Enterprise Plus subscription
- AI Agents store app v6.x+
- MCP Client store app v1.1+
- Tools consume action-based assists (skills as tools = assists + 1)
- Not available on GCC (Government Community Cloud)
Translation: if your org is already invested in Now Assist, native MCP is the path of least resistance. If you're not? That's where the open-source option comes in.
The licensing implications here are significant — and they tie directly into the larger shift away from seat-based pricing that's reshaping the ServiceNow ecosystem.
The Open-Source Alternative: @onlyflows/servicenow-mcp
Full disclosure: we built this. @onlyflows/servicenow-mcp is a free, open-source MCP server for ServiceNow that we published on npm. Here's why:
Not every org has Now Assist licensing. A lot of ServiceNow customers — especially in the mid-market — are running Xanadu or Washington without AI add-ons. They shouldn't be locked out of the MCP ecosystem because of a licensing gate.
How It Works
One command to install, zero configuration beyond your instance credentials:
npx @onlyflows/servicenow-mcp

It runs as a stdio MCP server — meaning it works natively with Claude Desktop, VS Code, Cursor, and any MCP-compatible client without deploying a web server or configuring OAuth flows.
Add it to your Claude Desktop config:
{
"mcpServers": {
"servicenow": {
"command": "npx",
"args": ["@onlyflows/servicenow-mcp"],
"env": {
"SN_INSTANCE": "yourinstance.service-now.com",
"SN_USERNAME": "your.user",
"SN_PASSWORD": "your-password"
}
}
}
}

That's it. Open Claude Desktop and start asking questions about your ServiceNow data.
What You Get
- Full CRUD on any ServiceNow table (Table API)
- Aggregate analytics — COUNT, AVG, MIN, MAX, SUM via Stats API
- Schema introspection — discover tables, fields, types, and choice lists
- Attachment management — list, read, and upload attachments
- Zero dependencies — native fetch, no external HTTP libraries
- Runs locally — no hosted service or third-party SaaS account required
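The aggregate tooling maps onto ServiceNow's standard Stats API. As a sketch of the kind of URL such a call constructs — the sysparm_count, sysparm_group_by, and sysparm_query parameters are the standard Stats API ones, while the surrounding structure is assumed for illustration:

```python
from urllib.parse import urlencode

# Build a Stats API URL counting P1 incidents grouped by assignment group.
instance = "yourinstance.service-now.com"
params = urlencode({
    "sysparm_count": "true",
    "sysparm_group_by": "assignment_group",
    "sysparm_query": "priority=1",
})
url = f"https://{instance}/api/now/stats/incident?{params}"
print(url)
```

This is the same endpoint you'd hit from a script or Postman — the MCP server just lets the AI construct it from a natural-language question.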
Native vs. Open Source: When to Use Each
🏢 Native (sn_mcp_server)
- Cost: Requires Now Assist SKU (Pro Plus or Enterprise Plus)
- Transport: HTTP/SSE (remote server)
- Auth: OAuth 2.0
- Tools: Now Assist Skills + NASK custom skills
- Best for: Orgs with Now Assist already deployed
- Setup time: ~30 min (OAuth configuration + store app installs)
🔓 Open Source (@onlyflows/servicenow-mcp)
- Cost: Free (MIT license)
- Transport: stdio (local process)
- Auth: Basic auth / API key
- Tools: Table API + Stats API + Schema introspection + Attachments
- Best for: Everyone else — local dev, POCs, mid-market orgs without Now Assist
- Setup time: ~2 min (npx + environment variables)
They're not mutually exclusive. You might use the native MCP server for production agent workflows that leverage Now Assist skills, and the open-source version for developer tooling, ad-hoc queries, and local prototyping.
Real-World Use Cases
1. Incident Triage Copilot
Point Claude at your instance via MCP. Ask: "Show me all P1 incidents opened in the last 24 hours with their assignment groups and short descriptions." Get a structured answer in seconds — no GlideRecord scripting, no report building.
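Behind the scenes, that prompt collapses into a single tool call. A sketch of what the model might send — the gs.hoursAgoStart(24) construct is standard ServiceNow encoded-query syntax for a relative time window, and the argument names assume the tool set described earlier:

```python
import json

# "P1 incidents opened in the last 24 hours" as one list_records call.
call = {
    "name": "list_records",
    "arguments": {
        "table": "incident",
        "query": "priority=1^sys_created_on>=javascript:gs.hoursAgoStart(24)",
        "fields": "number,short_description,assignment_group",
    },
}
print(json.dumps(call))
```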
2. CMDB Data Quality Audit
Use the aggregate tools: "How many CIs in the cmdb_ci_server table have no assigned support group? Break it down by environment." Instant analytics without writing a scheduled job or a Performance Analytics widget.
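That question translates into a single aggregate call. As an illustrative sketch — ISEMPTY is the standard encoded-query operator for unset fields, while the tool argument names and the environment field are assumptions for the example:

```python
import json

# "Servers with no support group, broken down by environment"
# expressed as one aggregate_stats tool call.
call = {
    "name": "aggregate_stats",
    "arguments": {
        "table": "cmdb_ci_server",
        "query": "support_groupISEMPTY",
        "group_by": "environment",
        "agg": "COUNT",
    },
}
print(json.dumps(call))
```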
3. Knowledge-Powered Resolution
Combine MCP with RAG: feed your KB articles into the AI context, then let it cross-reference live incident data. "Based on similar past incidents, what's the likely root cause for INC0012345?"
4. Change Risk Assessment
Query change requests, related CIs, and historical outage data in a single conversation. Let the AI synthesize risk factors that would take a human analyst hours to compile.
Getting Started in 10 Minutes
- Install Node.js 18+ if you don't have it
- Create a ServiceNow service account with appropriate table ACLs (read-only is fine for starters)
- Run the server:
npx @onlyflows/servicenow-mcp
- Configure your AI client (Claude Desktop, VS Code, etc.) with the MCP server config above
- Start querying: "List all active incidents assigned to my group"
The entire setup takes less time than writing a single GlideRecord script.
Where This Is Heading
MCP is gaining momentum fast. IBM, AWS, Vercel, and Google have all published MCP implementations or integrations in the last month. The protocol is being submitted for formal standardization.
For ServiceNow, this means:
- More AI clients will "just work" with your instance out of the box
- Custom integrations get simpler — one MCP server replaces dozens of bespoke connectors
- The ecosystem grows — community-built MCP tools will emerge for specialized use cases (ITOM discovery, SecOps, HR Service Delivery)
The practitioners who understand MCP now will be designing the agent architectures that enterprises adopt in 2027. This is ground-floor stuff.
The Bottom Line
MCP is the standardization moment that enterprise AI has been waiting for. ServiceNow adopting it — both natively and through open-source tooling — means the platform is ready for the agent era.
Whether you use ServiceNow's native sn_mcp_server or spin up @onlyflows/servicenow-mcp in two minutes, the important thing is to start building. The USB-C port is here. Time to plug in.
Brandon Wilson is a ServiceNow Certified Technical Architect (CTA) and founder of OnlyFlows.tech. The @onlyflows/servicenow-mcp package is free and open source on npm and GitHub.