Connect AI Agents to NetSuite & SAP Concur via MCP Servers
Learn how to expose Oracle NetSuite and SAP Concur data to AI agents via MCP servers. Handle legacy auth, rate limits, and complex schemas safely without building custom connectors.
If you are building a B2B SaaS product that needs AI agents to read, write, or reason over financial data inside Oracle NetSuite or SAP Concur, you are likely staring down a massive technical debt problem. ERP systems are the last mile for agentic workflows that touch real money—invoices, expense reports, purchase orders, journal entries—and the APIs behind these systems are some of the most painful in enterprise software.
Your product team wants to ship an AI agent that can autonomously reconcile expenses, draft purchase orders, or answer complex financial queries. To do that, the Large Language Model (LLM) needs read and write access to your customers' ERP data. The architectural reality of building this connectivity from scratch is punishing. Writing custom API connectors for legacy enterprise systems means dealing with fragmented API surfaces, brutal concurrency limits, and antiquated authentication models.
The industry is moving rapidly toward standardizing these connections. The Model Context Protocol (MCP) has emerged as the definitive standard for exposing external tools to AI models. Instead of building point-to-point LangChain connectors for every new LLM, engineering teams are deploying MCP servers that any compatible agent can consume.
This guide breaks down the technical architecture required to expose complex ERP data to AI agents via MCP servers. We will examine the specific engineering challenges of NetSuite and SAP Concur, explore how dynamic tool generation works, and detail the exact patterns required to handle enterprise rate limits without breaking agent reasoning.
The Rise of Agentic ERP Workflows
The demand for AI agents that can read, write, and reason over financial data has accelerated dramatically. The financial data locked inside ERPs is exactly where AI agents can deliver the most measurable business impact. Automated reconciliation, intelligent expense parsing, and natural-language financial reporting are not demos anymore—they are production use cases.
The data backing this shift is definitive. Gartner forecasts that adoption of AI-enabled cloud ERP tools will progress rapidly, with 62% of cloud ERP spending on AI-enabled solutions by 2027, up from 14% in 2024. This investment is driven by hard ROI metrics. Gartner predicts that finance organizations using cloud ERP applications with embedded AI assistants will see a 30% faster financial close by 2028.
On the software side, Forrester predicts that 30% of enterprise app vendors will launch their own MCP servers in 2026, and it expects half of enterprise resource planning vendors to introduce autonomous governance modules within their suites.
The direction is clear: ERPs are becoming API-first platforms for AI. For B2B SaaS companies, this means your customers expect your platform's AI features to interact directly with their general ledgers. Primary use cases include:
- Automated Order-to-Cash: An AI agent monitors an external e-commerce platform. When an order is placed, it automatically queries the ERP for the correct customer record, generates an invoice, and logs the payment, keeping the ledger perfectly synced.
- Intelligent Expense Parsing: A Slackbot ingests a receipt, uses a vision model to extract the total and vendor, queries SAP Concur or NetSuite to find the matching account category, and creates an expense record with the attached image.
- Automated Bank Reconciliation: An agent fetches raw bank transactions and heuristically matches them against open invoices, proposing reconciliation pairs to the finance team.
But the gap between "ERP vendors talk about AI" and "your AI agent can actually create an invoice in your customer's NetSuite instance" is enormous. That gap is the integration layer—and it is where most teams burn months of engineering time.
The Nightmare of Building Custom Connectors for ERPs
If you have ever integrated with a well-documented REST API like Stripe and assumed enterprise ERPs would be similar, you are in for a rude awakening. NetSuite and SAP Concur are two of the most integration-hostile platforms in the enterprise stack, each with its own distinct flavor of pain.
If you decide to build your own MCP server from scratch—rather than adopting one of the best MCP servers for Oracle NetSuite—you must first build the underlying API integrations. For these systems, this requires a dedicated engineering team.
The Oracle NetSuite API Reality
NetSuite does not give you a single, clean REST API. Unlike modern systems, NetSuite requires orchestrating across three distinct API surfaces to achieve full coverage: SuiteTalk REST, RESTlet (SuiteScript), and the legacy SOAP API.
And the ground is shifting under existing integrations. Starting in 2026.1, NetSuite will stop releasing new SOAP endpoints. Any new features or capabilities introduced in NetSuite will only be available through REST. The timeline is aggressive: Starting with the 2026.1 NetSuite release, all newly built integrations should use REST web services with OAuth 2.0. With the 2027.1 release, you will not be able to build any new integrations using SOAP web services. And by 2028.2, SOAP web services will be fully removed from NetSuite.
If your integration relies on SOAP for anything, you are on a three-year countdown to rewrite thousands of lines of custom integration code. Moving to the REST API does not solve all problems. A production-grade NetSuite integration requires solving several architectural hurdles:
- Token-Based Authentication (TBA) vs OAuth 2.0: NetSuite traditionally relies on a highly specific TBA scheme that requires generating an HMAC-SHA256 signature for every single HTTP request. This signature includes a timestamp and nonce, meaning even slight clock drift on your servers will result in rejected requests. Although OAuth 2.0 is the preferred authentication method, it only becomes mandatory for new integrations with the 2027.1 NetSuite release, so you may need to support both TBA and OAuth 2.0 simultaneously during the multi-year transition.
- Feature-Adaptive Queries: NetSuite instances vary wildly between customers. A query that works on a standard edition will fail on a OneWorld edition if it does not account for subsidiary routing. Multi-currency versus single-currency changes the schema. Your integration must dynamically detect the customer's edition at connection time and adjust its SQL JOINs accordingly.
- SuiteQL as the Primary Data Layer: Nearly all read operations should use SuiteQL (NetSuite's SQL-like query language) instead of the standard REST record API. SuiteQL enables multi-table JOINs and complex filtering that the REST API simply cannot handle.
- Polymorphic Resource Routing: A single unified "contacts" resource in your application might need to dynamically route to either a `vendor` or `customer` NetSuite record based on the context of the agent's request.
- SuiteScript Deployments: Certain operations, like generating a Purchase Order PDF or extracting dynamic form field metadata (including mandatory flags), are impossible through REST or SuiteQL. These require deploying custom SuiteScript to the customer's instance.
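To make the TBA pain concrete, here is a minimal sketch of the OAuth 1.0a-style request signing that NetSuite's TBA scheme requires, assuming HMAC-SHA256 as the signature method. The function name and parameter layout are illustrative, not part of any NetSuite SDK:

```python
import base64
import hashlib
import hmac
import urllib.parse


def netsuite_tba_signature(method, url, oauth_params, consumer_secret, token_secret):
    """Illustrative sketch of a TBA (OAuth 1.0a-style) request signature.

    oauth_params must already contain oauth_nonce and oauth_timestamp;
    even slight server clock drift invalidates the signature upstream.
    """
    # 1. Percent-encode and sort every parameter into a canonical string.
    param_string = "&".join(
        f"{urllib.parse.quote(str(k), safe='')}={urllib.parse.quote(str(v), safe='')}"
        for k, v in sorted(oauth_params.items())
    )
    # 2. Build the signature base string: METHOD&url&params, each encoded.
    base_string = "&".join([
        method.upper(),
        urllib.parse.quote(url, safe=""),
        urllib.parse.quote(param_string, safe=""),
    ])
    # 3. Sign with consumer secret and token secret, joined by '&'.
    signing_key = f"{consumer_secret}&{token_secret}".encode()
    digest = hmac.new(signing_key, base_string.encode(), hashlib.sha256).digest()
    return base64.b64encode(digest).decode()
```

Every request needs a fresh nonce and timestamp, which is why signing cannot be cached the way a bearer token can.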
The SAP Concur API Reality
SAP Concur presents a different set of challenges. The API surface is cleaner than NetSuite's, but the operational quirks, primarily centered around enterprise security and rate limiting, will catch you off guard.
The biggest gotcha is geolocation-aware routing. Your integration must track the data center geolocation associated with each token. SAP Concur currently operates three data centers, and the API endpoints differ between them, so proper token management is imperative. Every OAuth2 token response includes a geolocation field, and this value must be used as the base URL for all subsequent calls. Hardcoding a regional URL will cause failures for tenants hosted in other regions.
Token lifetimes are aggressively short and unforgiving. An access token is valid for only one hour. Refresh tokens are valid for six months from the day and time issued. If your token refresh logic has a bug, you have a six-month time bomb sitting in production.
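The token lifecycle this implies can be sketched as follows. The class and function names here are hypothetical (this is not Concur's SDK): cache the geolocation from each token response and refresh well before the one-hour expiry.

```python
import time
from dataclasses import dataclass


@dataclass
class ConcurToken:
    access_token: str
    refresh_token: str
    geolocation: str   # datacenter base URL from the token response
    expires_at: float  # epoch seconds


class ConcurTokenManager:
    """Hypothetical sketch: cache a Concur token and refresh it before
    the one-hour expiry, always deriving the API base URL from the token."""

    def __init__(self, refresh_fn, skew=300):
        self._refresh_fn = refresh_fn  # callable returning the raw token dict
        self._skew = skew              # refresh this many seconds early
        self._token = None

    def get(self):
        if self._token is None or time.time() >= self._token.expires_at - self._skew:
            raw = self._refresh_fn()
            self._token = ConcurToken(
                access_token=raw["access_token"],
                refresh_token=raw["refresh_token"],
                geolocation=raw["geolocation"],
                expires_at=time.time() + raw["expires_in"],
            )
        return self._token

    def api_base(self):
        # Never hardcode a region; use the geolocation returned with the token.
        return self.get().geolocation
```

The key design point is that the geolocation travels with the token, so a multi-tenant platform never assumes which data center a given customer lives in.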
Rate limiting is perhaps the most frustrating part. SAP Concur does not publicly document specific numeric rate limits. No explicit requests-per-minute or daily quota figures are published. You are flying blind—you hit a wall, get a 429, and have to guess at the backoff strategy. Furthermore, Concur's API responses often mix legacy XML patterns with modern JSON, requiring extensive normalization logic before the data can be safely passed to an LLM.
And the API versioning story is a mess. SAP Concur has deprecated multiple API versions (v1.0, v3.0, v3.1) over the years, largely because those older versions rely on a legacy authentication method that falls short of the OAuth2 standard.
Building an MCP server that manually handles NetSuite's TBA signatures, SuiteQL query construction, and Concur's token refreshes is a massive distraction from building your actual AI product.
Architecting the MCP Server Integration
The Model Context Protocol gives you a standardized way to expose ERP operations as callable tools for LLMs. Instead of writing point-to-point code, modern engineering teams use a unified API platform to handle the underlying API complexity, while exposing the normalized resources via an automatically generated MCP server.
Instead of every AI agent needing its own NetSuite connector, an MCP server presents a set of tools that any MCP-compatible client (whether using a managed MCP for Claude, ChatGPT, Cursor, or your own custom agent) can discover and invoke.
Dynamic Tool Generation
The key architectural insight for managed MCP server platforms is that tool definitions should not be hand-coded per integration. A well-designed platform derives MCP tools dynamically from integration resource definitions and documentation records.
A tool only appears in the MCP server if it has a corresponding documentation entry. This acts as a strict quality gate, ensuring that only well-documented, tested endpoints are exposed to the LLM.
When the agent requests the available tools via the tools/list protocol method, the server iterates over every resource and method defined for that specific integration. It generates descriptive, snake_case tool names designed for LLM comprehension:
- `list_all_net_suite_purchase_orders`
- `get_single_sap_concur_expense_by_id`
- `create_a_net_suite_journal_entry`
For list operations, pagination fields like limit and next_cursor are automatically injected into the schemas.
The Flat Input Namespace
One of the primary challenges of function calling is that LLMs struggle with deeply nested, complex JSON schemas. They perform best when provided with a flat list of arguments.
When an MCP client calls a tool via tools/call, all arguments arrive as a single flat object. The MCP server is responsible for splitting these arguments into query parameters and body parameters using the schemas' property keys.
For example, if an agent wants to update a specific Concur expense report, it passes the id and the total_amount in a single flat JSON object. The server routes the id to the URL path or query string, and the total_amount into the JSON body payload before executing the proxy API request against Concur.
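A minimal sketch of that routing step might look like this (the function name and the idea of passing schema key sets are illustrative assumptions):

```python
def split_arguments(flat_args, query_keys, body_keys):
    """Route each flat argument to the query string or the JSON body
    based on which schema (a set of property keys) declared it."""
    query = {k: v for k, v in flat_args.items() if k in query_keys}
    body = {k: v for k, v in flat_args.items() if k in body_keys}
    unknown = set(flat_args) - query_keys - body_keys
    if unknown:
        # Reject hallucinated arguments instead of silently dropping them.
        raise ValueError(f"Unknown arguments: {sorted(unknown)}")
    return query, body
```

Rejecting unknown keys matters with LLM callers: an agent that invents an argument gets an immediate, correctable error rather than a request that half-succeeds.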
```mermaid
sequenceDiagram
    participant Agent as AI Agent (Claude/ChatGPT)
    participant MCP as MCP Server Endpoint
    participant Proxy as Proxy API Layer
    participant ERP as NetSuite / Concur
    Agent->>MCP: POST /mcp/:token<br>method: tools/call<br>name: create_a_net_suite_invoice
    Note over Agent,MCP: Passes flat JSON arguments
    MCP->>MCP: Validate token & load context
    MCP->>MCP: Split flat args into Query & Body schemas
    MCP->>Proxy: Execute standardized API request
    Proxy->>ERP: Generate HMAC-SHA256 signature<br>Execute SuiteTalk REST call
    ERP-->>Proxy: 201 Created (Native Response)
    Proxy-->>MCP: Normalized JSON Response
    MCP-->>Agent: JSON-RPC 2.0 Tool Result
```

Scoping Access with Method and Tag Filters
Not every AI agent should have write access to your customer's general ledger. MCP servers should support fine-grained access control:
- Method filters: Restrict a server to `read` operations only (just `get` and `list`), `write` operations only, or specific methods like `create`.
- Tag-based grouping: Group tools by functional area (e.g., `accounting`, `support`, `directory`) and create MCP servers scoped to specific tags.
For example, an expense-parsing agent might get an MCP server filtered to ["read", "create"] methods on ["expenses"] tags—it can read expense categories and create new expense entries, but cannot delete invoices or modify journal entries.
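A sketch of how such filtering might be applied when assembling a server's tool list (the tool dict shape and function name are assumptions for illustration):

```python
def filter_tools(tools, allowed_methods=None, allowed_tags=None):
    """Scope an MCP server's exposed tools by method and tag filters.

    A tool survives only if its method is allowed AND it shares at
    least one tag with the allowed set (when a filter is provided).
    """
    selected = []
    for tool in tools:
        if allowed_methods and tool["method"] not in allowed_methods:
            continue
        if allowed_tags and not set(tool.get("tags", [])) & set(allowed_tags):
            continue
        selected.append(tool)
    return selected
```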
Handling Enterprise API Rate Limits and Pagination
Exposing enterprise APIs to autonomous agents introduces a severe risk of rate limit exhaustion. LLMs can execute loops and parallel function calls at speeds far exceeding human operators. If your agent decides to paginate through 10,000 NetSuite transaction records, it will hit concurrency limits almost immediately.
The Gateway Must Not Absorb 429s
Many integration platforms attempt to be "helpful" by automatically catching HTTP 429 (Too Many Requests) errors, holding the connection open, applying exponential backoff, and retrying the request under the hood.
This is a catastrophic anti-pattern for AI agents.
If a gateway absorbs a 429 and waits 45 seconds to retry, the LLM client's HTTP connection will likely time out. When the timeout occurs, the agent is left in a hallucinated state—it does not know if the purchase order was actually created or if the request was dropped. It might attempt to create the record again, resulting in duplicate financial entries.
A well-architected MCP server platform does NOT retry, throttle, or apply backoff on rate limit errors. When an upstream API like NetSuite or SAP Concur returns a rate-limit error, the platform passes that error directly back to the caller immediately.
Normalized Rate Limit Headers
What the platform should do is normalize the highly fragmented rate limit information from hundreds of upstream APIs into standardized response headers based on the IETF RateLimit draft specification. NetSuite expresses rate limits via concurrency governance. SAP Concur does not publish numeric limits at all. A unified layer translates both into consistent headers:
- `ratelimit-limit`: The maximum number of requests permitted in the current window.
- `ratelimit-remaining`: The number of requests remaining in the current window.
- `ratelimit-reset`: The number of seconds until the rate limit window resets.
Your AI agent or its orchestration framework (like LangGraph or AutoGen) is strictly responsible for reading these standardized headers and implementing its own retry and backoff logic. When the agent receives a 429, it can read the ratelimit-reset header, pause its execution thread safely, and resume exactly when the window clears. This guarantees deterministic execution and prevents duplicate data entry.
```python
import time

def call_mcp_tool(client, tool_name, arguments, max_retries=3):
    for attempt in range(max_retries):
        response = client.call_tool(tool_name, arguments)
        remaining = int(response.headers.get("ratelimit-remaining", 1))
        reset_seconds = int(response.headers.get("ratelimit-reset", 60))
        if response.status == 429:
            # Wait out the window (or a capped exponential fallback), then retry
            wait = min(reset_seconds, 2 ** attempt * 5)
            time.sleep(wait)
            continue
        # Proactively slow down when the remaining quota runs low
        if remaining < 5:
            time.sleep(reset_seconds / max(remaining, 1))
        return response
    raise Exception(f"Rate limited after {max_retries} retries")
```

For more details on this architectural pattern, see our guide on best practices for handling API rate limits.
Enforcing Pagination Discipline
LLMs are notoriously bad at handling pagination state. If an agent asks for a list of contacts and receives 100 records with a complex base64-encoded cursor for the next page, the LLM will often try to "helpfully" modify, decode, or guess the next cursor value in its subsequent function call.
To prevent this, the MCP server automatically injects strict cursor instructions into the query schema for list methods. The schema explicitly instructs the LLM: "Always send back exactly the cursor value you received without decoding, modifying, or parsing it. This can be found in the response of the previous tool invocation."
This forces the agent to treat the cursor as an opaque string, ensuring reliable pagination through massive ERP datasets.
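The injection step itself can be sketched in a few lines; the function name and schema shape here are assumptions, with the instruction text taken from the guidance above:

```python
CURSOR_RULE = (
    "Always send back exactly the cursor value you received without "
    "decoding, modifying, or parsing it. This can be found in the "
    "response of the previous tool invocation."
)


def inject_pagination_fields(query_schema):
    """Add limit and next_cursor fields to a list method's query schema,
    embedding the opaque-cursor instruction for the LLM to read."""
    schema = dict(query_schema)  # copy so the caller's schema is untouched
    schema["limit"] = {
        "type": "integer",
        "description": "Maximum number of records to return per page.",
    }
    schema["next_cursor"] = {"type": "string", "description": CURSOR_RULE}
    return schema
```

Because the instruction lives in the field description, every MCP client that renders tool schemas surfaces it to the model automatically, with no per-agent prompt engineering.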
Securing ERP Data: OAuth Token Management and Expirations
Granting an AI agent access to an enterprise ERP requires extreme security precautions. You cannot simply hardcode NetSuite credentials into an LLM prompt. Enterprise deployments require specific security controls:
The Multi-Tenant Security Model
Each MCP server must be scoped strictly to a single integrated account—one customer's NetSuite instance, or one company's SAP Concur tenant. The server URL contains a cryptographic token that encodes exactly which account to use and what tools are exposed. The raw tokens are never stored in the database; they are hashed using a secure signing key before being written to a distributed key-value store. An AI agent connected to Customer A's MCP server physically cannot access Customer B's data.
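The hashing step can be sketched as follows (the function and parameter names are illustrative):

```python
import hashlib
import hmac


def hash_mcp_token(raw_token, signing_key):
    """Derive the storable digest for an MCP server URL token.

    Only this keyed hash is written to the key-value store; the raw
    token exists solely inside the URL handed to the customer, so a
    database leak alone cannot reconstruct a working MCP endpoint.
    """
    return hmac.new(signing_key, raw_token.encode(), hashlib.sha256).hexdigest()
```

Using an HMAC rather than a plain hash means an attacker needs both the stored digest and the server's signing key to verify guesses offline.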
Automatic Token Refresh
Both NetSuite and SAP Concur use short-lived access tokens. A managed platform handles token refresh transparently—refreshing tokens before they expire so tool calls never fail due to stale credentials. If an integrated account enters a needs_reauth state, tool calls fail with a clear error rather than silently returning empty data.
Time-to-Live (TTL) Expirations
MCP servers can be created with an explicit expiration datetime. This is critical for temporary access scenarios—for example, granting a temporary audit agent access to SAP Concur for exactly 48 hours to complete a compliance review. When the expiration time is reached, scheduled cleanup tasks automatically purge the token. The MCP server URL immediately returns a 401 Unauthorized, ensuring no stale access remains.
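A minimal sketch of the expiry check, with hypothetical names (a real implementation would also purge the hashed token from the key-value store on expiry):

```python
from datetime import datetime, timedelta, timezone


def check_server_access(expires_at):
    """Return an HTTP-style status for an MCP server URL with an optional TTL.

    expires_at is a timezone-aware datetime, or None when no TTL is set.
    """
    if expires_at is not None and datetime.now(timezone.utc) >= expires_at:
        return 401  # Unauthorized: the TTL elapsed and access is revoked
    return 200
```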
Secondary API Token Authentication
By default, the cryptographic MCP URL is the only authentication required, making it trivial to configure clients like Claude Desktop or ChatGPT. For high-security enterprise environments, teams can enable a configuration flag that requires secondary API token authentication.
When active, the MCP client must provide a valid API token via the Authorization header in addition to using the correct URL. This guarantees that even if an MCP server URL is leaked in a log file, the tools cannot be executed unless the caller is actively authenticated within your application's session context.
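The combined check can be sketched like this (function and flag names are illustrative, not a documented API):

```python
import hmac


def authorize_mcp_call(url_token_valid, require_api_token,
                       presented_token, expected_token):
    """Two-layer check: the URL's cryptographic token plus, when the
    configuration flag is enabled, a secondary API token taken from
    the Authorization header."""
    if not url_token_valid:
        return False
    if not require_api_token:
        return True  # the secret URL alone is sufficient by default
    if presented_token is None:
        return False
    # Constant-time comparison avoids timing side channels on the secret.
    return hmac.compare_digest(presented_token, expected_token)
```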
Architecture Decision: Build Custom MCP Servers vs. Use a Managed Platform
You have two paths when weighing the hidden costs of custom MCP servers. Here are the honest trade-offs:
| Factor | Custom-built MCP Server | Managed Platform (e.g., Truto) |
|---|---|---|
| Time to first tool call | 4-8 weeks per ERP | Hours |
| NetSuite SOAP-to-REST migration | Your problem | Handled by the platform |
| OAuth/TBA token management | Build and maintain | Automatic refresh and lifecycle |
| Rate limit normalization | Build per-provider | Standardized IETF headers |
| Schema changes | Monitor and update manually | Platform tracks upstream changes |
| Multi-tenant isolation | Architect from scratch | Built-in per-account scoping |
| Control over tool definitions | Full control | Configurable via tags/method filters |
Building custom is the right call if you have a single ERP to support, deep domain expertise in that ERP's API, and specific tool behaviors that a generic platform cannot provide. But if you need to support NetSuite and Concur and QuickBooks and Xero across hundreds of customer tenants, the math changes fast. For a detailed comparison, see our evaluation of MCP server platforms.
Stop Writing Integration Code for LLMs
The transition to agentic software is happening faster than most engineering teams anticipated. If you are tasking your senior backend engineers with reading Oracle NetSuite SOAP deprecation notices or debugging SAP Concur OAuth refresh failures across three data centers, you are fundamentally misallocating your engineering resources.
Your product's competitive advantage lies in the reasoning capabilities, prompt engineering, and workflow orchestration of your AI agents—not in maintaining point-to-point API connectors.
By leveraging a unified API platform that automatically generates MCP servers, normalizes rate limits into IETF standard headers, and abstracts away legacy authentication schemes, you can give your agents secure, real-time access to enterprise ERPs in minutes instead of months.
Focus on what the agent does with the financial data, and let the infrastructure handle how the data is retrieved.
Frequently Asked Questions
- Is Oracle NetSuite deprecating its SOAP API?
- Yes. Starting with the 2026.1 release, NetSuite will stop releasing new SOAP endpoints. By 2027.1, no new SOAP integrations can be built, and by 2028.2, SOAP web services will be fully removed. All new integrations should use REST with OAuth 2.0.
- How do AI agents handle NetSuite and SAP Concur rate limits?
- Agents must read the standardized rate limit headers (`ratelimit-limit`, `ratelimit-remaining`, `ratelimit-reset`) returned by the MCP server and implement their own pause/resume logic. The gateway should never absorb the 429 error, as this causes LLM timeouts.
- How does SAP Concur handle geolocation routing for API requests?
- SAP Concur has 3 global data centers. Every OAuth2 token response includes a geolocation field specifying the correct datacenter-specific base URL. This value must be used for all subsequent calls, or requests will fail.
- Can I restrict an AI agent to read-only access in an ERP?
- Yes. When generating the MCP server, you can apply method filtering to restrict the exposed tools to read-only operations, preventing the agent from modifying financial records. You can also use tag filters to scope access to specific resource groups like expenses or accounting.