Informatica tightens bond with AWS's AI development tools
Informatica is deepening its AWS integration with a slate of features aimed at teams building and running agentic AI. Following its acquisition by Salesforce, the company rolled out updates during AWS re:Invent that tie governed data to foundation models and simplify agent development.
Key releases include dedicated Model Context Protocol (MCP) servers and an Enterprise Agent Blueprint for Amazon Bedrock AgentCore (both in preview), a new Cloud Data Integration connector for Amazon SageMaker (GA), and an update that runs Informatica's Claire AI Engine using Anthropic's Claude models through Amazon Bedrock.
Why this matters for IT and engineering leaders
Data prep isn't the finish line anymore; it's part of the runtime. As Donald Farmer put it, "Rather than data management as a preparatory practice - something you do before the real AI work begins - this positions the platform as operational infrastructure for autonomous agents."
This shift targets a real failure mode in AI projects: shaky data governance at inference time. Agents don't just consume batch data. They query, interpret, and act on live inputs - and they break fast without guardrails.
What's new
- MCP servers for Amazon Bedrock AgentCore (preview): Prebuilt servers that connect Informatica's Intelligent Data Management Cloud (IDMC) to foundation models via MCP, linking governed data to agents built on Bedrock.
- Enterprise Agent Blueprint for Bedrock AgentCore (preview): A full framework - MCP servers, prebuilt connectors, and an API layer - to accelerate building and deploying agents on AWS.
- Cloud Data Integration connector for Amazon SageMaker (GA): Data scientists can pull prepared, governed datasets from Informatica into SageMaker for ML, genAI, and analytics pipelines.
- Claire AI Engine with Claude through Amazon Bedrock: Informatica's own Claire Agents now use Claude models for tasks like schema grounding, SQL optimization, and semantic query generation.
MCP + Bedrock: speed without losing governance
Building agents usually means repeating the same plumbing: locate data, clean it, connect it to the right model, feed applications, and watch the pipeline under load. MCP standardizes that integration so models can safely call external tools and data - with less custom code and fewer security gaps.
Informatica's MCP servers link IDMC to Bedrock AgentCore, so teams can pair governed data with the right models and move faster. For background on the protocol, see the Model Context Protocol documentation; for AWS's agent capabilities, see the Agents for Amazon Bedrock documentation.
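To make the "standardized plumbing" idea concrete, here is a minimal sketch of the MCP pattern: a server registers named tools, and an agent invokes them by name instead of through bespoke integration code. This is illustrative only; the class, the `lookup_customer` tool, and its return values are invented for the example, and real deployments would use an MCP SDK and Informatica's hosted servers rather than a hand-rolled registry.

```python
from typing import Any, Callable


class ToolServer:
    """Minimal stand-in for an MCP server exposing governed data tools."""

    def __init__(self) -> None:
        self._tools: dict[str, Callable[..., Any]] = {}

    def tool(self, name: str):
        # Decorator that registers a function as a named tool.
        def register(fn):
            self._tools[name] = fn
            return fn
        return register

    def call(self, name: str, **kwargs: Any) -> Any:
        # Agents invoke tools by name; unknown tools are rejected,
        # which is one place a governance layer can hook in.
        if name not in self._tools:
            raise KeyError(f"unknown tool: {name}")
        return self._tools[name](**kwargs)


server = ToolServer()


@server.tool("lookup_customer")
def lookup_customer(customer_id: str) -> dict:
    # A real implementation would query governed data in IDMC,
    # with lineage and policy checks applied before returning.
    return {"id": customer_id, "tier": "gold"}


print(server.call("lookup_customer", customer_id="C-42"))
```

The point of the pattern is that the agent only ever sees named, registered tools, so swapping data sources or adding policy checks happens behind the registry rather than in every agent's code.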
Who benefits
Many enterprises blend AWS services with third-party data platforms to avoid lock-in. That audience is squarely in scope here. As Rik Tamm-Daniels noted, the focus comes from customers and partners pushing into agentic AI: "Our goal is to give enterprises a clear and flexible path to build intelligent, compliant agents that can access and act on high-quality data in real time."
Kevin Petrie adds context: AWS-heavy organizations invest aggressively in AI, measure success by customer outcomes, and see data quality as the top obstacle. Informatica's role is straightforward - make hybrid data usable and governed for production-grade agents and ML.
SageMaker connector: less friction to production
The new CDI connector for SageMaker is available now. It lets data teams pipe curated datasets from Informatica into SageMaker without manual wrangling. That shortens the path from discovery and prep to training, tuning, and deployment.
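One practical check worth automating in any such pipeline is data freshness: gate training or tuning runs on when the curated dataset was last refreshed. The sketch below is a generic illustration of that gate, not the connector's API (which is configured inside Informatica's Cloud Data Integration, not called from Python like this); the 24-hour SLA is an assumed value.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness SLA: curated data older than 24 hours
# should not feed a training or tuning run.
FRESHNESS_SLA = timedelta(hours=24)


def is_fresh(last_updated: datetime, now: datetime) -> bool:
    """Gate a dataset on its last-refresh timestamp before training starts."""
    return now - last_updated <= FRESHNESS_SLA


now = datetime(2025, 12, 4, 12, 0, tzinfo=timezone.utc)
recent = datetime(2025, 12, 4, 2, 0, tzinfo=timezone.utc)  # 10 hours old
stale = datetime(2025, 12, 1, 12, 0, tzinfo=timezone.utc)  # 3 days old

print(is_fresh(recent, now))  # True
print(is_fresh(stale, now))   # False
```

A check like this belongs at the start of the pipeline so a stale extract fails fast instead of silently training a model on old data.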
Runtime governance is the real unlock
Farmer's warning is blunt: "Without proper governance woven into that runtime layer, agents will hallucinate, make decisions on outdated information, or violate compliance requirements in ways that only surface after the damage is done." The MCP servers and the Agent Blueprint are built to keep data quality and policy in the loop while agents act.
Market signal: MCP adoption is moving fast
Petrie points to strong demand: about one-third of AWS users already run an MCP server in production, and over half of the rest are evaluating. If you're standardizing on Bedrock, this is worth piloting.
Practical steps for your roadmap
- Inventory agent use cases that need governed, real-time data access. Prioritize those with measurable business impact and clear policy requirements.
- Stand up a scoped pilot on Bedrock AgentCore using Informatica's MCP servers. Measure time-to-first-agent, data latency, and error rates.
- Wire governance into runtime: lineage, policy enforcement, prompt/response logging, and rollback paths for agent actions.
- Use the SageMaker connector to feed curated datasets into training and RAG evaluation loops; track drift and data freshness SLAs.
- Define cost checkpoints: model selection, context window strategy, retrieval design, and tool-call limits.
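The "wire governance into runtime" step above can be sketched as a thin wrapper around agent actions: every action passes a policy check, is written to an audit log, and registers a compensating action for rollback. Everything here is invented for illustration - the class, the allow-list, and the toy "update_tier" action - and a production system would enforce policy from a governance platform rather than an in-process set.

```python
from dataclasses import dataclass, field
from typing import Any, Callable


@dataclass
class GovernedRuntime:
    """Wraps agent actions with policy checks, audit logging, and rollback."""
    allowed: set
    audit_log: list = field(default_factory=list)
    _undo_stack: list = field(default_factory=list)

    def execute(self, action: str, do: Callable[[], Any],
                undo: Callable[[], None]) -> Any:
        # Policy enforcement: only allow-listed actions run at all.
        if action not in self.allowed:
            self.audit_log.append(("denied", action))
            raise PermissionError(f"policy blocks action: {action}")
        result = do()
        # Lineage/audit trail plus a compensating action for rollback.
        self.audit_log.append(("executed", action))
        self._undo_stack.append((action, undo))
        return result

    def rollback(self) -> None:
        # Undo actions in reverse order, logging each reversal.
        while self._undo_stack:
            action, undo = self._undo_stack.pop()
            undo()
            self.audit_log.append(("rolled_back", action))


# Usage: a toy "update record" action with a compensating undo.
state = {"tier": "silver"}
rt = GovernedRuntime(allowed={"update_tier"})
rt.execute("update_tier",
           do=lambda: state.update(tier="gold"),
           undo=lambda: state.update(tier="silver"))
rt.rollback()  # state returns to its original value
```

The design choice worth copying is that the audit log and the undo path are populated in the same call that performs the action, so nothing an agent does can bypass them.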
What's next
Short term, Informatica's update to Claire with Claude boosts its own agents. Medium term, there's a clear opening: instrument what agents do with data - not just how data reaches them. Farmer believes Informatica has the metadata and customer footprint to lead here if it moves quickly.
Cross-cloud data sharing just got easier via recent AWS and Google Cloud updates. That creates a path for Informatica to offer a consistent layer across both, especially for organizations standardizing on agentic architectures.
Want to skill up your team?
If you're building with Bedrock, Claude, and MCP, structured training can help speed adoption. Explore courses from leading AI platforms.