Confluent unveils Confluent Intelligence: a unified suite for building AI agents on real-time data
Confluent just rolled out Confluent Intelligence - a managed suite for teams building and operating AI agents on streaming data. The launch also includes Confluent Private Cloud for on-prem deployments and new Tableflow integrations that simplify landing streaming data in open table formats and data catalogs.
The announcement landed at Current, Confluent's user conference in New Orleans. It's a clear move to meet demand from teams shifting from simple chatbots to autonomous, context-aware agents.
Why this matters
Agents need fresh, relevant context to act well. That's Confluent's home field. As one analyst put it, "Every software vendor must have an agentic AI story or risk being left behind … real-time data can provide important context to agents, many of which operate on a real-time basis."
The market is converging on unified agent stacks. Databricks, Snowflake, Oracle, Teradata, and MongoDB have all moved here. Confluent's angle: streaming-first, Kafka-native, and focused on event data.
What Confluent introduced
- Confluent Intelligence (managed on Confluent Cloud):
  - Real-Time Context Engine (early access): streams structured data into agents and apps. Includes a Model Context Protocol (MCP) integration to wire tools and data into agent runtimes (see the sketch after this list).
  - Streaming Agents (open preview): a development framework built on Apache Flink (FLIP-531), with Agent Definition for creating agents in a few lines, plus built-in observability and debugging.
  - LLM integration: Anthropic's Claude is the default, but you can bring any model.
  - Built-in ML in Flink SQL: anomaly detection, forecasting, and model inference on streams.
- Confluent Private Cloud: for teams that need on-prem or isolated environments (financial services, healthcare, sensitive workloads).
- Tableflow integrations: automatic conversion of streams into open table formats (Delta Lake, Apache Iceberg) with native hooks into Databricks Unity Catalog. AWS is supported today; Azure integration is in early access.
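To make the MCP point concrete, here is a minimal sketch of the general pattern - exposing stream-derived state to an agent as an MCP tool - not Confluent's actual interface. It assumes the open-source `mcp` Python SDK; the server name, the `get_customer_context` tool, and the in-memory `latest_events` cache are hypothetical, and in practice that cache would be materialized from a Kafka topic or the Real-Time Context Engine rather than hard-coded.

```python
# Minimal sketch: expose fresh, stream-derived state to an agent via MCP.
# Assumes the open-source `mcp` Python SDK (pip install mcp).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("real-time-context")  # hypothetical server name

# Hypothetical in-memory view of the freshest event per customer; a real
# deployment would keep this updated from a Kafka topic.
latest_events: dict[str, dict] = {
    "cust-123": {"last_action": "card_declined", "ts": "2025-10-29T14:03:00Z"},
}

@mcp.tool()
def get_customer_context(customer_id: str) -> dict:
    """Return the most recent event for a customer so the agent reasons on fresh state."""
    return latest_events.get(customer_id, {"last_action": None})

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio for an MCP-capable agent runtime
```

An MCP-capable runtime can then call the tool mid-conversation instead of relying on a stale, batch-loaded profile.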
How it helps engineering, data, and product teams
- Faster agent delivery: predefined agent scaffolding, fewer custom pipelines, simpler debugging.
- Real-time context: push current events into agent reasoning - customer actions, transactions, IoT telemetry, fraud signals (see the sketch after this list).
- Less glue code: Tableflow + catalogs reduce ETL and manual integrations.
- Governance-ready: Private Cloud gives regulated teams a path to ship with their security model intact.
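On the real-time context point, here is a minimal sketch of one way to turn live events into prompt context. It assumes the confluent-kafka Python client and a local broker; the `fraud-signals` topic, the field names, and the five-event window are illustrative, not part of Confluent Intelligence.

```python
# Minimal sketch: gather the freshest events from a stream and shape them into
# agent context. Assumes the confluent-kafka client (pip install confluent-kafka).
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # placeholder broker address
    "group.id": "agent-context-builder",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["fraud-signals"])        # hypothetical topic

recent = []
while len(recent) < 5:                       # collect a small window of fresh signals
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    recent.append(json.loads(msg.value()))
consumer.close()

# Shape the events into something the agent can reason over.
context = "\n".join(f"- {e.get('type')}: {e.get('detail')}" for e in recent)
prompt = f"Recent signals for this account:\n{context}\n\nShould we hold the transaction?"
print(prompt)
```

The same shape works for customer actions or IoT telemetry; the point is that the context window is built from events seconds old, not last night's batch.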
What's missing (and what's next)
Analysts called out two gaps. First, fragmentation: there isn't a single vendor that solves agent orchestration end to end. Confluent's MCP support helps connect agents to data sources, but coordination across agents remains a broader problem.
Second, scope: Confluent Intelligence is focused on structured streaming data. As one analyst noted, a lot of business context lives in unstructured data like text and images. That layer is not addressed here - yet.
There's also a suggestion for the roadmap: add Agent2Agent Protocol (A2A) alongside Kafka to enable standardized, point-to-point agent communications. MCP wires tools and data; A2A could wire agents to each other.
Context from Confluent
Confluent says customer demand drove features like real-time anomaly detection. The bigger trend is an architectural shift: from batch analytics and chatbots to continuous, event-driven agents. Expect "context" to be the watchword - how teams select, shape, and stream the right signals into models.
Where this fits in your stack
- Event backbone: Kafka remains the substrate for real-time data movement. If Kafka is already core to your architecture, this reduces friction for agent projects.
- LLM flexibility: default Claude support helps teams standardize, but you can swap models to fit use cases and cost profiles (see the sketch after this list).
- Lakehouse alignment: Tableflow to Delta Lake and Iceberg means your real-time tables land where BI, batch ML, and governance already live.
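On model flexibility, a thin seam in application code is enough to keep the model swappable by configuration. A minimal sketch, assuming the `anthropic` Python SDK and an `ANTHROPIC_API_KEY` in the environment; the `AGENT_MODEL` variable, the model id, and the `ask_agent` helper are illustrative and not Confluent APIs.

```python
# Minimal sketch: keep the model choice behind a small wrapper so it can be
# swapped via configuration. Assumes the anthropic SDK (pip install anthropic).
import os
import anthropic

MODEL = os.getenv("AGENT_MODEL", "claude-sonnet-4-20250514")  # illustrative model id

def ask_agent(prompt: str) -> str:
    """Send the prompt to whichever model this deployment is configured for."""
    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    resp = client.messages.create(
        model=MODEL,
        max_tokens=512,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.content[0].text

if __name__ == "__main__":
    print(ask_agent("Summarize the last five fraud signals and recommend an action."))
```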
Practical next steps
- Validate fit: List your top agent use cases. Do they rely on structured, real-time signals? If yes, pilot Streaming Agents + Real-Time Context Engine.
- Close the gaps: Plan for unstructured data retrieval (search, vector DBs) if your agents need documents, images, or transcripts.
- Orchestration plan: Define how agents coordinate tasks, escalate, and recover. Watch for A2A patterns while MCP covers tools and data.
- Deployment model: If you're constrained by compliance or data residency, evaluate Confluent Private Cloud early.
- Data contracts: Standardize schemas, SLAs, and lineage for the streams feeding agents. Broken contracts break reasoning (see the sketch after this list).
- Cost controls: Track stream volume, inference frequency, and fan-out. Add guardrails before scale.
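On data contracts, the simplest enforceable version is a schema check at the edge of the agent. A minimal sketch, assuming pydantic v2; the `FraudSignal` fields are hypothetical, and a real deployment would register the contract in a schema registry (Avro, JSON Schema, or Protobuf) rather than in application code.

```python
# Minimal sketch: validate events against a contract before they reach an agent.
# Assumes pydantic v2 (pip install pydantic).
from datetime import datetime
from pydantic import BaseModel, ValidationError

class FraudSignal(BaseModel):
    """Hypothetical contract for events on a fraud-signals topic."""
    account_id: str
    type: str
    score: float
    ts: datetime

def validate_event(payload: dict) -> FraudSignal | None:
    """Drop malformed events instead of letting them pollute the agent's context."""
    try:
        return FraudSignal.model_validate(payload)
    except ValidationError as err:
        print(f"Contract violation, event dropped: {err}")
        return None

# A well-formed event passes; a truncated one is rejected loudly rather than
# silently degrading the agent's reasoning.
validate_event({"account_id": "a-1", "type": "velocity", "score": 0.91, "ts": "2025-10-29T14:03:00Z"})
validate_event({"account_id": "a-2"})
```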
Competitive view
Confluent squares up against AWS, Google Cloud, Microsoft, and IBM on streaming, plus Aiven and Redpanda. On the AI side, it joins Databricks, Snowflake, Oracle, Teradata, and MongoDB in offering unified agent tooling - with a distinct focus on event-driven context at the core.
Bottom line
If your agents need live signals and you're already Kafka-first, Confluent Intelligence is worth a look. The suite shortens the path from stream to action, while Private Cloud and Tableflow handle real-world constraints around governance and integration. Just plan for unstructured data and multi-agent orchestration - those pieces still require additional tooling.
Related resources
Apache Kafka overview for teams standardizing on event streaming.
If your team is upskilling on agentic AI and Claude, see this focused path: AI Certification for Claude.