
Enterprise AI agents today face a fundamental timing problem: They cannot easily act on critical business events because they are not always aware of them in real time.
The challenge is infrastructure. Most enterprise data lives in databases fed by extract-transform-load (ETL) jobs that run hourly or daily – ultimately too slow for agents that must respond in real time.
One potential way to tackle this challenge is to have agents interface directly with streaming data systems. Among the main approaches used today is the open source Apache Kafka and Apache Flink technology. There are also several commercial offerings built on these technologies; Confluent, led by the original creators of Kafka, is one of them.
Today, Confluent is introducing a real-time context engine designed to solve this latency problem. The technology is based on Apache Kafka, the distributed event streaming platform that captures data as events occur, and the open source Apache Flink, the stream processing engine that transforms those events in real time.
The company is also releasing an open source framework, Flink Agents, developed in collaboration with Alibaba Cloud, LinkedIn and Ververica. The framework brings event-driven AI agent capabilities directly to Apache Flink, allowing organizations to build agents that monitor data streams and trigger automatically based on conditions, without committing to Confluent's managed platform.
"Today, most enterprise AI systems can't automatically respond to important business events without being asked first," Sean Falconer, Confluent's head of AI, told VentureBeat. "This leads to lost revenue, unhappy customers or increased risk when a payment fails or a network malfunctions."
The significance extends beyond Confluent's specific products. The industry is recognizing that AI agents require different data infrastructure than traditional applications. Agents don't just retrieve information when asked; they need to observe the continuous stream of business events and act automatically when conditions warrant. That requires streaming architecture, not batch pipelines.
Batch versus streaming: Why RAG alone is not enough
To understand the problem, it helps to distinguish between the different approaches to moving data through enterprise systems and how they can be connected to AI agents.
In batch processing, data accumulates in the source system until a scheduled job runs. That job extracts the data, transforms it and loads it into a target database or data warehouse. This can happen hourly, daily or even weekly. The approach works well for analytical workloads, but it creates latency between when something happens in the business and when systems can act on it.
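The batch pattern above can be sketched in a few lines. This is an illustrative stand-in only: the in-memory lists, field names and the `run_etl_batch` function are all hypothetical, standing in for a real source database, warehouse and scheduler.

```python
from datetime import datetime, timezone

# Hypothetical in-memory stand-ins for a source system and a warehouse.
SOURCE_ORDERS = [
    {"id": 1, "amount": "19.99", "status": "paid"},
    {"id": 2, "amount": "5.00", "status": "refunded"},
]
warehouse = []

def run_etl_batch(source, target):
    """Extract rows, transform them, and load them into the target store.

    In a real pipeline this function would be fired by a scheduler
    (cron, Airflow) hourly or daily -- the latency the article describes.
    """
    loaded_at = datetime.now(timezone.utc).isoformat()
    for row in source:                                  # extract
        transformed = {
            "order_id": row["id"],                      # transform: rename fields
            "amount_cents": round(float(row["amount"]) * 100),
            "status": row["status"],
            "loaded_at": loaded_at,                     # batch time, not event time
        }
        target.append(transformed)                      # load
    return len(target)

run_etl_batch(SOURCE_ORDERS, warehouse)
```

Note that every loaded row carries the batch's `loaded_at` timestamp rather than the moment the order actually happened: that gap is exactly the staleness problem described above.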
Streaming data inverts this model. Instead of waiting for scheduled jobs, streaming platforms like Apache Kafka capture events as they happen. Every database update, user action, transaction or sensor reading becomes an event published to a stream. Apache Flink then processes those streams to join, filter and aggregate data in real time. The result is processed data that reflects the current state of the business, updated continuously as new events arrive.
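The streaming model can be contrasted with the batch sketch in plain Python. This is a minimal illustration, not real Kafka or Flink code: an in-memory list stands in for a Kafka topic, and the `process` function plays the role of a Flink operator that filters and aggregates one event at a time.

```python
from collections import defaultdict

# Hypothetical events; in production these would arrive from a Kafka topic.
events = [
    {"type": "page_view", "user": "a"},
    {"type": "purchase", "user": "a", "amount_cents": 1250},
    {"type": "page_view", "user": "b"},
    {"type": "purchase", "user": "b", "amount_cents": 400},
]

state = {"purchases": 0, "revenue_cents": 0, "views_by_user": defaultdict(int)}

def process(event, state):
    """Filter and aggregate a single event, Flink-operator style:
    after every event, state reflects the current business reality."""
    if event["type"] == "purchase":
        state["purchases"] += 1
        state["revenue_cents"] += event["amount_cents"]
    elif event["type"] == "page_view":
        state["views_by_user"][event["user"]] += 1
    return state

for ev in events:   # in a real deployment this loop never terminates
    process(ev, state)
```

The key difference from the batch version: there is no `loaded_at` gap. State is correct after each event, so a downstream consumer (or agent) always sees the present, not yesterday's snapshot.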
This distinction becomes critical when you consider what kind of context AI agents actually need. Much of the current discussion of enterprise AI focuses on retrieval-augmented generation (RAG), which handles semantic search over knowledge bases to find relevant documentation, policies or historical information. RAG works well for questions like "What's our refund policy?" where the answer exists in static documentation.
But many enterprise use cases require what Falconer calls "structural context" — accurate, up-to-date information from multiple operational systems stitched together in real time. Consider a job recommendation agent that needs user profile data from the HR database, browsing behavior from the last hour, search queries from minutes ago and current open positions across multiple systems.
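A sketch of what assembling such structural context might look like. Everything here is hypothetical: the three fetch functions stand in for calls to an HR database, a clickstream store and an applicant-tracking system, and the field names are invented for illustration.

```python
def fetch_profile(user_id):
    """Stand-in for an HR database lookup."""
    return {"user_id": user_id, "title": "Data Engineer", "location": "Berlin"}

def fetch_recent_browsing(user_id):
    """Stand-in for a clickstream query over the last hour."""
    return ["senior-data-engineer", "streaming-platform-lead"]

def fetch_open_positions():
    """Stand-in for an applicant-tracking-system query."""
    return [{"id": "J-17", "title": "Streaming Platform Lead", "open": True},
            {"id": "J-03", "title": "Office Manager", "open": False}]

def build_structural_context(user_id):
    """Stitch several operational systems into one fresh context object,
    instead of handing the agent raw rows from each source."""
    return {
        "profile": fetch_profile(user_id),
        "recent_interest": fetch_recent_browsing(user_id),
        "open_positions": [p for p in fetch_open_positions() if p["open"]],
    }

ctx = build_structural_context("u-42")
```

The point of the pattern is that the stitching and filtering happen before the agent sees anything, so the model receives one coherent object rather than four systems' worth of raw records.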
"The part that we're unlocking for businesses is the ability to essentially serve the structural context that's needed to deliver the freshest version," Falconer said.
MCP connection issues: Stale data and fragmented context
The challenge is not simply connecting AI to corporate data. The Model Context Protocol (MCP), introduced by Anthropic earlier this year, already standardizes how agents access data sources. The problem is what happens after the connection is made.
In most enterprise architectures today, AI agents connect via MCP to data lakes or warehouses fed by batch ETL pipelines. This creates two critical failures: The data is stale, reflecting yesterday's reality rather than current events, and it is fragmented across multiple systems, requiring significant pre-processing before an agent can reason about it effectively.
The alternative – putting the MCP server directly in front of operational databases and APIs – creates different problems. Those endpoints weren't designed for agent consumption, which can lead to high token costs as agents wade through excess raw data, and to multiple inference loops as they try to make sense of unstructured responses.
"Enterprises have the data, but it's often stale, fragmented or locked in formats that AI can't use effectively," Falconer explained. "The real-time context engine solves this by unifying data processing, reprocessing and serving, turning continuous data streams into live context for smarter, faster and more reliable AI decision making."
The technical structure: Three layers for real-time agent context
The Confluent platform consists of three components that can work together or be adopted individually.
Real-time context engine is the managed data infrastructure layer on Confluent Cloud. Connectors pull data into Kafka topics as events occur. Flink jobs process those streams into derived data sets — materialized views combining historical and real-time signals. For customer support, this could merge account history, current session behavior and inventory status into a single unified context object. The engine exposes this through a managed MCP server.
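The customer-support example above can be sketched as a continuously materialized context object. This is an illustration only, assuming simple dicts in place of Flink state and invented stream and field names; the real engine's data model is not public in this article.

```python
# One unified context object, continuously updated from three streams.
context = {"account": {}, "session_events": [], "inventory": {}}

def on_event(event, ctx):
    """Route each incoming event into the unified context object that a
    managed MCP server could then expose to agents on request."""
    if event["stream"] == "accounts":
        ctx["account"] = event["payload"]                  # latest account state
    elif event["stream"] == "sessions":
        ctx["session_events"].append(event["payload"])     # current session trail
    elif event["stream"] == "inventory":
        ctx["inventory"][event["payload"]["sku"]] = event["payload"]["stock"]
    return ctx

incoming = [
    {"stream": "accounts", "payload": {"customer_id": "c-9", "tier": "gold"}},
    {"stream": "sessions", "payload": {"action": "viewed", "sku": "SKU-1"}},
    {"stream": "inventory", "payload": {"sku": "SKU-1", "stock": 3}},
]
for ev in incoming:
    on_event(ev, context)
```

When the agent asks "can this customer's item ship today?", it reads one object that is already current, instead of querying three systems and reconciling the answers itself.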
Streaming Agents is Confluent's proprietary framework for building AI agents that run natively on Flink. These agents monitor data streams and trigger automatically based on conditions – they don't wait for prompts. The framework includes simplified agent definitions, built-in observability and native integration with Anthropic's Claude. It is available in open preview on the Confluent platform.
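The event-driven trigger pattern can be shown in miniature. This is not the Streaming Agents API (which the article does not detail); it is a plain-Python sketch where a callback watches a simulated stream, and the payment-failure scenario and thresholds are invented for illustration.

```python
alerts = []

def payment_agent(event):
    """Fires automatically on matching events -- no prompt required.
    A real agent would invoke an LLM here; we just record an alert."""
    if event["type"] == "payment_failed" and event["amount_cents"] > 10000:
        alerts.append(f"escalate: payment {event['payment_id']} failed")

stream = [
    {"type": "payment_ok",     "payment_id": "p1", "amount_cents": 5000},
    {"type": "payment_failed", "payment_id": "p2", "amount_cents": 25000},
    {"type": "payment_failed", "payment_id": "p3", "amount_cents": 900},
]
for event in stream:   # the agent observes the stream continuously
    payment_agent(event)
```

Only the large failed payment triggers an escalation; the agent acted because a condition was met in the stream, not because a user asked a question.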
Flink Agents is the open source framework developed with Alibaba Cloud, LinkedIn and Ververica. It brings event-driven agent capabilities directly to Apache Flink, allowing organizations to build streaming agents without committing to Confluent's managed platform. They handle the operational complexity themselves but avoid vendor lock-in.
Competition heats up for agent-ready data infrastructure
Confluent is not alone in recognizing that AI agents need different data infrastructure.
The day before Confluent's announcement, rival Redpanda launched its own Agentic Data Plane – combining streaming, SQL and governance specifically for AI agents. Redpanda acquired Oxla's distributed SQL engine to offer agents standard SQL endpoints for querying data in motion or at rest. The platform emphasizes MCP-aware connectivity, full observability of agent interactions and what it calls "agent access control" with fine-grained, short-lived tokens.
The architectural approaches differ. Confluent emphasizes stream processing with Flink to create optimized data sets for agents. Redpanda highlights federated SQL queries across disparate sources. Both recognize that agents need real-time context, governance and observability.
Beyond the direct streaming competitors, Databricks and Snowflake are fundamentally analytical platforms that are adding streaming capabilities. Their strength is complex queries over large data sets, with streaming as an enhancement. Confluent and Redpanda invert this: Streaming is the foundation, with analytical and AI workloads built on top of data in motion.
How context streaming works in practice
Among the early users of the Confluent system is transportation software vendor Busie. The company is building a modern operating system for charter bus companies that helps them manage quotes, trips, payments and drivers in real time.
"Data streaming is what makes this possible," Louis Bookoff, Busie co-founder and CEO, told VentureBeat. "Using Confluent, we move data instantly between different parts of our system instead of waiting for overnight updates or batch reports. That keeps everything in sync and helps us ship new features faster."
Bookoff noted that the same foundation is what will make AI valuable to its customers.
"In our case, every action such as sending a quote or assigning a driver becomes an event that flows through the system immediately," Bookoff said. "That live information is what will let our AI tools respond in real time with low latency rather than just summarizing what has already happened."
The challenge, however, is understanding context. When thousands of events flow through the system every minute, AI models need relevant, accurate data without being overwhelmed.
"If the data isn't grounded in what is happening in the real world, AI can easily make wrong assumptions and, in turn, take wrong actions," Bookoff said. "Stream processing solves that by continuously validating and reconciling live data with activity in Busie."
What this means for enterprise AI strategy
Context streaming signals a fundamental shift in how AI agents consume enterprise data.
AI agents require continuous context that blends historical understanding with real-time awareness – they need to know what has happened, what is happening and what might happen next, all at once.
For enterprises evaluating this approach, start by identifying the use cases where stale data breaks the outcome. Fraud detection, anomaly investigation and real-time customer intervention all fail with hourly or daily batch refreshes. If your agents need to act on events within seconds or minutes of when they occur, context streaming becomes necessary rather than optional.
"When building applications on top of foundation models, because they're probabilistic in nature, you use data and context to steer the model in a direction where you want to get some kind of outcome," Falconer said. "The better you can do that, the more reliable and the better the result."

