
Rebuilding Data Stacks to Power AI

Summary

– Organizations face a problem with thousands of proliferating dashboards and reports that require customization, causing delays for users trying to access data.
– Databricks’ Genie allows users to ask questions in natural language and receive answers with root cause analysis based on the company’s own enterprise data.
– Databricks has introduced Lakebase, an OLTP database that separates compute and storage, designed to provide a real-time database for AI agents managing automated business processes.
– Lakebase enables agents to quickly start, run, copy, and shut down databases, offering a cost-effective and fast solution for agent orchestration and context tracking.
– For successful AI implementations, organizations must establish metrics and measurement systems, such as tracking spend optimization, to translate AI and business telemetry into measurable outcomes.

Most companies today are drowning in dashboards and reports: thousands of them proliferating across departments, each needing constant customization. The result? Business users wait far too long to actually access the data they need. AI is now changing that dynamic by making analytics far more accessible. It's finally delivering on what data teams have long considered the holy grail: democratizing data access so the right information reaches the right people, with the right permissions, without friction.

Take Databricks’ Genie product as an example. Users can ask questions of their data in plain English (or any language) and receive contextual answers. Unlike ChatGPT, which pulls general information from the internet, Genie can tell you exactly why your April sales numbers fell short of expectations. It performs root cause analysis using your own enterprise data. This is a major step toward truly democratizing data within an organization, and it sits firmly in the OLAP world, which the Lakehouse architecture supports.

More recently, Databricks introduced the Lakebase, targeting the OLTP side of the equation. As organizations deploy AI agents to automate workflows, those agents need a place to store orchestration data and workflow context. On one side, users ask questions. On the other, the next frontier is automating entire business processes, like generating a marketing campaign, which typically involves multiple tools and steps. An agent can streamline that process, but behind the scenes, it requires a real-time database to track everything the agent does. That’s exactly what the Lakebase delivers.

The innovation here is a modern Postgres database with separated compute and storage, similar to what Databricks did with the data Lakehouse and data warehouse. On the Lakebase, data lives in a single copy within your cloud storage, while compute runs serverless and independently. Features like branching let you spin up an OLTP database quickly. Agents can start a Lakebase, keep it running, shut it down when needed, and make copies on the fly. They need speed and cost-efficiency, and this architecture delivers both.
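Because Lakebase is Postgres-compatible, an agent's orchestration layer can persist workflow context through ordinary SQL. The sketch below is illustrative only: it uses Python's built-in `sqlite3` as a stand-in for a real Postgres driver pointed at a Lakebase instance, and the `agent_steps` schema and all names in it are invented for the example.

```python
import sqlite3  # stand-in for a Postgres driver (e.g. psycopg) connected to Lakebase

# In-memory database standing in for a freshly branched OLTP instance.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE agent_steps (      -- hypothetical schema for agent context tracking
        run_id  TEXT NOT NULL,
        step_no INTEGER NOT NULL,
        tool    TEXT NOT NULL,
        status  TEXT NOT NULL,
        detail  TEXT
    )
""")

def record_step(run_id, step_no, tool, status, detail=None):
    """Persist one step of an agent's workflow so it can be audited or resumed."""
    conn.execute(
        "INSERT INTO agent_steps VALUES (?, ?, ?, ?, ?)",
        (run_id, step_no, tool, status, detail),
    )
    conn.commit()

# An agent generating a marketing campaign records each tool call as it goes.
record_step("run-42", 1, "audience_segmenter", "ok", "segmented 3 cohorts")
record_step("run-42", 2, "copy_generator", "ok", "drafted email copy")
record_step("run-42", 3, "scheduler", "pending")

# On restart, the agent reloads its context to find the last completed step.
last_done = conn.execute(
    "SELECT MAX(step_no) FROM agent_steps WHERE run_id = ? AND status = 'ok'",
    ("run-42",),
).fetchone()[0]
print(last_done)  # -> 2
```

The point of the sketch is the access pattern, not the driver: every tool call becomes a cheap transactional write, and resuming a run is a single indexed read, which is why agents benefit from an OLTP store they can spin up and discard quickly.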

The real beauty is that combining OLTP (Lakebase, real-time) and OLAP gives you one unified system for all your data. No more copying data around, no managing multiple permission sets, and no context switching. These AI-powered applications represent the future of business operations. They eliminate bottlenecks by automating repetitive human tasks using LLMs and other new technologies. Databricks aims to be the default platform for powering this future, believing the Lakebase will be faster, cheaper, and more secure as an AI database.

Megan: That sounds like a real game changer. We’ve touched on this a few times already: this idea of value. Senior leaders are prioritizing the commercial value of AI investments. How important is this value measurement when building AI-ready data systems, Rajan? And how can organizations track what’s delivering results versus what isn’t?

Rajan: This is of paramount importance. Most successful AI implementations, especially agentic AI, require rigorous value measurement. Let me extend the client example I mentioned earlier: a large global food products company. Think of it this way. When the digital era first arrived, analytics focused on performance management KPIs and fact-based decision-making. Those metrics evolved over time and became critical for measuring how functions and businesses performed. The same logic applies to AI value measurement.

Using the same client example, the key is to map your desired outcomes. In this case, the goal was optimizing spend on direct and indirect purchases. By applying AI, the company identified areas where spending could be reduced. One critical measure became indirect expense classification: how much spend was classified, and how much could be reduced as a result. Establishing these base metrics and measurements is absolutely critical.
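As a hedged illustration of the kind of base metric described here, the snippet below computes classification coverage over a toy indirect-spend ledger. Every vendor name, field, and figure is invented for the example; nothing comes from the client engagement discussed above.

```python
# Toy indirect-spend ledger; a category of None means the line is not yet classified.
# All names and amounts are illustrative only.
ledger = [
    {"vendor": "A", "amount": 120_000, "category": "IT services"},
    {"vendor": "B", "amount": 80_000,  "category": None},
    {"vendor": "C", "amount": 50_000,  "category": "facilities"},
    {"vendor": "D", "amount": 150_000, "category": "logistics"},
]

total = sum(row["amount"] for row in ledger)
classified = sum(row["amount"] for row in ledger if row["category"] is not None)

# Base metric: share of indirect spend that has been classified.
coverage = classified / total
print(f"classified {coverage:.0%} of {total:,} in spend")  # -> classified 80% of 400,000 in spend
```

Once a metric like this is defined, tracking it over time is what turns an AI rollout into a measurable outcome rather than an anecdote.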

The beauty is that capabilities like Databricks’ metrics view and feature tools, as Bavesh mentioned, help translate AI telemetry and business telemetry from applications into measurable outcome metrics. You can then track those metrics using Genie rooms designed specifically for value management measurement.

(Source: MIT Technology Review)
