Agents of Chaos: Context, Governance, and Auditing at the Consumption Layer

Data leaders from Indeed, Pimco, definity, and Datalogz gathered at Future Frontiers to discuss how agentic AI is transforming analytics.

Agentic AI is transforming how we work with data. The guardrails need to keep up.

Agents are making it possible to complete entire workflows with the push of a button, hold a conversation with data, and produce analysis no one had previously considered.

But for every incredible use case that captures the imagination, there is also risk. Inaccurate information, exposure of sensitive data, and wasted resources not only undermine decision-making but unleash chaos that scales quickly.

Data leaders explored this rapidly evolving complexity and offered actionable tips for professionals to navigate it at the latest edition of Future Frontiers, a quarterly event series organized by Datalogz. Held at the Perkins Coie law firm in New York City, the panel discussion welcomed CDOs, CIOs, CISOs, and seasoned analytics professionals for a thought-provoking evening of discussion and networking. 

A panel discussion held during the event featured the following speakers:

- Roy Daniel, Co-Founder and CEO, definity
- Sunny Zhu, Head of Data & Analytics, Strategy & Operation, Indeed
- Joay K. Singhal, former CDO, Pimco
- Logan Havern, Founder and CEO, Datalogz

Here is a look at key takeaways from the discussion:

Context is Key for Accuracy

For years, we’ve all heard about the democratization of data. In the AI era, it has finally arrived, and it’s on steroids, said Roy Daniel, Co-Founder and CEO of definity, an agentic data engineering company.

It’s one thing to have people accessing data. Now, we have AI models, agents, and more tools than ever working with data, producing insights, and completing projects end-to-end.

And instead of building dashboards, people are simply asking questions, and expecting results. Fast.

On the one hand, this accelerates the old problems. Every enterprise has duplicate reports, dashboards, and metrics. Humans stood a chance of catching errors. Machines aren’t so discerning.

When we ask a question, is that answer correct? People have experience and judgment that help them get to the right answer. Machines need to be grounded in the same.

“It comes back to the question of context and what context do we feed those models,” said Daniel.

The solution is already sitting inside organizations. BI assets that have already been produced can provide that missing context. Reports and dashboards contain the insights and analysis that power so many answers. Used to ground LLMs, they can arm those models with the institutional knowledge that makes the difference between a wise decision and a guess.
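As a rough sketch of what that grounding could look like in practice (all asset names, fields, and the selection heuristic here are invented for illustration, not any panelist's actual implementation), existing BI metadata can be folded into the prompt before a model answers:

```python
# Hypothetical sketch: grounding a model's answer in existing BI assets.
# Asset names, fields, and the keyword-overlap heuristic are assumptions.
from dataclasses import dataclass


@dataclass
class BIAsset:
    name: str
    description: str
    owner: str


def build_context(assets: list[BIAsset], question: str) -> str:
    """Select assets whose descriptions share terms with the question,
    then fold them into a prompt so the model answers from institutional
    knowledge rather than guessing."""
    terms = {w.lower().strip("?.,") for w in question.split()}
    relevant = [
        a for a in assets
        if terms & set(a.description.lower().split())
    ]
    context = "\n".join(
        f"- {a.name} (owner: {a.owner}): {a.description}" for a in relevant
    )
    return f"Use only these approved assets:\n{context}\n\nQuestion: {question}"


assets = [
    BIAsset("Q3 Revenue Dashboard", "quarterly revenue by region", "finance"),
    BIAsset("Churn Report", "monthly customer churn analysis", "growth"),
]
prompt = build_context(assets, "What drove revenue last quarter?")
```

A production version would use semantic retrieval rather than keyword overlap, but the shape is the same: curated, owned assets become the context the model is allowed to reason from.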

AI is Transforming Roles, Not Replacing Jobs

In profound ways, AI is demonstrating the capacity to take on more and more of the knowledge work that helps our economy run. So it’s natural to consider, will it replace skills and even jobs? 

It will be more of an evolution, said Sunny Zhu, Head of Data & Analytics, Strategy & Operation at Indeed.

“In the next three to five years, it will definitely be a hybrid mode,” Zhu said. After conducting an analysis of every job and skill at her company, Zhu is convinced the key workforce challenge will be transforming roles to work effectively with AI.

Inside the company, agentic workflows are already having a massive impact on insights automation. Quarterly business reporting data that took scores of analysts many hours to gather across more than a dozen business units can now be compiled in minutes.

AI is transforming data and analytics every day. There are countless stories like this that make it feel like we’ve entered a new paradigm. 

But what hasn’t changed is the need for trust. Data still has to be correct. Bad data will point decision-makers in the wrong direction. Hallucinated results mean powerful AI tools won’t get used.

To make sure the underlying data behind the information presented to the board is right, Indeed built a semantic layer.

“The agent has a lot of freedom to explore, but whatever they do, they need to have a handshake with the source of truth,” Zhu said. “And the human in the loop is super important, so no matter what, we still have analysts auditing everything.”

That semantic layer is a living thing that is always changing based on new information. It’s pulling together data from hundreds of systems. As new data comes in, metrics change. Drift is inevitable. The semantic layer needs sound governance to avoid redundancy and ensure consistency. Otherwise, a source of truth with the wrong data could quickly cause chaos, at scale.
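A minimal sketch of that kind of governance check (the structure and names here are assumptions, not Indeed's actual semantic layer): register every metric definition in one place, and flag any metric that has accumulated more than one competing definition before the drift reaches a dashboard.

```python
# Hypothetical sketch of a semantic-layer governance check.
# Flags metrics registered more than once with differing definitions.
from collections import defaultdict


def find_conflicts(registrations: list[tuple[str, str]]) -> dict[str, set[str]]:
    """Map metric name -> set of definitions; a conflict is any name
    with more than one distinct definition."""
    defs: dict[str, set[str]] = defaultdict(set)
    for name, definition in registrations:
        defs[name].add(definition)
    return {name: d for name, d in defs.items() if len(d) > 1}


registrations = [
    ("revenue", "SUM(amount) FROM orders"),
    ("revenue", "SUM(amount) FROM orders WHERE status = 'closed'"),
    ("active_users", "COUNT(DISTINCT user_id) FROM events"),
]
conflicts = find_conflicts(registrations)
# "revenue" now carries two competing definitions and needs reconciliation
```

Running a check like this on every change to the layer is one way to keep the "handshake with the source of truth" honest as new data flows in.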

Observability is No Longer Optional

If the massive data infrastructure buildouts happening today are any clue, agentic AI growth is only going to continue tomorrow.

“From a cost standpoint and a governance standpoint, observability becomes a key criterion. It’s no longer optional,” said Joay K. Singhal, former CDO at Pimco and AI transformation leader. “You have to have a good overview and control of your entire organization, whether it's data pipelines, whether it's agent pipelines, to be able to measure what's going on and assess security threats.”

With more systems accessing data, there is a greater chance that sensitive data could be exposed. 

If a company is about to undertake a layoff, the data and analytics team conducts the analysis to determine the impact. That data had to be protected when humans were accessing it. When agents are doing the work without judgment or discernment, the need for guardrails is even greater.

“Whenever we build an agentic flow, we need to make sure the governance is there,” Zhu said.

That governance can also make the business case for AI. With such a substantial investment, data and AI spend is now a C-level conversation. Everyone is racing ahead because they don’t want to be left behind. But when the dust settles, the compute capacity and tokens add up. Unused resources are especially draining on a budget, and executives want proof that the outlay is producing returns.

“Tying every pipeline to a business use case and to an ROI is essential,” Singhal said. “At some point, executives are going to ask, what are we getting out of all this spend today?”

Audit at the Point Where the Decisions are Made

A semantic layer is an elegant idea. In theory, it will provide a business with a single source of truth that sets the standard for metrics, definitions, and processes across every human and agent, across the entire business.

But inside most enterprises, the reality is complexity and chaos. On average, some 400 applications access data across departments.

“You have your Oracle NetSuites with one area of revenue. You have a copy of that in Snowflake, going to a Power BI dashboard. You have a hundred people who have screenshotted revenue into PowerPoints and put them on Teams. So to define a semantic layer and enforce it across an organization sounds like an impossible task,” said Logan Havern, Founder and CEO of Datalogz. And once you define it, how long until new information throws it off course?

Here’s a more realistic step that enterprises can take: Audit the points where the data is accessed. Those tools listed above are where decisions are made, and they can become a source of confusion at the highest levels.

“In the case where you have an SVP viewing one number and they think it's the same number as the VP on their team and those numbers are misaligned, then that is a problem,” Havern said.

Auditing provides visibility. Then, you can identify conflicts, and quickly take action to work out solutions. Apply automation, and the work becomes faster.
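In the same spirit, here is a toy sketch of a consumption-layer audit (the tool names, readings, and tolerance are all assumptions for illustration): compare the value each BI surface reports for the same metric and flag the outliers, which is the SVP-versus-VP mismatch above made mechanical.

```python
# Hypothetical sketch: audit one metric across consumption points.
# Tool names, values, and the 1% tolerance are illustrative assumptions.
def audit_metric(readings: dict[str, float], tolerance: float = 0.01) -> list[str]:
    """Given the value each tool reports for the same metric, return the
    tools whose value deviates from the most common reading by more than
    `tolerance` (relative)."""
    values = list(readings.values())
    baseline = max(set(values), key=values.count)  # modal value as reference
    return [
        tool for tool, value in readings.items()
        if abs(value - baseline) / abs(baseline) > tolerance
    ]


readings = {
    "Power BI": 4_200_000.0,
    "Snowflake dashboard": 4_200_000.0,
    "PowerPoint screenshot": 3_900_000.0,  # stale copy of the metric
}
flagged = audit_metric(readings)  # -> ["PowerPoint screenshot"]
```

The point is not the arithmetic but the vantage point: auditing at the places where numbers are consumed catches conflicts that a centrally defined semantic layer alone cannot see.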

That’s how you take control at the consumption layer.

