Quality Over Quantity: Preparing BI for Agentic Decision-Making with PepsiCo & Georgia-Pacific
Enterprise data and analytics leaders offered insights on ending BI Sprawl at Gartner Data & Analytics Summit 2026.
When it comes to BI and AI, what got us here won’t get us there.
Over the last decade, business intelligence tools like Power BI, Tableau, and Qlik gave people the ability to aggregate data, analyze it, build reports, and make informed decisions. BI became a cornerstone of data and analytics because of an emphasis on quantity. The recipe for enterprise success was to make more data available to more business users, enabling them to create more reports and dashboards.
Today, AI is poised to put report creation and decision-making power in the hands of autonomous agents capable of far more. But getting there will require an emphasis on quality, not quantity.
Do you know what’s in your environment? Is your data clean? Are datasets up-to-date and certified as a source of truth? Do you have governance to prevent costs and risks from ballooning out of control?
In the AI era, answering all of these questions is the first step in transforming BI and analytics into true decision intelligence.
That was one of the top takeaways from a session at the Gartner Data & Analytics Summit featuring data and analytics leaders from PepsiCo, Georgia-Pacific, and Datalogz. Panelists for the discussion, titled “From BI Sprawl to AI-Ready Decision Intelligence,” included:
- Matt Robuck, Vice President of Data and Analytics, Georgia-Pacific
- Richard Martin, Vice President for Strategy and Transformation, PepsiCo
- Logan Havern, Founder & CEO, Datalogz
Let’s take a look at highlights from the session:
The Runaway Growth of BI
The emphasis on quantity worked. BI and analytics are growing at every enterprise.
This continued uptake is driven by a pair of converging factors. Business users get their hands on tools like Power BI and Tableau, and they see the power of analytics. They want more insights, so they create reports and dashboards. Add AI tools that reduce the time to create those reports and dashboards from months to minutes, and you’ve got a recipe for an explosion in data-driven operations across the enterprise.
“We didn't have another way to make decisions except by building reports and dashboards,” said Matt Robuck of Georgia-Pacific. “In the last two years, that's changed dramatically for us. So, we have a vision of a future state where agents and generative BI experiences power the insights that our users and employees use every day to make decisions for Georgia-Pacific.”
But there’s a twist to this story of growth.
“Even though there are more analytics, insights, and decisions happening, it doesn't mean data is actually better, and decisions are better,” said Logan Havern, CEO of Datalogz.
Across nearly 1 million BI assets managed in the Datalogz customer base, the company found:
- 77% average year-over-year increase in reports and dashboards from 2025 to 2026
- 35% of assets are still built on flat files, Excel, and CSVs
- Only 14% of datasets are powering more than one report, dashboard, or data product
- 80% of BI environments are unused
When Richard Martin dug into the environment at PepsiCo, he found the same patterns. The point is not that PepsiCo is unique, but that everyone is in the same boat: BI sprawl is common across most enterprises.
Rooting Out Duplicate Reports
You might expect that no two BI reports are the same. But the reality is that too many are too similar. In most BI environments, you’ll find duplicate reports that calculate the same metric in different ways. Often they were sourced from the same table but accumulated different joins and transformations over time, muddying the source of truth.
Knowing the duplicates are there is one thing. Finding them is another altogether.
“If you're one business unit and one platform like Tableau, maybe it's possible to do this manually. But if you're like Georgia-Pacific, with 15 business units plus capabilities plus seven platforms, it is literally impossible to find duplicates across all of those systems. So that's where you've got to think about, how do I go and get that information?” Robuck said.
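At scale, this kind of duplicate hunting has to be automated. As a rough illustration only (not Datalogz's actual method, and using hypothetical report names), one approach is to normalize each report's underlying metric definition and fingerprint it, then group reports from different platforms that share a fingerprint:

```python
import hashlib
import re
from collections import defaultdict

def fingerprint(query: str) -> str:
    """Normalize a metric's SQL definition (case, whitespace) and hash it,
    so cosmetic differences don't hide duplicates."""
    normalized = re.sub(r"\s+", " ", query.strip().lower())
    return hashlib.sha256(normalized.encode()).hexdigest()

def find_duplicates(reports: dict[str, str]) -> list[list[str]]:
    """Group report names whose metric definitions share a fingerprint."""
    groups = defaultdict(list)
    for name, query in reports.items():
        groups[fingerprint(query)].append(name)
    return [names for names in groups.values() if len(names) > 1]

# Hypothetical inventory spanning two BI platforms:
reports = {
    "tableau/weekly_sales": "SELECT region, SUM(net_sales) FROM sales GROUP BY region",
    "powerbi/sales_by_region": "select region,  sum(net_sales)\nfrom sales group by region",
    "tableau/inventory": "SELECT sku, COUNT(*) FROM stock GROUP BY sku",
}
print(find_duplicates(reports))
# → [['tableau/weekly_sales', 'powerbi/sales_by_region']]
```

Real environments need fuzzier matching than exact fingerprints (reports rarely differ only in whitespace), but the principle is the same: a cross-platform inventory plus a normalized representation of what each report actually computes.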
If environments are left uncontrolled, AI is poised to compound BI sprawl.
But in a moment of change, there is opportunity. Enterprises can optimize today for the future that is already here.
“We're going to an AI-powered future, and that train has already left the station,” Martin said. For BI, Martin said, that means “slimming down to golden reports with golden KPIs on golden datasets and enabling those assets with AI to tell the right personas the right insights at the right time, instantly at their fingertips.”
The first step is taking inventory of what you have. Robuck worked with Datalogz to understand the existing BI environment at Georgia-Pacific, from the teams and tools to the existing reports across every business unit.
“You can't really go to a future state without understanding where you're coming from,” Robuck said. “So, as we work over the next 12 to 24 months to really transform the experience, and we're still going to have some reports and dashboards, we're just going to have far fewer as we move to this future state. But you have to start understanding your employees, your users, and the type of decisions they're trying to make, or else you won't know where to take them in the future state.”
The Cost and Risk of a Decision
As the work of building reports shifts from humans to agents, the focus moves to the decisions themselves. In the BI era, humans produced reports and made decisions. Increasingly, agents are capable of doing both on their own.
This could result in more decisions being made with data. But cost and risk could rise along with it.
“Every time a decision is made with data, there's a cost to it as well as a risk to it,” Havern said.
The cost is measured in the compute required to produce insights. The risk is measured in the accuracy of the reports and in who has access to the data.
With AI, the risk and cost calculus changes.
Human judgment can spot conflicting data. When agents make decisions autonomously, guardrails are needed to prevent duplication. Agents move quickly, so being wrong scales: bad data in, bad decisions out.
What’s needed to prevent this chaos? A source of truth, quality data, and governance around the analytics environment, panelists said.
Introducing AI isn’t only a matter of deploying technology. It also requires people who are bought in and trained, and processes that are optimized. This transformation will demand change management and reoriented priorities, and new norms will emerge as a result. Yet across industries and enterprises, it’s already clear that a well-governed data and analytics environment is foundational to success.
“The investment for us has really changed in the last 12 months. We are laser-focused on data quality, centralizing our governance processes and balancing the needs of business partners and IT,” said Robuck. “We don't want to be an impediment to decisions; we want to accelerate them.”