The real reason AI isn’t a bubble, according to a data center CEO

Feb 6, 2026 | AI

Quick summary:

  • The data center industry's biggest problem isn't power or chips — it's PR, and communities are mobilizing against data centers faster than companies can respond
  • AI demand is real: every GPU deployed ships with five CPUs behind it, a ratio expected to reach eight to one within two years
  • Finance and healthcare led enterprise AI adoption because they already had decades of data to train on

    Two years ago, Ryan Mallory's team was planning for 50-kilowatt server racks. Today, they're deploying 400-kilowatt racks. The eightfold jump captures how quickly AI infrastructure has outpaced every prediction.

    Mallory recently became CEO of Flexential, which operates 42 data centers across 18 markets. He joined The Disruption Is Now with host Greg Matusky fresh from Davos, where AI dominated every conversation at the World Economic Forum.

    The discussion reveals what's actually happening inside the facilities that power ChatGPT, enterprise AI, and the applications most people use daily without realizing they require massive compute infrastructure.

    Watch now:

    Key takeaways:

    AI growth is accelerating demand for traditional computing infrastructure

    While everyone talks about GPUs for AI, those are just the tip of the infrastructure iceberg.

    "Those GPU chips are made specifically for high-speed computation processing," Mallory explains. "All that processing that happens, that data output, isn't stored at the GPU. It's going back into traditional storage and CPU infrastructure."

    Flexential currently ships five CPUs for every GPU deployed. Within two years, that ratio will jump to 8:1 as the volume of AI-generated data compounds.

    This split between GPU computation and conventional processing means AI growth actually accelerates demand for traditional computing infrastructure: CPUs, storage arrays, and networking equipment.
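    The arithmetic behind that claim is worth making explicit. At the stated ratios (five CPUs per GPU today, eight within two years), CPU demand compounds on top of GPU growth. A minimal sketch; the unit counts below are hypothetical illustrations, not Flexential shipment data:

    ```python
    def cpu_shipments(gpus: int, cpus_per_gpu: int) -> int:
        """CPUs needed to back a given number of GPUs at a fixed ratio."""
        return gpus * cpus_per_gpu

    # Hypothetical scenario: GPU deployments double while the ratio
    # moves from 5:1 to 8:1, so CPU demand grows 3.2x, not 2x.
    today = cpu_shipments(10_000, 5)         # 50,000 CPUs
    in_two_years = cpu_shipments(20_000, 8)  # 160,000 CPUs
    print(in_two_years / today)              # 3.2
    ```

    Even flat GPU shipments would raise CPU demand 60% under the ratio shift alone.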

    Power has replaced land as the limiting factor

    AI is changing the traditional playbook for constructing data centers. "It used to be about the land and how close you could get to the population center," Mallory says. "Now it's about, can you get power?"

    Getting power means more than cutting a check to the utility. Data centers must fund load studies, pay for substation construction, cover pro-rated shares of transmission line upgrades, and sometimes finance new generation capacity.

    Flexential's sweet spot is what Mallory calls "edge inference" — 54 megawatts of utility power supporting 36 megawatts of IT load. That's easier to source than the gigawatt campuses going into Louisiana, Mississippi, and the Dakotas for pure AI training.
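    Those two figures imply a standard facility-efficiency metric: power usage effectiveness (PUE), the ratio of total utility power to IT load. A quick check (the 1.5 figure is derived from the numbers above, not stated in the interview):

    ```python
    def pue(total_facility_mw: float, it_load_mw: float) -> float:
        """Power usage effectiveness: total facility power over IT load."""
        return total_facility_mw / it_load_mw

    # 54 MW of utility power supporting 36 MW of IT load
    print(pue(54, 36))  # 1.5
    ```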

    The tiered architecture matters. Flexential's model handles everything from 3-kilowatt cabinets to 400-kilowatt cabinets, which matches how enterprises actually deploy AI: variable, hybrid, and connected to existing systems.

    The data center industry's biggest problem is PR

    NIMBYs have mobilized faster than anyone expected. Mallory watched the same community embrace a data center project one year and fight it the next as narratives shifted.

    "The biggest challenge the data center industry has right now outside of power is PR," he admits. "We're not doing a good job helping to make sure communities and consumers understand that data centers, we're not the big bad wolf."

    When electricity rates increase, Mallory argues, the blame belongs with the utility, not the data center, which pays its own infrastructure costs. Data centers even give power back to the grid through demand response programs when the utility calls.

    Fears of water shortages are also unfounded, he says: Flexential sites operate at a water usage effectiveness (WUE) of zero, meaning they don't draw on the municipal supply for cooling.

    The benefits communities miss include sales tax on equipment that funds schools, fire departments, and roads. Even with property tax breaks, the economic contribution is substantial.

    Finance and healthcare adopted AI first because they had the data

    When Mallory looks at early AI adopters, two verticals stand out.

    Financial services had massive compute requirements from trading analytics and customer modeling. Healthcare needed it for virology research and cancer treatments. Both had decades of data and existing infrastructure to build on.

    "They really embraced it because they had those large compute requirements," Mallory explains.

    Now the Fortune 1000 is catching up, applying AI to customer service, order quality, and response times. Even logistics companies use it to optimize truck routes for fuel economy and scheduling.

    Key moments

    • How rack power densities jumped from 50kW to 400kW in two years (2:07)
    • Why five CPUs ship for every GPU and that ratio is growing (4:35)
    • The power access problem replacing land as the key constraint (6:48)
    • Finance and healthcare as AI's first enterprise adopters (9:12)
    • Why data centers have a PR problem with communities (11:56)
    • The water-shortage myth and zero water usage effectiveness (13:00)
    • Tax benefits communities miss when fighting data centers (14:00)
    • A Pennsylvania community that flipped from supporting to opposing a data center (15:18)
    • Why AI is about efficiency gains, not job replacement (18:41)
    • What surprised Mallory most about how fast the industry pivoted (23:33)

    Q&A with Ryan Mallory, CEO of Flexential

    Q: Is AI infrastructure demand a bubble?

    A: I don't think it's a bubble at all because we're living it right now. You can't have a bubble if you have bifurcation of the data components to be able to use by the common person. This isn't just for Google, Microsoft, Amazon, and Oracle. I use it every day. My kids use it every day. It's required in school.

    Q: What do communities misunderstand about data centers?

    A: Rate increases, blame your power company, not us. We're paying to help augment a grid infrastructure that hasn't been touched in several decades. I pay for my own substations. If a transmission high tension line needs upgraded, I pay my pro-rated share of that.

    Q: Will AI eliminate jobs?

    A: It's not a job loss, it's about making your time more efficient. At least Flexential doesn't ever look at it as, hey, we can use AI to reduce head count. We can use AI to make those people more efficient so we can increase productivity.

    Q: What has surprised you most about this shift?

    A: Just how quick the industry pivoted, from a data center perspective, and just how much of the resources it's consuming — chipsets, concrete, steel. It required us to be super disciplined in supply chain management.

