AI Agents Build Infrastructure at Business Speed

With a keen eye for turning vast datasets into compelling visual stories, Chloe Maraina has established herself as a leading voice in business intelligence and data science. Her work focuses on the future of data management and the strategic integration of AI into the very fabric of enterprise infrastructure. In our conversation, Chloe unpacks the seismic shift from rigid, legacy IT systems to a fluid, AI-driven composable model. We explore the business drivers behind this transformation, the practical differences between AI orchestration and simple automation, and the critical need for new governance frameworks. Chloe also provides a candid look at the cultural and organizational hurdles that keep so many AI initiatives in pilot mode and outlines the new skills architects must develop to thrive in a world of human-AI collaboration.

Many enterprise architects find the traditional cycle of provisioning, maintaining, and retiring IT infrastructure is no longer sustainable. What specific pressures, especially from agentic AI, are causing this breakdown, and what are the first practical steps an organization should take to begin this transformation?

That old cycle is something we all know intimately—it’s predictable, comfortable, and now, completely broken. The pressure is immense. We’re seeing business environments driven by what I call agentic experiences; everything is reactive, dynamic, and happening everywhere at once. The core issue is that our legacy infrastructure was never designed for these kinds of demand patterns. It’s static. Tying these new, fluid AI-driven demands to that old, rigid foundation is a recipe for failure. The first practical step isn’t a technical one, but a mental one. You have to stop seeing your infrastructure as a monolithic block. Start by identifying a single, high-impact business function and begin the work of breaking its supporting systems into discrete, modular pieces. It’s about moving beyond infrastructure-as-code, which was just the first inflection point, and truly embracing modularity as the foundation for everything that comes next.
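
To make the idea of "discrete, modular pieces" a little more concrete, here is a minimal sketch in Python. The order-checkout modules, their names, and the composability check are purely hypothetical illustrations of declaring what each piece provides and depends on, not a reference to any specific platform or tool.

```python
# A minimal sketch of decomposing one business function (a hypothetical
# order-checkout flow) into discrete modules. Each module declares what it
# provides and what it depends on, so it can be composed or replaced
# independently of the legacy monolith.

from dataclasses import dataclass, field

@dataclass
class Module:
    name: str
    provides: list[str]
    depends_on: list[str] = field(default_factory=list)

CHECKOUT_MODULES = [
    Module("payment-gateway", provides=["charge", "refund"]),
    Module("inventory-service", provides=["reserve_stock"]),
    Module("checkout-api", provides=["place_order"],
           depends_on=["charge", "reserve_stock"]),
]

def unmet_dependencies(modules: list[Module]) -> list[str]:
    """List dependencies not provided by any module in the composition."""
    provided = {p for m in modules for p in m.provides}
    return [d for m in modules for d in m.depends_on if d not in provided]

print(unmet_dependencies(CHECKOUT_MODULES))  # [] -> this set composes cleanly
```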

The global market for composable infrastructure is projected to grow at nearly 25% annually. For IT leaders trying to build a business case, what key metrics—such as agility, resilience, or technical debt reduction—should they emphasize to justify the significant investment and challenge of this architectural shift?

The numbers themselves tell a powerful story. We’re looking at a market valued at $8.3 billion in 2025 and expected to grow at a staggering 24.9% annually through 2032. But to make that real for the C-suite, you must connect it to business outcomes. I always emphasize three core pillars backed by solid research. First is agility and speed to market. IBM’s research is very clear that companies with modular architectures are fundamentally faster. Second is resilience. In a world of geopolitical disruptions and supply chain chaos, the ability to reconfigure on the fly isn’t a luxury; it’s a survival mechanism. Finally, and this one really resonates with CFOs, is the reduction of technical debt. That legacy baggage slows everyone down and stifles innovation. A composable model isn’t just about building new things faster; it’s about paying down the debt that’s holding your entire organization back.

AI agents are often described as being able to “reason” and orchestrate complex workflows. How does this differ from traditional automation scripts? Can you provide a real-world example of how an agent might dynamically reconfigure infrastructure components in response to an unexpected business event?

This is the most crucial distinction to understand. Traditional automation is like a player piano; it follows a pre-written, rigid script. If anything unexpected happens, the music stops or becomes a jumbled mess. An AI agent, on the other hand, is like a jazz musician. It understands the goal—the key, the tempo, the desired feeling—but it improvises based on what’s happening in the moment. It can reason about what to assemble and when to reconfigure. Imagine a large e-commerce platform during a flash sale that suddenly goes viral. A traditional script might just fall over. An AI agent, however, would see the incoming traffic spike, understand the business intent—maintain sub-100ms latency and 99.99% availability—and start orchestrating. It would dynamically pull compute resources from a private cloud, scale up edge nodes in the specific geographic regions seeing the surge, and re-route data pipelines to prioritize transaction processing, all without human intervention. It’s not just following orders; it’s making intelligent decisions to achieve an outcome.
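
As a rough illustration of that difference, the sketch below shows an intent-driven decision step rather than a fixed script. The telemetry fields and the resource actions (scale_edge_nodes, pull_private_cloud_compute, reroute_pipelines) are hypothetical names for illustration, not a real orchestration API.

```python
# A minimal sketch of an agent-style control step, assuming hypothetical
# resource actions. A traditional script encodes fixed steps; this loop
# compares live telemetry against the business intent and chooses actions.

from dataclasses import dataclass

@dataclass
class Intent:
    max_latency_ms: float = 100.0        # business goal: sub-100 ms latency
    target_availability: float = 0.9999  # business goal: 99.99% availability

def agent_step(intent: Intent, telemetry: dict) -> list[str]:
    """Decide which reconfiguration actions fit the current state."""
    actions = []
    if telemetry["p99_latency_ms"] > intent.max_latency_ms:
        # Surge detected: add capacity close to the affected regions first.
        for region in telemetry["hot_regions"]:
            actions.append(f"scale_edge_nodes(region={region!r}, factor=2)")
        actions.append("pull_private_cloud_compute(pool='burst')")
    if telemetry["availability"] < intent.target_availability:
        actions.append("reroute_pipelines(priority='transactions')")
    return actions

# Example telemetry during a viral flash sale
telemetry = {"p99_latency_ms": 240, "availability": 0.998,
             "hot_regions": ["us-east", "eu-west"]}
print(agent_step(Intent(), telemetry))
```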

With 88% of organizations using AI but most stuck in pilot mode, it’s clear that legacy infrastructure is a major bottleneck. Beyond technical limitations, what are the primary organizational or cultural hurdles that prevent companies from scaling their AI initiatives, and how does composable architecture help overcome them?

It’s a frustrating statistic, isn’t it? 88% are in the game, but very few are actually scoring. While the technical bottleneck of legacy infrastructure is very real, the cultural hurdles are just as significant. The biggest one is a risk-averse mindset tied to traditional change management. We’re used to monolithic, tightly controlled systems where every change requires a mountain of approvals. An agent-driven, composable world is fluid and dynamic, and that terrifies organizations built on rigid processes. There’s also a skills gap; managers are used to directing people, not orchestrating human-AI teams. Composable architecture directly addresses the technical issue by providing the flexible foundation needed to scale. But it also acts as a catalyst for cultural change. When you break systems into modular components, you empower smaller, more autonomous teams to own their services. This fosters a culture of distributed ownership and faster decision-making, which is exactly the environment where AI can thrive beyond the pilot stage.

As AI agents gain more autonomy, new governance models are required. How can leaders establish effective “guardrails” that provide control and ensure compliance without stifling the adaptability that makes this technology so powerful? Please describe what a policy-based composition framework looks like in practice.

This is the high-wire act for leadership. You can’t just let agents run wild, but if you micromanage them, you lose all the benefits of their autonomy. The answer is a shift from approving individual actions to defining the boundaries of acceptable behavior. That’s policy-based composition. In practice, this looks like a “governance hub” or a “model registry.” Instead of a developer requesting a specific server configuration, an architect defines a policy: “For any customer-facing workload, data must remain within this geographic region, security protocols must meet this standard, and the budget cannot exceed X.” The AI agent then has the freedom to compose and re-compose the optimal infrastructure within those guardrails. It’s about setting the rules of the playground, not dictating which swings to use. This way, you ensure compliance and control while still allowing the agent the flexibility to adapt and optimize in real time.
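
Here is a minimal sketch of what such guardrails might look like in code, assuming a hypothetical policy model with region, security-tier, and budget bounds; the agent is free to compose anything that passes validation.

```python
# A minimal sketch of policy-based composition with hypothetical guardrails.
# The policy defines the boundaries; validate() reports any violations in a
# proposed composition, and an empty result means the agent may proceed.

from dataclasses import dataclass

@dataclass
class Policy:
    allowed_regions: set[str]
    min_security_tier: int       # e.g. 1 = baseline, 3 = hardened
    monthly_budget_usd: float

@dataclass
class ProposedComposition:
    region: str
    security_tier: int
    estimated_cost_usd: float

def validate(policy: Policy, proposal: ProposedComposition) -> list[str]:
    """Return the list of guardrail violations (empty means compliant)."""
    violations = []
    if proposal.region not in policy.allowed_regions:
        violations.append(f"region {proposal.region} outside allowed set")
    if proposal.security_tier < policy.min_security_tier:
        violations.append("security tier below required standard")
    if proposal.estimated_cost_usd > policy.monthly_budget_usd:
        violations.append("estimated cost exceeds budget")
    return violations

customer_facing = Policy({"eu-west", "eu-central"}, min_security_tier=2,
                         monthly_budget_usd=50_000)
proposal = ProposedComposition("us-east", security_tier=3,
                               estimated_cost_usd=30_000)
print(validate(customer_facing, proposal))  # ['region us-east outside allowed set']
```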

The convergence of edge and cloud is seen as essential for agentic AI. How does a unified resource pool enable AI agents to optimize workloads more effectively? Could you walk us through a scenario where an agent leverages both edge and cloud resources to deliver a low-latency user experience?

Treating edge and cloud as separate kingdoms requiring manual integration is an outdated model. A unified resource pool is foundational because it gives an AI agent a complete menu of options to choose from. It can then make intelligent trade-offs between latency, cost, and processing power. A perfect scenario is a large-scale, interactive augmented reality application, like a virtual try-on for a retail brand. To deliver a seamless, real-time experience, you need incredibly low latency. An agent would orchestrate this by pushing the core inference models and user-facing rendering to edge compute nodes, located physically close to the user, for instant responsiveness. Simultaneously, it would use the massive processing power of the central cloud to handle the heavy lifting of training the AI models on new product data and aggregating user analytics. The agent isn’t just picking one or the other; it’s composing a hybrid solution, using each resource for what it does best to deliver the optimal outcome.
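
One way to picture placement over a unified pool is the sketch below, with made-up latency and capacity figures: latency-bound work (inference, rendering) lands on nearby edge nodes, while throughput-bound work (training, analytics aggregation) lands in the central cloud.

```python
# A minimal sketch of workload placement over a unified edge/cloud pool.
# Node names, round-trip times, and capacities are hypothetical.

from dataclasses import dataclass

@dataclass
class Node:
    name: str
    kind: str              # "edge" or "cloud"
    rtt_ms: float          # round-trip latency to the user
    capacity_tflops: float

POOL = [
    Node("edge-paris-1", "edge", rtt_ms=8, capacity_tflops=40),
    Node("edge-lyon-1", "edge", rtt_ms=12, capacity_tflops=30),
    Node("cloud-eu-central", "cloud", rtt_ms=65, capacity_tflops=4000),
]

def place(workload: str, max_latency_ms: float | None) -> Node:
    """Pick a node from the unified pool that satisfies the latency bound."""
    candidates = [n for n in POOL
                  if max_latency_ms is None or n.rtt_ms <= max_latency_ms]
    # Among nodes that meet the bound, prefer the one with the most capacity.
    return max(candidates, key=lambda n: n.capacity_tflops)

print(place("ar-inference", max_latency_ms=20).name)      # edge-paris-1
print(place("model-training", max_latency_ms=None).name)  # cloud-eu-central
```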

With managers evolving into orchestrators of human-AI teams, the nature of work itself is changing. What new skills will architects and their teams need to develop to collaborate effectively with AI agents, and how should organizations rethink career paths to support this new operational model?

The role of the architect is becoming less about being the master builder and more about being the master city planner. You’re not laying every brick; you’re designing the systems, the zones, and the rules that allow the city to grow and adapt organically. The key new skill is strategic orchestration—the ability to define business intent and translate it into policies that agents can execute. Communication and collaboration skills become even more critical, but now the conversation includes your AI partners. We need to get good at “prompting” our infrastructure and interpreting its feedback. Organizations need to respond by creating dual career paths. We’ll still need deep, AI-augmented specialists, but we desperately need generalist orchestrators who can manage these hybrid human-AI teams. The old ladder where you become a manager of people is evolving; the new path might lead to becoming the orchestrator of a portfolio of highly effective agents.

What is your forecast for adaptive infrastructure?

My forecast is that within the next five to ten years, the term “adaptive infrastructure” will simply become “infrastructure.” The idea of building a static system that you hope will meet future needs will seem as antiquated as a manually operated telephone switchboard. The companies that win will be those who master composability and treat their entire tech stack as a fluid pool of resources that intelligent agents can orchestrate in real time. The focus will shift entirely from managing servers and networks to managing business outcomes and policies. Infrastructure will finally stop being a constraint and become a true competitive advantage, a living, breathing entity that adapts at the speed of the business itself.
