The Quiet Reinvention of a Communications Giant: From Utility to Agentic AI Infrastructure

technology · business · artificial-intelligence

A Turnaround Built on Discipline and Focus

When a company traditionally known as a communications utility suddenly posts one of the strongest quarters in its history, with revenue growth accelerating to 20% year-over-year from a previous 7%, the temptation is to credit external tailwinds. The reality is more interesting: this kind of acceleration is almost always a self-help story. The recipe is unglamorous but effective — running the business better, exerting tight financial discipline, and channeling investment dollars into a focused innovation pipeline rather than spraying them across every adjacent opportunity.

The voice channel in particular has become a bright spot, accelerating for six consecutive quarters. That is not a one-off; it is the kind of sustained pattern that signals real operational repair underneath the headline numbers. The raised guidance — on revenue, on operating income, on free cash flow — reinforces that this is execution coming through, not a sugar high from a single product cycle.

Why the Data Layer Was Worth Defending

One of the more consequential strategic decisions in recent memory was the refusal to spin out the customer data platform layer (Segment) under heavy activist pressure. At the time, the conviction was simple even if the future shape of AI was not yet fully visible: any communication that lands on the other side of a channel needs deep contextual personalization to actually be useful. A message is just noise without the context layer that tells you who the recipient is, what they care about, and how they prefer to be reached.

What looked like a stubborn refusal to streamline now looks prescient. As voice AI startups have proliferated over the past couple of years, every one of them has discovered the same thing: activating an AI workload requires a context layer. Companies like Sierra AI illustrate the pattern — using voice infrastructure to reach customers, but pairing it with an intelligence layer to make those interactions personal rather than generic. The data asset that was almost divested has become the connective tissue between raw communications plumbing and intelligent, personalized conversation.

From Channels to Conversations

The deeper shift underway is from one-way communications to genuine two-way conversations. Most business communications today are still fundamentally one-way: a notification, an alert, an outbound message. The next generation requires back-and-forth — the actual definition of a conversation — and that requires four ingredients working together: intelligence, memory, orchestration, and agent-building tools.

A newly launched conversation suite bundles these capabilities together. Consider how the PGA reaches its golf pros: it is not enough to send a blast to every pro on every channel. The intelligent version reaches each pro on the channel they prefer, at the time they want to be reached, with the context they care about. The chief technology officer of a sports organization thinking about how to leverage AI is the same persona one finds across every industry now — someone trying to move faster on personalization without rebuilding infrastructure from scratch.

The Agentic AI Opportunity

The term "agentic AI infrastructure" is one of those phrases that risks being dismissed as marketing language, but the underlying claim is substantive. External research suggests there could be well north of a billion AI agents in the wild by 2029, executing human-to-agent, agent-to-human, and increasingly agent-to-agent interactions. Projections that put a single platform at the center of 100 million such agents may actually be understating the opportunity. There are already millions of agents operating today.

Every one of those agents needs to communicate. They need voice, messaging, and channel infrastructure. They need context about who they are talking to. They need memory of past interactions. They need orchestration across channels. The infrastructure layer that serves them is poised to be one of the most consequential pieces of plumbing in the next decade of computing.

The Power of a Usage-Based Model

What makes this strategically defensible is the usage-based revenue model, which stands in sharp contrast to the seat-based pricing of traditional SaaS. A usage-based infrastructure provider only makes money when its customers make money. The interests are tightly aligned: success accrues to both sides simultaneously, and there is no cap imposed by headcount.

This matters enormously for the AI transition. If AI agents proliferate the way the research suggests, every additional agent generates additional usage, which generates additional revenue. There is no need to renegotiate seat licenses or upsell a new tier. The business model scales naturally with the underlying phenomenon. And critically, the AI contribution to revenue is still quite modest today — meaning the tailwind has barely begun to blow.
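The contrast between the two pricing models reduces to simple arithmetic. The numbers below are invented for illustration only; the point is structural, not a claim about actual pricing.

```python
def seat_based_revenue(seats: int, price_per_seat: float) -> float:
    # Revenue is capped by headcount: a 10x jump in activity changes nothing.
    return seats * price_per_seat

def usage_based_revenue(interactions: int, price_per_interaction: float) -> float:
    # Revenue tracks activity directly: every extra agent interaction is billed.
    return interactions * price_per_interaction

# Hypothetical team of 50 seats whose AI agents 10x their interaction volume:
flat = seat_based_revenue(50, 20.0)            # unchanged before and after
before = usage_based_revenue(100_000, 0.01)
after = usage_based_revenue(1_000_000, 0.01)   # scales with the 10x in usage
```

No renegotiation or tier upgrade appears anywhere in the usage-based path: the revenue line moves because the underlying activity moved.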

From Utility to Intelligence Backbone

The classification question is a real one. If a company began life as a communications utility but is increasingly the data and intelligence layer that powers billions of AI-driven conversations, is "utility" still the right label? The honest answer is that "infrastructure company" remains the most accurate frame — but the substance of that infrastructure is changing. Raw communications pipes alone are commodities. Pipes plus context plus orchestration plus agent-building tools are something quite different: the operating substrate of an emerging agentic economy.

What is genuinely striking is the alignment between current execution and long-term positioning. The financial discipline produces the resources to invest. The investments fund the conversation suite and the agentic infrastructure. The infrastructure attracts both AI-native startups and mainstream businesses through a product-led, self-serve channel. The usage-based model ensures that every additional user, every additional agent, every additional conversation feeds back into the engine. Each piece reinforces the others.

The Profound Shift Ahead

The most profound implication is the one most easily overlooked. When communications transition from one-way to two-way at scale, the volume of interactions does not double — it explodes by orders of magnitude. A conversation, by definition, requires multiple exchanges. Multiply that by billions of agents, by every personal and business interaction in our lives that becomes mediated by AI, and the addressable surface becomes vast.
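The "orders of magnitude" claim can be sanity-checked with back-of-envelope arithmetic. Every figure here is an assumption for illustration (the agent count echoes the research cited earlier; the turn count is invented), but the multiplication itself is the argument.

```python
def one_way_volume(recipients: int) -> int:
    # A broadcast is one outbound message per recipient, full stop.
    return recipients

def conversation_volume(recipients: int, turns: int) -> int:
    # A conversation is multiple exchanges; each turn crosses the wire.
    return recipients * turns

AGENTS = 1_000_000_000   # illustrative, per the cited 2029 projection
TURNS = 20               # assumed average turns per conversation

broadcast = one_way_volume(AGENTS)
conversational = conversation_volume(AGENTS, TURNS)
```

Even with a modest assumed turn count, the conversational volume is a fixed multiple of the broadcast volume, and that multiple compounds with every additional agent and every additional interaction that becomes AI-mediated.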

For the infrastructure provider sitting underneath all of that, the value proposition is no longer about cheaper messaging or faster voice routing. It is about being the substrate on which a new mode of computing actually runs. That is a meaningfully different business than a communications utility — and the market appears to be recognizing the difference, with the stock rallying more than 68% over a recent 30-trading-day stretch to reach a four-year high. The reinvention is no longer quiet.