Posted Apr 17, 2026

How the SaaSpocalypse Clarified What Makes an AI Partner Worth Keeping

The February 2026 SaaSpocalypse wiped approximately $285 billion from SaaS company valuations in 48 hours. Most market watchers saw AI-driven volatility. But for GPs, it raised a harder question: are the firms running our fund operations built to last in this environment, or are they among the companies the market is writing off?

Our CEO, Alex Robinson, has been anticipating this question for two years. In a recent episode of The Distribution by Juniper Square, he offered a framework for thinking it through.

His conclusion is counterintuitive: most GPs should not be thinking about AI at all. At least not in the way the conversation usually goes.

Why more AI means more demand, not less

The most common GP anxiety about AI comes down to one question: what happens to the people doing this work? Robinson addressed it directly, citing Jevons Paradox, a 19th-century economic principle holding that when a resource becomes more efficient to use, total consumption of it typically increases.
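The mechanics behind Jevons Paradox can be sketched with a toy constant-elasticity demand model. The numbers, the elasticity value, and the 5x cost reduction below are illustrative assumptions, not figures from the talk: they simply show that when demand is elastic (elasticity above 1), usage grows faster than the price falls, so total consumption rises.

```python
# Toy illustration of Jevons Paradox with constant-elasticity demand.
# Assumption (not from the article): units consumed follow
#   quantity = base * price ** (-elasticity)

def total_consumption(price, elasticity, base=100.0):
    """Units consumed at a given unit price under constant elasticity."""
    return base * price ** (-elasticity)

old_price, new_price = 1.0, 0.2   # hypothetical: AI makes the task 5x cheaper
elasticity = 1.5                  # elastic demand: usage responds strongly to price

old_quantity = total_consumption(old_price, elasticity)
new_quantity = total_consumption(new_price, elasticity)

old_spend = old_price * old_quantity
new_spend = new_price * new_quantity

# With elasticity > 1, consumption grows faster than the price falls,
# so both usage and total spend on the resource increase.
print(new_quantity > old_quantity)  # usage rises
print(new_spend > old_spend)        # total spend rises too
```

Under these assumptions a 5x price drop multiplies usage by more than 11x, so total spend more than doubles; with inelastic demand (elasticity below 1) the same price drop would shrink total spend, which is why the paradox depends on how responsive demand is.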

The evidence from software engineering makes the case. Even as AI came to write the majority of code at many firms (Robinson noted that most engineers now oversee agents producing 80 to 90% of the code they previously wrote themselves), software engineering job postings climbed to multi-year highs. Humans moved higher up the abstraction stack, orchestrating agents and setting priorities rather than executing the underlying work.

"Hold on tightly to your job," Robinson advised knowledge workers, "and loosely to its definition." For GPs, the same reframing applies to their own IR teams, fund accounting staff, and portfolio operations professionals. The role may change; the need for it does not.

What GPs actually said

When Robinson asked GPs directly at an event how they were thinking about building AI capabilities internally, the response was nearly unanimous.

"I don't want to deal with this," he recounted them saying. "I don't want to be thinking about deploying models and orchestrating different models for different scenarios...I'm an investor. That's how I create value in the world."

More than 40% of PE GPs now have an AI strategy for their own businesses. But the execution question—who builds it, maintains it, and ensures it stays compliant—is one most GPs are not equipped to answer alone, nor should they be. For most GPs, it’s a partner selection question.

The five-ingredient test

Robinson laid out a specific framework for evaluating any partner claiming to bring AI capabilities to a GP's operations. 

Technical depth. Foundational AI models are advancing at a pace most organizations find difficult to absorb. A partner that cannot integrate new model capabilities and deploy them reliably is not a durable choice.

Private markets domain knowledge. Technical competence without private markets context is, as Robinson put it, "functionally useless." Agents need to understand what a capital call is, how LP onboarding works, what a waterfall looks like—not in the abstract, but in the operational detail of how GPs actually run their funds.

Connected agents, tools, and data. AI agents produce meaningful work only when they have access to the right data, tools, and context. An agent disconnected from a GP's investor records, fund documents, and operational systems cannot do the work that matters. Building and maintaining those connections is not a trivial infrastructure problem.

Compliance and security. The private markets are a regulated industry, and GPs have made commitments to their investors and regulators about how data is handled. A partner deploying AI in their fund operations must meet those commitments without asking the GP to trade off compliance for capability.

Services delivery. Most GPs outsource fund administration because they do not want to operate it themselves. That preference does not change because AI is available. What changes is the expectation that outsourced providers are actively applying the technology. The right partner wraps AI capabilities inside a service model.

What survives the SaaSpocalypse

The technology companies under the most pressure share a common profile: per-seat pricing, use cases well-suited to AI commoditization, and no deep system-of-record relationship with customers. The companies best positioned share a different profile: brand equity built over years, deep data relationships, distribution at scale, and regulatory infrastructure that takes institutional trust—and time—to develop.

For GPs evaluating their operations partners, the same rubric applies. The question is not whether a firm claims to offer AI capabilities—most do now. The question is whether the underlying business has the technical depth, domain expertise, and customer relationships to sustain those capabilities over time, and whether the pricing model will hold up as AI makes routine tasks cheaper to perform.

In closing

Robinson's three-year horizon is specific: there will be AI agents covering every facet of knowledge work inside a GP—investor relations, fundraising support, portfolio operations, treasury, fraud monitoring, and data insights. Not as a replacement for the people doing that work today, but as infrastructure that frees them for the judgment-intensive, relationship-driven work that still requires a human.