The definitive first-principles analysis of how LLM inference is absorbing the entire software industry, and what it means for every company that builds, buys, or sells software.
This guide is written by Yuma Heymans, founder of o-mega.ai. As the solo founder of an AI agent platform, Yuma has experienced the consolidation firsthand: building an entire CRM, admin panel, content pipeline, and multi-agent orchestration system without a team, without buying SaaS tools, and without raising venture capital. This article is the result of watching, from the inside, as the economics of software undergo the most fundamental restructuring since the invention of the web browser.
The global SaaS market was valued at $358 billion in 2025. Within eighteen months, the conversation shifted from "which SaaS tool should we buy" to "should we just build it ourselves." That shift is not a trend. It is a structural phase transition in how software gets created, distributed, and valued. The underlying cause is not AI hype. It is the emergence of a single, general-purpose computational substrate, LLM inference, that is absorbing the functions of tens of thousands of specialized software products.
I call that substrate the Big Pipe.
Software has always been plumbing. Data flows through it. Customer records, transactions, messages, documents, analytics events. All of it is data flowing through pipes. For two decades, that data flowed through thousands of smaller, specialized pipes: a CRM pipe, a marketing automation pipe, an analytics pipe, a project management pipe, an invoicing pipe. Each pipe had its own shape, its own connectors, its own data format. And each pipe existed not because anyone wanted 30,000 separate pipes, but because there was no general-purpose pipe available. The cost of building a custom pipeline exceeded the cost of conforming to someone else's pre-built one. That equation has now inverted.
LLM inference is the general-purpose pipe. All the data that used to require separate, specialized channels can now flow through a single conduit. And the critical difference is this: inside the Big Pipe, the data does not need to be reformatted, re-qualified, re-enriched, or reorganized before it can be processed. The intelligence is in the pipe itself. The inference layer handles the transformation natively. That is what makes this consolidation so fast and so total.
What This Guide Covers
This guide breaks down the consolidation of software into LLM inference from first principles. It is not a prediction piece. It is not an AI hype article. It is a structural analysis of what is happening, why it is happening at the level of fundamental economics and technology, and what the second and third-order consequences are for every participant in the software economy.
Contents
- The Historical Function of Software Pipes
- Why Thousands of Pipes Existed
- What Changed: The Economics of the Build-Buy Equation
- The Big Pipe: LLM Inference as Universal Solvent
- The Conformity Tax: The Hidden Cost Nobody Talked About
- The Market Evidence: SaaS vs AI in Hard Numbers
- The SaaSpocalypse: Company-by-Company Breakdown
- The 100x Factor: Why the Shift is Larger Than People Think
- What the Big Pipe Cannot Replace
- The New Scarcity: Where Value Concentrates Now
- Second-Order Effects: What Happens After the Pipes Collapse
- The Infrastructure Layer: Who Owns the Big Pipe
- Implications for Founders, Investors, and Operators
- The Philosophical Shift: From Products to Capabilities
- Conclusion: The Shape of What Comes Next
1. The Historical Function of Software Pipes
To understand what is collapsing, you first have to understand what was built and why.
Every piece of commercial software that has ever been sold performs the same fundamental function: it takes a human intention ("I want to track my customers," "I want to send emails at scale," "I want to schedule meetings") and converts it into a structured computational operation. The software serves as a translation layer between what a human wants and what a computer can do. That translation layer is what we call a pipe.
The first generation of pipes was simple. Spreadsheets translated the intention "I want to organize data" into rows and columns. Word processors translated "I want to write a document" into formatted text files. Databases translated "I want to store and retrieve information reliably" into structured query operations. These were horizontal pipes, general enough to serve many intentions, but requiring significant human effort to configure them for any specific one.
The second generation was the rise of vertical pipes. Salesforce did not sell you a database. It sold you a pre-configured translation layer specifically for the intention "I want to manage customer relationships." HubSpot sold you one for "I want to run inbound marketing." Workday sold you one for "I want to manage human resources." Each of these products took a general-purpose capability (data storage, email sending, form processing) and wrapped it in a domain-specific abstraction that eliminated the configuration work. The product was the abstraction itself.
By 2024, the software industry had produced an estimated 30,000+ SaaS products - Statista. Each one was a pipe. Each one existed because the cost of building that specific translation layer from scratch was higher than the cost of buying someone else's pre-built version. This was rational. Building a CRM from scratch in 2015 would have taken a team of engineers months and cost hundreds of thousands of dollars. Buying a Salesforce license cost $25 per user per month. The math was obvious.
But nobody actually wanted 30,000 pipes. No company ever said "I wish I had more software subscriptions." No operations manager dreamed of managing 130 different tools across departments, each with its own login, its own data model, its own pricing tier, its own integration requirements. The proliferation of SaaS was a supply-side phenomenon, not a demand-side one. Companies tolerated the complexity because they had no alternative. The market was waiting for a general-purpose pipe. It just did not exist yet.
Now it does. And the math is no longer obvious.
2. Why Thousands of Pipes Existed
The proliferation of SaaS products was not random. It was driven by three structural forces that are worth understanding individually, because each one is now being undermined by the same cause.
Force 1: Implementation complexity. Building software required specialized knowledge. You needed to understand programming languages, database design, API architectures, deployment infrastructure, security practices, and dozens of other technical domains. This knowledge was scarce and expensive. A company that wanted a custom CRM needed to either hire engineers (expensive, slow) or buy one off the shelf (cheap, fast). The gap between the cost of hiring and the cost of subscribing created the entire SaaS business model. As long as implementation was hard, buying was better than building.
Force 2: Maintenance burden. Software does not stay built. It requires ongoing updates for security patches, dependency changes, infrastructure migration, feature additions, and bug fixes. A SaaS vendor amortized this maintenance cost across thousands of customers. A single company building its own tool bore the full maintenance burden alone. This ongoing cost was often larger than the initial build cost, and it was the primary reason companies stayed on SaaS platforms even when they were unhappy with them. Switching costs were high not because of data lock-in (though that existed too), but because of the maintenance obligation that came with self-built alternatives.
Force 3: Workflow knowledge. The most underappreciated force. A CRM vendor did not just provide software. It provided an encoded opinion about how customer relationships should be managed. Salesforce's data model (Leads, Contacts, Accounts, Opportunities) was not just a database schema. It was a workflow philosophy. Companies that adopted Salesforce were not just buying software; they were buying a pre-packaged answer to the question "how should we structure our sales process?" This was valuable because most companies did not have strong opinions about workflow design. They wanted someone else to figure it out. The SaaS vendor served as a workflow consultant embedded in software form.
Each of these three forces pushed in the same direction: toward buying pre-built pipes rather than constructing custom ones. And for roughly twenty years (2004 to 2024), this equilibrium held. The SaaS market grew from near-zero to $358 billion. Venture capital funded thousands of new pipes every year. The entire technology economy organized itself around the production and consumption of specialized software pipes.
Then LLM inference arrived and undermined all three forces simultaneously.
3. What Changed: The Economics of the Build-Buy Equation
The build-buy equation has always been the fundamental calculation that determines whether a company purchases software or constructs it internally. For two decades, the equation heavily favored buying. Here is what changed.
Implementation complexity collapsed. An LLM can generate functional software from a natural language description. This is not a theoretical capability. It is an operational reality. Claude Code, Anthropic's AI coding agent, hit $2.5 billion ARR by February 2026, doubling since the start of the year - SaaStr. GitHub Copilot, Cursor, Windsurf, and similar tools have demonstrated that the marginal cost of writing code has dropped by an order of magnitude. 92% of US developers now use AI coding tools daily, with AI writing 41% of all code - Index.dev. The scarce resource was implementation knowledge. That resource is no longer scarce. As we explored in how to build products with AI, a solo founder in 2026 can build systems that would have required a 10-person engineering team in 2020.
Maintenance burden shifted. When an LLM can generate code, it can also modify, debug, and update code. The ongoing maintenance cost of self-built software drops proportionally to the drop in implementation cost. A security vulnerability that would have required a senior engineer to diagnose and patch can now be identified and fixed by an AI coding agent. Dependency updates, refactoring, and feature additions follow the same pattern. The maintenance argument for SaaS, that vendors amortized ongoing costs across customers, loses its force when the cost being amortized approaches zero.
Workflow knowledge became accessible. This is the most profound change and the least discussed. LLMs have been trained on the collective workflow knowledge of the entire internet. They understand CRM design patterns, marketing automation workflows, project management methodologies, and thousands of other domain-specific workflow structures. When a founder tells Claude "build me a CRM that tracks leads through our specific sales process," the LLM brings workflow knowledge that previously only existed inside SaaS products or expensive consultants. The encoded opinion that Salesforce sold, the pre-packaged answer to "how should we structure our sales process," is now embedded in the LLM itself. The SaaS vendor's role as workflow consultant is being absorbed into the model.
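To make the inversion of the build-buy equation concrete, here is a minimal cost sketch in Python. Every figure except the $25/user/month Salesforce price cited earlier is an illustrative assumption, not sourced data:

```python
# Illustrative build-vs-buy breakeven model. All figures except the
# $25/user/month SaaS price are hypothetical assumptions.

def total_cost_of_ownership(build_cost, annual_maintenance, years):
    """Cumulative cost of building and maintaining a tool in-house."""
    return build_cost + annual_maintenance * years

def saas_cost(seats, price_per_seat_month, years):
    """Cumulative cost of an equivalent SaaS subscription."""
    return seats * price_per_seat_month * 12 * years

YEARS = 3
SEATS = 50

# 2015-era assumptions: a custom CRM needs an engineering team for months.
build_2015 = total_cost_of_ownership(500_000, 150_000, YEARS)

# AI-assisted assumptions: a solo builder plus coding-agent spend.
build_ai = total_cost_of_ownership(10_000, 3_000, YEARS)

buy = saas_cost(SEATS, 25, YEARS)  # $25/user/month

print(f"Buy SaaS, 3 years:   ${buy:,}")        # $45,000
print(f"Build (2015 cost):   ${build_2015:,}") # $950,000
print(f"Build (AI-assisted): ${build_ai:,}")   # $19,000
```

Under the 2015 assumptions, buying wins by a factor of twenty; under the AI-assisted assumptions, building wins. The exact numbers matter less than the sign flip.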
The Retool 2026 Build vs. Buy Report quantifies this shift with hard data from 817 builders surveyed: 35% of teams have already replaced at least one SaaS tool with a custom build, 78% expect to build more custom tools in 2026, and 51% have shipped production software using AI - Retool. The tools most at risk of replacement: workflow automations (35%), internal admin tools (33%), BI tools (29%), CRMs and form builders (25%), and project management tools (23%).
Real examples are already appearing. ClickUp built six AI tools connected to Salesforce, Zendesk, and Snowflake that automated hundreds of weekly work hours, saving $200K per year in automation software spending alone. Harmonic replaced a $20,000/year third-party tool by rebuilding internally. A former Amazon executive built a complete CRM system in one weekend using AI - Yahoo Finance.
4. The Big Pipe: LLM Inference as Universal Solvent
Every software pipe moves data. That is all it does. A CRM pipe takes raw customer interaction data and moves it into structured relationship records. An email marketing pipe takes content and audience segments and moves them into delivered messages. An analytics pipe takes raw event data and moves it into dashboards and insights. Each pipe requires the data to arrive in a specific format, through specific connectors, following specific rules. Pour the wrong data into the wrong pipe and nothing works.
LLM inference is a universal pipe. It accepts data in any format, any structure, any language, and processes it according to whatever instruction you give it at runtime. There is no schema to configure, no field mapping to set up, no integration to build. The data flows in, the inference layer understands it, transforms it, enriches it, reorganizes it, and routes it. All natively. All inside the pipe. This is structurally different from every software product that came before it.
Consider the implications. A traditional SaaS product is a pipe with a fixed diameter and fixed routing. Salesforce can process customer data through its specific workflow patterns. It cannot, without significant configuration and customization, process customer data through your specific workflow patterns if they differ from Salesforce's model. The pipe's shape is the product.
LLM inference is a pipe with variable diameter and variable routing. It can process any data through any workflow pattern, because the workflow pattern is specified in the prompt rather than encoded in the software architecture. The pipe's shape is determined by the user, not the vendor. This is the fundamental structural difference.
This is why the consolidation is happening. LLMs are not competing with SaaS products category by category. They are dissolving the category boundaries themselves. When you can describe any workflow in natural language and have it executed, the concept of a "CRM" as a distinct product category becomes arbitrary. A CRM is just a particular prompt pattern applied to customer data. An email marketing tool is just a particular prompt pattern applied to content and audience data. The distinction between these categories was an artifact of implementation complexity. Remove the complexity, and the categories dissolve.
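That claim can be made literal in a few lines. The sketch below uses a stubbed `call_llm` function as a stand-in for any real inference API (a hypothetical placeholder, not a specific product); the point is that the shape of the "pipe" is defined entirely by the instruction string, not by any schema:

```python
# Sketch of "the workflow pattern is specified in the prompt": the same
# unstructured data becomes a CRM record or an outreach email depending
# only on the instruction. `call_llm` is a hypothetical placeholder for
# any inference endpoint.

def call_llm(instruction: str, data: str) -> str:
    # Placeholder: a real implementation would call an LLM API here.
    return f"[model output for: {instruction!r}]"

raw_note = (
    "Met Dana Liu (CTO, Acme) at the conference. Interested in the "
    "enterprise tier, asked for a security review, follow up in May."
)

# A "CRM pipe" is just one prompt pattern over the data...
crm_record = call_llm(
    "Extract a lead record as JSON with fields "
    "name, company, role, stage, next_action.", raw_note)

# ...and an "email marketing pipe" is another, over the same data.
followup_email = call_llm(
    "Draft a short, personal follow-up email to this contact.", raw_note)

print(crm_record)
print(followup_email)
```

No field mapping, no connector, no pre-defined object hierarchy: the workflow lives in the instruction, which is the structural difference the section describes.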
And the market is not resisting this. The market has been waiting for exactly this. Every company that ever complained about integration hell, about paying for 10 different tools across departments, about spending months on SaaS implementations, or about bending its processes around someone else's data model was waiting for a single pipe that could handle everything. The convergence is happening this fast because it is what the entire demand side of the market actually wants.
5. The Conformity Tax: The Hidden Cost Nobody Talked About
Every SaaS product imposes a hidden cost on its users that has been systematically undervalued by the market: the conformity tax.
When you adopt Salesforce, you do not just adopt software. You adopt Salesforce's data model. You adopt its workflow assumptions. You adopt its object hierarchy (Leads, Contacts, Accounts, Opportunities, Cases). You adopt its reporting structure. You adopt its API patterns. You adopt its permission model. You conform your business processes to fit the software, rather than the software conforming to fit your processes.
This conformity tax is enormous, but it was invisible for decades because there was no alternative. If building a custom CRM cost $500,000 and buying Salesforce cost $300 per month, the rational choice was to pay the conformity tax. It was cheaper to bend your business around the software than to build software that bent around your business.
The conformity tax manifests in several concrete ways:
Workflow distortion. Companies restructure their internal processes to match the software's assumptions. A sales team that naturally operates with a different pipeline structure forces itself into Salesforce's pipeline model. A marketing team that thinks about campaigns differently forces itself into HubSpot's campaign model. The software shapes the business rather than the business shaping the software. Over time, companies lose track of which processes are genuinely optimal and which are artifacts of tool conformity.
Integration overhead. Because each SaaS pipe has its own data model, moving data between pipes requires integration layers (Zapier, Make, custom API middleware). These integration layers exist solely because different pipes have different shapes. The average mid-market company uses 130+ SaaS applications - Productiv. Integrating these applications is itself a multi-billion-dollar industry (iPaaS, integration platforms, middleware vendors). The entire integration industry is a tax on the conformity tax.
Innovation friction. When a company wants to try a new workflow or process, it first has to ask: "Does our software support this?" If the answer is no, the company either abandons the innovation, pays for custom development, or switches to a different tool (incurring switching costs). The software becomes a constraint on organizational evolution. Companies optimize for what their tools can do rather than for what they should do.
Data model lock-in. Each SaaS product stores data in its own schema. Switching from Salesforce to HubSpot requires a data migration that is complex, error-prone, and often lossy. Years of accumulated data are encoded in a vendor-specific format. This creates genuine lock-in that persists even when the software no longer serves the company's needs.
Now consider what happens when the build cost collapses to near-zero. The conformity tax, which was previously acceptable because the alternative was expensive, becomes the dominant cost of using SaaS software. You are no longer paying $300/month for software. You are paying $300/month for the privilege of distorting your business processes, maintaining unnecessary integrations, constraining your innovation capacity, and locking your data into someone else's schema. That value proposition does not survive scrutiny when building a custom alternative costs less than a month's subscription.
This is not a theoretical argument. It is happening in practice. The rise of AI agent platforms and coding agents is enabling companies to build exact-fit solutions that impose zero conformity tax.
6. The Market Evidence: SaaS vs AI in Hard Numbers
The Big Pipe thesis is not speculative. The evidence is visible in public market data, and the divergence between SaaS stocks and AI stocks is the clearest proof.
The Great Divergence: S&P 500 With and Without AI
The Magnificent 7 (Apple, Microsoft, Alphabet, Amazon, Meta, Nvidia, Tesla) represent 32.5% of the S&P 500 by market cap. In 2024, they contributed roughly 42% of the index's total return - Yardeni Research. The market is not broadly rising. A handful of AI-adjacent companies are dragging the index upward while the rest of the market lags.
The equal-weight S&P 500 (RSP), which gives each stock equal influence, tells the real story. In 2024, cap-weighted SPY returned +25% while equal-weight RSP returned only +12.8%, a 12.2 percentage point gap - ETF Database. In 2026, this reversed as AI names corrected: equal-weight is up 5.5% while cap-weighted is down 0.2% - Seeking Alpha.
{
"title": "S&P 500: Cap-Weighted vs Equal-Weight Annual Returns",
"subtitle": "The gap reveals how AI concentration distorts broad market perception",
"type": "line",
"xKey": "year",
"yKeys": [
{"key": "capWeighted", "label": "S&P 500 (Cap-Weighted)"},
{"key": "equalWeight", "label": "S&P 500 (Equal-Weight)"}
],
"data": [
{"year": "2024", "capWeighted": 25.0, "equalWeight": 12.8},
{"year": "2025", "capWeighted": 17.9, "equalWeight": 11.2},
{"year": "2026 YTD", "capWeighted": -0.2, "equalWeight": 5.5}
],
"yAxisFormatter": "percent",
"source": "Yardeni Research, ETF Database, Invesco",
"sourceUrl": "https://www.yardeniquicktakes.com/s-p-500-with-without-the-magnificent-7/"
}
Software ETFs: The Collapse
The iShares Expanded Tech-Software ETF (IGV), which tracks major software companies, is down 22% year-to-date in 2026 and has fallen roughly 30% from its September 2025 peak. Its forward P/E compressed from 35x to 20x in months, a valuation reset not seen since the 2008 financial crisis - Financial Charts. The WisdomTree Cloud Computing ETF (WCLD) is down 19.3% YTD, with SaaS-heavy cloud companies bearing the brunt - Stock Analysis.
For the first time in history, software now trades at a discount to the S&P 500. The median EV/Revenue multiple for public SaaS companies collapsed to 3.4x in March 2026, down from a pandemic peak of 18-19x - SaaStr.
{
"title": "Software ETF Performance: The SaaSpocalypse in Numbers",
"subtitle": "Annual returns for major software ETFs vs the S&P 500",
"type": "line",
"xKey": "year",
"yKeys": [
{"key": "sp500", "label": "S&P 500"},
{"key": "igv", "label": "IGV (Software ETF)"},
{"key": "wcld", "label": "WCLD (Cloud ETF)"}
],
"data": [
{"year": "2024", "sp500": 25.0, "igv": 23.4, "wcld": 5.2},
{"year": "2025", "sp500": 17.9, "igv": 5.6, "wcld": -6.5},
{"year": "2026 YTD", "sp500": -0.2, "igv": -22.0, "wcld": -19.3}
],
"yAxisFormatter": "percent",
"source": "Yahoo Finance, StockAnalysis, Financial Charts",
"sourceUrl": "https://www.financecharts.com/etfs/IGV/performance"
}
Meanwhile, AI infrastructure is experiencing explosive growth. Nvidia's revenue surged from $60.9 billion (FY2024) to $215.9 billion (FY2026), with data center revenue alone growing 409% in a single year - Nvidia Investor Relations. Anthropic's ARR exploded from $1 billion at end of 2024 to $30 billion by March 2026, a 1,400% year-over-year growth rate that surpassed OpenAI for the first time - Anthropic.
The venture capital flows confirm the structural shift. In 2025, AI startups captured 61% of all global VC funding ($258.7 billion), up from 33% in 2024 - Crunchbase. The smart money is following the Big Pipe.
The Catalyst: February 3, 2026
On a single day, February 3, 2026, approximately $285 billion in market capitalization was wiped from software stocks. The catalyst: Anthropic's launch of Claude Cowork, which demonstrated that AI agents could autonomously handle complex knowledge work that previously required teams of SaaS-equipped humans - Bloomberg. The market immediately understood the implication: if AI agents can do the work, you need fewer human workers, and if you need fewer human workers, you need fewer software seats. The entire per-seat pricing model of SaaS was called into question in a single trading session.
Jeffrey Favuzza of the Jefferies equity trading desk summarized it bluntly: "The draconian view is that software will be the next print media or department stores, in terms of their prospects" - AI 2 Work.
7. The SaaSpocalypse: Company-by-Company Breakdown
The decline is not evenly distributed. Companies whose value proposition is most easily replicated by LLM inference are experiencing the sharpest drops.
{
"title": "SaaS Stock Decline From All-Time Highs",
"subtitle": "Percentage decline from peak price to April 2026",
"type": "bar",
"xKey": "company",
"yKeys": [{"key": "decline", "label": "Decline from ATH"}],
"data": [
{"company": "UiPath", "decline": -89},
{"company": "Zoom", "decline": -87},
{"company": "HubSpot", "decline": -75},
{"company": "Twilio", "decline": -71},
{"company": "Salesforce", "decline": -42},
{"company": "Datadog", "decline": -37},
{"company": "Adobe", "decline": -35},
{"company": "ServiceNow", "decline": -28}
],
"yAxisFormatter": "percent",
"source": "Yahoo Finance, MacroTrends, StockAnalysis",
"sourceUrl": "https://finance.yahoo.com"
}
UiPath (PATH): The poster child of the collapse. Peak price of ~$90 in 2021, now trading at $9.99, a decline of 89%. UiPath cut its FY2025 ARR guidance from $1.725B to $1.660B and saw its stock drop 34% in a single day on May 29, 2024. The core RPA business model, automating repetitive UI interactions with scripted macros, is being directly replaced by LLM agents that can understand and interact with any interface using natural language - Motley Fool. We analyzed this structural decline in the decline of RPA and RPA vs agentic automation.
Salesforce (CRM): Down 42% from its 52-week high of $296 to $171, hitting new 52-week lows in April 2026. The company was labeled an "AI loser" by analysts. Salesforce Ben ran the headline: "AI pummels 'dying' SaaS market" - Salesforce Ben. Revenue growth slowed to 11%, and despite Agentforce reaching a $540M run rate with 330% adoption growth, the market remains skeptical about whether AI features can offset seat compression - TipRanks.
HubSpot (HUBS): Down 74.5% from its all-time high of $852 to $217. The one-year total shareholder return is -62.75%. Revenue growth decelerated to 16% in constant currency, billings growth slowed from 20% to 19%, and the company reported "no AI boost in 2025." Weak sales guidance triggered a 13%+ single-day drop - Motley Fool.
Zoom (ZM): Down 87% from its pandemic peak of $589. Five-year return: -78%. The collaboration tool category was already commoditizing before AI accelerated it.
The RPA category is the most instructive because it was the most obviously replaceable. RPA automated repetitive UI interactions, and LLMs can do the same thing with natural language instructions rather than brittle scripted macros. UiPath's stock tells the story, but the broader market agrees: Blue Prism's mindshare fell from 20.2% to 15.9% in a single year, and Microsoft Power Automate has overtaken it as the #3 player.
The "Seat Compression" Mechanism
The core threat to SaaS is not that the software is bad. It is that the pricing model is structurally incompatible with AI productivity gains. Bain & Company estimates that AI could automate 30-50% of activity across high-volume, digitized workflows - Bain. The math is simple: if 10 AI agents can do the work of 100 sales reps, you do not need 100 Salesforce seats anymore. Seat revenue drops 90% despite identical work output. This is why the agent economy represents a fundamentally different value framework for software.
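The seat-compression arithmetic is worth working through explicitly. The sketch below assumes an illustrative $150/seat/month price (my assumption, not a figure from any vendor):

```python
# Worked version of the seat-compression arithmetic: per-seat revenue
# collapses even when work output is unchanged. The $150/seat/month
# price is an illustrative assumption.

def annual_seat_revenue(seats: int, price_per_seat_month: float) -> float:
    return seats * price_per_seat_month * 12

before = annual_seat_revenue(100, 150)  # 100 reps on seats
after = annual_seat_revenue(10, 150)    # 10 humans directing AI agents

print(f"Annual seat revenue before: ${before:,.0f}")  # $180,000
print(f"Annual seat revenue after:  ${after:,.0f}")   # $18,000
print(f"Decline: {1 - after / before:.0%}")           # 90%
```

The decline is independent of the per-seat price: it falls straight out of the headcount ratio, which is why per-seat vendors cannot price their way out of it.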
Gartner projects that by 2030, 35% of point-product SaaS tools will be replaced by AI agents, and 40% of enterprise SaaS spend will shift to usage, agent, or outcome-based pricing - Gartner.
The SaaS IPO Freeze
In 2026, not a single venture-backed SaaS unicorn has filed for an IPO. The overall US IPO market is actually up 47% in Q1 2026, but SaaS debuts are completely absent - Crunchbase. The sector is described as an "unfriendly scene for companies with business models viewed as vulnerable to AI-driven displacement." The contrast with AI companies is stark: both Anthropic and OpenAI are planning IPOs in 2026-2027, with Anthropic targeting an IPO at $380B+ and OpenAI at a potential $1T valuation - CNBC.
8. The 100x Factor: Why the Shift is Larger Than People Think
Most analysis of AI's impact on software development claims a 10x productivity improvement. This is conservative to the point of being misleading. The actual factor, for greenfield development, is closer to 100x. And the reason is simpler than most people think.
The 100x is not a multiplication of individual phase improvements. It is not "5x coding speed times 5x architecture speed times 3x debugging speed." That kind of decomposition misses the point entirely. The 100x comes from one thing: AI does all the grunt work, and it does it in parallel.
A traditional software developer does not spend most of their time thinking. They spend most of their time on mechanical labor: writing boilerplate, setting up project scaffolding, configuring build tools, wiring up database connections, writing CRUD endpoints, setting up authentication flows, writing CSS, debugging typos, handling edge cases, writing tests, configuring deployment pipelines. This is not creative work. It is grunt work. And a Stripe study found that developers spend only 32% of their time actually writing code at all, with the rest consumed by meetings, coordination, and context-switching.
An AI coding agent like Claude Code does all of that grunt work. Architecture, code generation, debugging, testing, deployment configuration. It is not that each of these is individually faster. It is that the entire mass of mechanical implementation work, which used to consume weeks or months of a developer's time (or an entire team's time), is now handled by the AI in hours. The developer's job shifts from doing the work to directing the work. You describe what you want. The AI builds it. You review and adjust. The AI iterates.
And critically, it parallelizes. A traditional team of 10 engineers coordinates through meetings, ticket systems, code reviews, and handoffs. That coordination overhead is enormous. With AI agents, you can have multiple agents working on different parts of the system simultaneously, with zero coordination cost. No stand-ups. No merge conflicts from miscommunication. No waiting for someone else's PR review. The work that required 10 people working sequentially (with coordination tax) is done by one person directing parallel AI agents.
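The director-plus-parallel-agents pattern can be sketched in a few lines. Here `run_agent` is a stub standing in for a real coding-agent invocation (an assumption for illustration, not an actual agent API):

```python
# Sketch of parallel agent dispatch with no coordination overhead:
# one director fans independent work items out to concurrent "agents".
# `run_agent` is a placeholder for a real coding-agent call.

from concurrent.futures import ThreadPoolExecutor

def run_agent(task: str) -> str:
    # Placeholder for an AI agent executing one unit of grunt work.
    return f"done: {task}"

tasks = [
    "scaffold the project",
    "generate CRUD endpoints",
    "wire up authentication",
    "write the test suite",
    "configure deployment",
]

# No stand-ups, no handoffs: every task runs concurrently and the
# director only reviews the merged results.
with ThreadPoolExecutor(max_workers=len(tasks)) as pool:
    results = list(pool.map(run_agent, tasks))

for r in results:
    print(r)
```

The structural point is that the work items share no state, so the coordination tax a human team pays (meetings, tickets, PR queues) simply has nowhere to accrue.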
As noted above, 92% of US developers use AI coding tools daily and AI writes 41% of all code - Index.dev. GitHub reports developers completing tasks 20-55% faster with Copilot alone - GitHub Research. But these studies measure incremental improvements on existing workflows. They do not capture what happens when someone builds an entire system from scratch using AI as the primary builder. That is where the 100x manifests.
A rigorous METR study found that experienced developers were actually 19% slower with AI tools on familiar codebases - METR. This makes sense. If you already know every line of a codebase, AI assistance adds overhead rather than removing it. But the build-buy equation is not about modifying familiar code. It is about greenfield development: building new systems from scratch. That is where the grunt work is largest, where parallelization matters most, and where the 100x shows up in practice.
9. What the Big Pipe Cannot Replace
The Big Pipe thesis has limits. Identifying those limits is as important as understanding the thesis itself, because the limits define where value will concentrate in the post-consolidation landscape.
Network effects are immune. LinkedIn's value is not its software. A solo developer with Claude Code could build a professional networking platform with identical features in a week. But that platform would have zero users. LinkedIn's value is that 1 billion professionals are already on it - LinkedIn. The Big Pipe can replicate the software, but it cannot replicate the network.
Regulated trust boundaries resist. Healthcare software (Epic, Cerner), financial software (Bloomberg Terminal, core banking systems), and legal technology exist not primarily because the software is complex, but because the certification, compliance, and audit infrastructure surrounding them is complex. A hospital cannot replace Epic with a custom-built EHR system, even if the custom system is technically superior, because the custom system has not been certified, validated, audited, and approved by regulators. LLMs can generate compliant code, but they cannot generate FDA clearance, SOC 2 attestation, or HIPAA compliance certification.
Bain & Company provides a useful framework for this, categorizing SaaS into four vulnerability quadrants - Bain:
| Quadrant | Characteristics | Examples | Risk Level |
|---|---|---|---|
| Core Strongholds | Regulatory protection, human judgment | Procore cost accounting, Medidata clinical trials | Low |
| Gold Mines | Proprietary data, high automation | Cursor AI code editor, Guidewire claims | Medium-Low |
| Open Doors | Low automation, high AI penetration | HubSpot list building, Monday.com task boards | High |
| Battlegrounds | High automation, direct AI competition | Intercom support, Tipalti invoicing, ADP time-entry | Very High |
Real-time multi-party coordination is structurally different. Figma's value is not that it is a design tool. It is that multiple designers can simultaneously edit the same canvas with sub-100ms latency. This requires distributed systems architecture (CRDTs, operational transformation, real-time WebSocket infrastructure) that is fundamentally different from the request-response pattern of LLM inference. The Big Pipe is excellent at serving one user's intent. Coordinating many users' intents simultaneously is a different computational problem.
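The convergence problem that makes multi-party coordination structurally different can be glimpsed in miniature with a CRDT. Below is a minimal sketch of a grow-only counter (G-Counter), one of the simplest CRDTs: each replica increments only its own slot, and merging takes the element-wise maximum, so replicas converge no matter what order sync messages arrive in. This is illustrative only; real collaborative editors like Figma use far richer structures, and the class and variable names here are invented for the example.

```python
class GCounter:
    """Grow-only counter CRDT: each replica owns one slot; merge is element-wise max."""

    def __init__(self, replica_id: str):
        self.replica_id = replica_id
        self.counts: dict[str, int] = {}

    def increment(self, n: int = 1) -> None:
        # A replica only ever mutates its own slot.
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + n

    def merge(self, other: "GCounter") -> None:
        # Merge is commutative, associative, and idempotent, so replicas
        # converge regardless of message ordering or duplication.
        for rid, count in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), count)

    def value(self) -> int:
        return sum(self.counts.values())


# Two replicas edit concurrently, then sync in either order.
a, b = GCounter("a"), GCounter("b")
a.increment(3)
b.increment(2)
a.merge(b)
b.merge(a)
assert a.value() == b.value() == 5
```

The point is that correctness here comes from algebraic properties of the data structure, not from a model's inference over a prompt: it is a genuinely different computational discipline.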
Physical-world integration has a hard boundary. The Big Pipe operates in the digital domain. It can generate code to control a robot, but it cannot be the robot. Any product whose value depends on physical-world interaction has a dimension the Big Pipe cannot absorb.
Proprietary data assets retain value. Bloomberg Terminal's software is replaceable. Bloomberg's data (decades of financial history, proprietary analytics, and real-time market feeds) is not. Any product that has accumulated a proprietary dataset that cannot be reproduced by training an LLM retains value independent of the software that wraps it.
10. The New Scarcity: Where Value Concentrates Now
If the Big Pipe dissolves the scarcity of implementation, workflow knowledge, and maintenance capability, then where does scarcity, and therefore value, migrate?
Scarcity 1: Taste and judgment. When building is free, the constraint is no longer "can we build this?" but "should we build this, and in what form?" The ability to make good product decisions, to know what to build, how it should work, and what the user actually needs, becomes the scarce resource. The solo founder who can make 50 good product decisions per day and execute them instantly through AI tools is more valuable than the 50-person team that makes 5 product decisions per week through committee.
Scarcity 2: Distribution and attention. When everyone can build software, the constraint shifts from production to distribution. The ability to get your product in front of users becomes relatively more valuable than the ability to build it. This is why marketing, brand, community, and distribution channels become disproportionately important in a Big Pipe world.
Scarcity 3: Trust and credibility. In a world where anyone can build anything, the question "should I trust this?" becomes critical. Established brands, verified track records, and institutional credibility become relatively more valuable. Trust is built slowly and cannot be generated by AI.
Scarcity 4: Compute infrastructure itself. The Big Pipe requires massive computational infrastructure. Nvidia's market capitalization exceeded $4.4 trillion for a reason: the entire software industry's value is flowing toward the compute infrastructure that powers the Big Pipe - CompaniesMarketCap. Hyperscaler AI capex is approaching $660-690 billion combined for 2026, nearly doubling from 2025 - Futurum Group.
{
"title": "Hyperscaler AI Infrastructure Spending (2026 Projections)",
"subtitle": "Combined capex approaching $700 billion, approximately 75% directed at AI",
"type": "bar",
"xKey": "company",
"yKeys": [{"key": "capex", "label": "2026 Capex ($B)"}],
"data": [
{"company": "Amazon", "capex": 200},
{"company": "Google", "capex": 180},
{"company": "Microsoft", "capex": 120},
{"company": "Meta", "capex": 125},
{"company": "Oracle", "capex": 50}
],
"yAxisFormatter": "billions",
"source": "CNBC, Yahoo Finance, Futurum Group",
"sourceUrl": "https://www.cnbc.com/2026/02/06/google-microsoft-meta-amazon-ai-cash.html"
}
As we covered in the AI factory model, compute infrastructure is becoming the foundational layer of the entire digital economy.
Scarcity 5: Frontier model capability. Not all LLMs are equal. The difference between a frontier model and a commodity model is substantial. Anthropic's $380 billion valuation and OpenAI's $852 billion valuation reflect the market's assessment of this scarcity - Anthropic, CNBC. We analyzed the Anthropic ecosystem in depth in the Anthropic ecosystem guide.
{
"title": "AI Company Valuations vs SaaS Leaders (April 2026)",
"subtitle": "Private AI companies now exceed the valuations of entire SaaS categories",
"type": "bar",
"xKey": "company",
"yKeys": [{"key": "valuation", "label": "Valuation ($B)"}],
"data": [
{"company": "OpenAI", "valuation": 852},
{"company": "Anthropic", "valuation": 380},
{"company": "Salesforce", "valuation": 165},
{"company": "ServiceNow", "valuation": 109},
{"company": "HubSpot", "valuation": 11},
{"company": "UiPath", "valuation": 5}
],
"yAxisFormatter": "billions",
"source": "CNBC, Yahoo Finance, CompaniesMarketCap"
}
11. Second-Order Effects: What Happens After the Pipes Collapse
The collapse of specialized software pipes creates second-order effects that are not immediately obvious but are potentially more transformative than the primary effect.
The integration industry disappears. If you do not have 130 separate SaaS tools, you do not need Zapier, Make, or MuleSoft to connect them. The entire integration platform category exists solely as a consequence of pipe proliferation. When pipes consolidate, the need for inter-pipe plumbing vanishes.
The implementation consulting industry contracts. Accenture, Deloitte, and hundreds of smaller consultancies generate significant revenue from implementing and customizing SaaS products. Salesforce implementation consulting alone is an estimated $20+ billion market. When companies build custom solutions instead of implementing packaged software, the need for implementation consultants diminishes proportionally.
Pricing models break. SaaS pricing is based on the premise that the vendor provides ongoing value through maintenance, updates, and hosting. Per-seat pricing works when the software is complex and the alternative is expensive. But when the alternative is free (build it yourself), per-seat pricing becomes difficult to justify. 83% of AI-native SaaS companies currently offer usage-based pricing - Deloitte. This shift from subscription to consumption pricing is not a trend; it is a structural consequence of the Big Pipe. As explored in our analysis of AI agents pricing, you do not subscribe to capabilities. You consume them.
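The subscription-to-consumption shift can be made concrete with back-of-envelope arithmetic. Every figure below (seat count, seat price, invocation volume, token price) is an illustrative assumption, not a quote from any vendor:

```python
# Illustrative comparison: per-seat SaaS vs. usage-based inference pricing.
# All prices and volumes are hypothetical assumptions for the arithmetic.

seats = 50
seat_price_per_month = 60.0           # assumed per-seat SaaS price ($)
saas_monthly = seats * seat_price_per_month

invocations_per_month = 40_000        # assumed capability invocations
tokens_per_invocation = 2_000         # assumed prompt + completion tokens
price_per_million_tokens = 3.0        # assumed blended inference price ($)

inference_monthly = (
    invocations_per_month * tokens_per_invocation / 1_000_000
) * price_per_million_tokens

print(f"Per-seat SaaS:  ${saas_monthly:,.0f}/month")       # $3,000/month
print(f"Usage-based AI: ${inference_monthly:,.0f}/month")  # $240/month
```

Under these assumptions, consumption pricing is an order of magnitude cheaper at this volume. The deeper point is structural: the usage-based bill scales with value actually consumed, while the seat-based bill scales with headcount, and the crossover point depends entirely on the assumed figures.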
The venture capital model for software changes. Traditional SaaS venture capital relied on a playbook: fund a product, achieve product-market fit, grow through sales, reach $100M ARR, and IPO. This playbook assumes that the product's value is durable because the alternative (building it yourself) is expensive. When the alternative becomes cheap, the product's value is ephemeral. VC is already responding: in 2025, AI startups raised at 40% higher valuations than peers at Series A, while SaaS startups' share of total VC funding continues to shrink - Eqvista, Carta.
The talent market restructures. The market shifts from valuing implementation skill (the ability to write code) to valuing architecture and judgment (the ability to decide what code should be written and why). Junior and mid-level implementation roles face the strongest pressure. Senior roles that involve system design, technical judgment, and cross-domain integration retain value because they involve the scarce resources (taste, judgment, institutional knowledge) that the Big Pipe does not replace.
12. The Infrastructure Layer: Who Owns the Big Pipe
If the Big Pipe is the central metaphor, the critical question is: who controls it? The answer has implications for market structure, competition, and power dynamics that mirror the most consequential platform shifts in technology history.
Currently, the Big Pipe is controlled by a small number of foundation model companies:
| Company | Key Model(s) | Valuation / Market Cap | Revenue (Annual) | Primary Advantage |
|---|---|---|---|---|
| Anthropic | Claude Opus, Sonnet, Haiku | $380B (private) | ~$30B ARR | Safety, coding, long context |
| OpenAI | GPT-5, o3 | $852B (private) | ~$24B | Consumer brand, ChatGPT distribution |
| Nvidia | GPU infrastructure | $4.45T (public) | $215.9B | Compute monopoly, CUDA ecosystem |
| Google DeepMind | Gemini 3 | Part of $2T Alphabet | Billions (cloud AI) | Vertical integration, data |
| Broadcom | Custom AI chips | ~$1T (public) | ~$96B est. | Custom silicon, $70B+ AI backlog |
Nvidia is the clearest winner. Revenue surged from $27 billion just two years ago to $215.9 billion in FY2026, with data center revenue alone growing triple digits. The company is now the world's most valuable, at $4.45 trillion - Nvidia IR.
Broadcom is the second infrastructure winner, with AI chip revenue growing 106% YoY and a backlog exceeding $70 billion. The CEO projects AI chip revenue alone will surpass $100 billion by 2027 - IndexBox.
The most likely market structure is an oligopoly with open-source competition, similar to cloud computing (AWS, Azure, and GCP dominate, but Kubernetes enables portability): three to five major providers own the Big Pipe, while open-source models (Llama, Mistral, Qwen) provide a floor of capability that prevents monopoly pricing and keeps competitive alternatives available.
13. Implications for Founders, Investors, and Operators
The a16z Counter-Narrative
Not everyone agrees that SaaS is dying. Andreessen Horowitz argues that "code has never been software's primary value source" and that competitive moats (network effects, proprietary data, brand, process power) matter more than the software itself. They predict there will be more software than ever, not less - a16z. They also draw a historical parallel: the February 2016 SaaS panic saw LinkedIn fall 44%, Tableau drop 50%, and Salesforce decline 13%. The sector recovered within months, and Microsoft acquired LinkedIn four months later for $26B.
The a16z argument has merit for certain categories. SaaS products with genuine network effects, proprietary data, or deep regulatory moats will survive and potentially thrive. But the argument misses the key structural difference: in 2016, the build alternative was still expensive. In 2026, it is not. The 2016 panic was a valuation correction within the same structural paradigm. The 2026 panic reflects a paradigm shift.
For Founders
The question is no longer "can I build this?" but "can I sustain value after I build it?" If your product is a workflow abstraction that the Big Pipe can replicate, your moat is temporary. You need at least one of: a network effect, proprietary data, regulatory certification, or deep institutional integration. If you have none of these, you are building a feature, not a company.
The positive side: the cost of experimentation has collapsed. You can test 10 product ideas in the time it used to take to test one. Speed of iteration becomes the competitive advantage. Move fast, test hypotheses, find the configuration of value that has a structural moat.
For Investors
Add a new dimension to SaaS evaluation: Big Pipe vulnerability. For any SaaS company, ask: "Could a reasonably skilled person with AI tools replicate this product's core value proposition in a week?" If the answer is yes, the company's valuation multiple should be lower than traditional models suggest.
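One way to operationalize that question is a rough scoring rubric over the moat categories this article identifies. Everything below, the factor names, the weights, and the scores, is a hypothetical illustration of the framework, not a validated valuation model:

```python
# Hypothetical "Big Pipe vulnerability" rubric. Factors and weights are
# illustrative assumptions drawn from the moat categories in this article.

MOAT_FACTORS = {
    "network_effect": 0.30,       # users are the product (LinkedIn)
    "proprietary_data": 0.25,     # irreplaceable dataset (Bloomberg)
    "regulatory_moat": 0.25,      # certification/audit barriers (Epic)
    "realtime_multiparty": 0.10,  # CRDT-class coordination (Figma)
    "physical_integration": 0.10, # hardware in the loop
}

def big_pipe_vulnerability(moats: dict[str, float]) -> float:
    """Return 0.0 (defensible) to 1.0 (replicable in a week).

    `moats` maps factor name -> moat strength in [0, 1].
    """
    protection = sum(
        weight * moats.get(factor, 0.0)
        for factor, weight in MOAT_FACTORS.items()
    )
    return round(1.0 - protection, 2)

# A pure workflow-abstraction tool with no moats scores maximally vulnerable;
# a regulated system of record with strong data assets scores much lower.
print(big_pipe_vulnerability({}))                      # 1.0
print(big_pipe_vulnerability({"proprietary_data": 1.0,
                              "regulatory_moat": 1.0}))  # 0.5
```

The output is only as good as the factor scores fed in, but forcing a portfolio through even a crude rubric like this makes the vulnerability dimension explicit rather than implicit.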
For Enterprise Operators
The optimal strategy is to selectively unbundle your SaaS stack. Start with tools imposing the highest conformity tax and lowest switching cost: internal dashboards, reporting tools, simple workflow automations, departmental point solutions. Build a small team (or single person) whose job is to build custom tools using AI. This team replaces the budget for 5-10 SaaS subscriptions while producing better-fitting solutions.
Deloitte reports that 57% of enterprises are allocating 21-50% of digital transformation budgets to AI automation, with 20% investing over 50% - Deloitte. The shift is already underway. As we explored in agentic business process automation, the economics of building vs. buying have shifted decisively.
14. The Philosophical Shift: From Products to Capabilities
The deepest implication of the Big Pipe is philosophical. It changes what software is.
For the entire history of the software industry, software has been a product. You bought it or subscribed to it. It had a name, a brand, a feature list, and a price. It was a thing.
In a Big Pipe world, software is not a product. It is a capability. You do not buy a CRM. You invoke the capability "manage customer relationships" through a prompt. You do not buy an email marketing tool. You invoke the capability "send targeted emails at scale." The capability exists as a potential within the Big Pipe. It is actualized when you describe what you need. It does not have a name or a brand. It is not a thing. It is a verb.
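The noun-to-verb shift can be sketched in a few lines. The function below frames an ad-hoc capability as an inference request; it builds the kind of message payload a chat-style LLM API would receive, without making any network call. The payload shape mirrors common chat-completion APIs, but the model id, prompt framing, and function name are all placeholders invented for this illustration:

```python
import json

def invoke_capability(capability: str, request: str) -> dict:
    """Frame an ad-hoc capability as an inference request.

    Illustrative only: constructs a chat-style message payload; the
    model id and prompt framing are assumptions, and no API is called.
    """
    return {
        "model": "frontier-model",  # placeholder, not a real model id
        "messages": [
            {"role": "system",
             "content": f"Act as the capability: {capability}. "
                        "Return structured output only."},
            {"role": "user", "content": request},
        ],
    }

# "Manage customer relationships" is not a product you install;
# it is a request you describe.
payload = invoke_capability(
    "manage customer relationships",
    "Summarize accounts at churn risk and draft a re-engagement email for each.",
)
print(json.dumps(payload, indent=2))
```

The same function invokes any capability: swap the two strings and the CRM becomes an email tool, a report generator, or an analyst. That interchangeability is the verb-ness the paragraph above describes.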
This shift from noun to verb, from product to capability, has profound implications for how we think about software value. A product has a price because it is scarce: someone built it, and you need to compensate them for building and maintaining it. A capability does not have a price in the same sense. It has a cost of invocation, the compute required to actualize it, but no product margin. The value capture model shifts from product margin (build once, sell many times) to infrastructure margin (provide the compute that enables all capabilities).
This is why foundation model companies are valued so highly relative to SaaS companies. They are not selling products. They are selling the substrate from which all software capabilities can be generated. They occupy a position analogous to electric utilities: they do not sell appliances, they sell the electricity that powers all appliances. And just as the electric utility is more valuable than any individual appliance manufacturer, the Big Pipe provider is more valuable than any individual SaaS company.
Dean Shahar, Managing Director at DTCP ($3B fund), captured this precisely: "The SaaS world is dying, not software itself, but SaaS as a business category. AI has turned software into a commodity where sustainable competitive advantage is nearly impossible" - Calcalist Tech.
15. Conclusion: The Shape of What Comes Next
The Big Pipe is not coming. It is here. The consolidation of specialized software pipes into general-purpose LLM inference is an ongoing structural transformation, not a future prediction. The evidence is in the $2 trillion erased from software market caps, the 89% decline in UiPath, the zero SaaS IPOs in 2026, and the $660+ billion in hyperscaler capex flowing to AI infrastructure.
What disappears: Pure workflow abstraction products (SaaS tools that exist solely because building was expensive), integration middleware (exists because pipes were separate), implementation consulting (exists because tools were complex), and the concept of software as a static product.
What survives: Network-effect businesses, regulated systems of record, real-time multi-party coordination tools, physical-world integrations, proprietary data assets, and established trust relationships.
What emerges: A capability-based software model where compute is the primary cost, taste and judgment are the primary differentiators, and distribution and trust are the primary moats. A world where solo builders and small teams can compete with large companies because the implementation barrier has been removed. A market structure where value concentrates at the infrastructure layer (compute providers, foundation model companies) and at the human layer (judgment, creativity, taste), with the traditional middle layer (packaged software products) hollowing out.
The right response to this shift depends on where you sit. If you are a SaaS founder, assess your Big Pipe vulnerability honestly. If you are an investor, update your valuation frameworks. If you are an enterprise operator, start selectively unbundling. If you are a builder, recognize that the constraint has moved from "can I build this?" to "should I build this, and will its value endure?"
The Big Pipe does not care about your product roadmap, your feature list, or your pricing tier. It cares about nothing, because it is infrastructure. And infrastructure does not compete with products. It makes them unnecessary.
Disclaimer: This analysis reflects the state of the AI and software markets as of April 2026. The field is evolving rapidly, and specific company valuations, revenue figures, and market dynamics referenced may change. Always verify current data before making investment or strategic decisions.
Written by Yuma Heymans, founder of o-mega.ai, the AI agent workforce platform. O-mega enables teams and individuals to deploy autonomous AI agents that replace entire software stacks with intelligent, adaptive workflows. Try O-mega free.