The New Implementation Economy
AI is making it faster & cheaper to implement new systems. In a world where friction is no longer a business model, a company's talent and product velocity win out. Hence the current "AI Talent Wars."
In my Substack piece from two weeks ago, “The Next Generation of Systems of Record,” I highlighted how AI tools – by doing work that inherently requires access to data – could replace legacy systems of record once they reach sufficient scale. But it’s no longer enough to be a System of Record.
In 2024, Anthropic introduced the Model Context Protocol (MCP), which acts as a universal connector for AI. MCP allows an AI model to interface with any application or database through a standardized protocol, without bespoke adapters. In practice, this means an AI integration layer can broker data in and out of systems in real-time.
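To make the “universal connector” idea concrete, here is a minimal sketch of what an MCP request looks like on the wire. MCP messages use JSON-RPC 2.0 framing, which is why any client and server that speak the protocol can exchange data without a bespoke adapter. The tool name `crm_lookup` and its arguments below are hypothetical stand-ins for whatever a given server actually exposes.

```python
import json

def mcp_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build an MCP `tools/call` request as a JSON-RPC 2.0 message.

    Because the envelope is standardized, the same client code can invoke
    a tool on any MCP server, regardless of the system behind it.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical CRM lookup exposed as an MCP tool:
msg = mcp_tool_call(1, "crm_lookup", {"account_id": "ACME-042"})
print(msg)
```

The key point is that the envelope, not the adapter, carries the integration: the AI model only needs to know which tools a server advertises, not how the underlying system is wired.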
In a world where data is portable between systems, the entire Implementation Economy will change. Implementation – the long, expensive, failure‑prone process of deploying, configuring, integrating, and maintaining enterprise software – has long been the largest obstacle for new market entrants trying to compete with incumbents. The web of custom workflows, data schemas, integrations, and institutional muscle memory wrapped around a system made the cost of switching untenably high.
The old adage in tech is that a new platform must be 10x better than the incumbent to displace it. But thanks to MCP and other AI tools, implementation costs are falling precipitously. This drastically changes the economics of switching systems, and requires each member of the Implementation Economy to examine how best to position themselves moving forward.
How Implementation Historically Worked
Large systems integrators and consultancies built massive businesses around integration complexity. Firms like Accenture, Deloitte, Infosys, and thousands of boutique Systems Integrators (SIs) specialized in translating messy real‑world business processes into rigid software configurations. The work included:
Customizing workflows, fields, and permissions
Migrating years (or decades) of historical data
Building and maintaining brittle point‑to‑point integrations
Training users and managing organizational change
Ongoing maintenance and reconfiguration as the business evolved
In practice, implementation costs frequently exceeded the cost of the software itself. It was not uncommon for total deployment costs to reach 7-8x the initial license price for complex systems.
As a result, once a system was live, enterprises became extremely reluctant to move. Even if the software was outdated, slow, or poorly aligned with evolving needs, switching felt existentially risky. This dynamic created several reinforcing layers of lock‑in:
Data Gravity: Enterprise systems accumulate irreplaceable historical data. Extracting it, cleaning it, mapping it into a new schema, and validating accuracy was painstaking manual work. Errors could cascade into compliance issues, reporting failures, or operational outages. The safest option was often to stay put.
Customization Debt: Over time, systems became deeply customized to mirror how the organization worked (or how it used to work). Recreating these customizations in a new platform was effectively a redevelopment project.
Integration Sprawl: No system exists in isolation. CRMs talk to ERPs, billing tools, data warehouses, and internal systems. Switching meant rebuilding every integration edge. Each one added friction, cost, and risk.
Human Inertia: Users build muscle memory. Finance teams know which buttons to click. Ops teams know which reports to trust. Change introduced productivity loss and internal resistance, which was often more powerful than budget constraints.
Together, these forces created one of the strongest moats in software history. Incumbents didn’t have to win on velocity or elegance. They just had to be “good enough” to avoid triggering a migration.
Why AI Breaks the Implementation Model
AI tools are now effective enough to solve for each of these layers of lock-in. And because AI is now more frequently selling work, not software (eating into labor spend in addition to tech spend), the number of software tools a given organization uses is bound to increase – it’s no longer just about CRMs, ERPs, and HCMs. Switching AI vendors will become more akin to firing a worker and hiring their replacement. This makes understanding implementation & switching costs that much more important for businesses. The following technological capabilities lower implementation friction:
Autonomous Implementation Agents: AI agents can now perform many of the repetitive tasks that once required teams of consultants: configuring fields, setting up rules, validating workflows, and continuously adjusting systems as requirements change. Software is increasingly becoming self‑configuring. Agents observe how systems are used, identify mismatches, and adapt configurations automatically.
Automated Data Mapping and Migration: Data migration has historically carried one of the largest switching costs. AI changes this because it understands semantic meaning. For example, models can now infer that “Client_Name,” “Customer,” and “Account Holder” represent the same underlying concept, even when field names differ across data sources. This enables automated schema translation, dramatically reducing manual mapping work. A human process that used to take months can now be compressed into days, or even hours.
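The mapping step itself is simple to illustrate. The toy sketch below substitutes a hand-written synonym table for the semantic judgment a real migration tool would get from an LLM or embedding model; every field name and canonical schema here is hypothetical, chosen only to mirror the “Client_Name” example above.

```python
# Toy illustration of semantic schema mapping. Production tools infer
# field equivalence with LLMs or embeddings; this sketch hard-codes the
# synonyms to show the translation step itself. All names are hypothetical.

CANONICAL_FIELDS = {
    "customer_name": {"client_name", "customer", "account_holder", "customer_name"},
    "email": {"email", "email_address", "contact_email"},
}

def map_schema(source_columns):
    """Map each legacy column onto a canonical field, or flag it for review."""
    mapping = {}
    for col in source_columns:
        key = col.strip().lower().replace(" ", "_")  # normalize the raw label
        match = next(
            (canon for canon, synonyms in CANONICAL_FIELDS.items() if key in synonyms),
            None,
        )
        mapping[col] = match or "NEEDS_REVIEW"
    return mapping

legacy = ["Client_Name", "Contact_Email", "Fax_Number"]
print(map_schema(legacy))
# {'Client_Name': 'customer_name', 'Contact_Email': 'email', 'Fax_Number': 'NEEDS_REVIEW'}
```

Note the escape hatch: anything the mapper cannot place is flagged rather than guessed, which is how automated migrations keep errors from cascading into the compliance and reporting failures described above.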
Synthetic Testing Environments: Testing new systems before go‑live was historically constrained by data access, privacy concerns, and manpower. AI‑generated synthetic data now allows companies to spin up realistic test environments and cover edge cases that mirror production behavior – without exposing sensitive information or consuming months of effort. This reduces one of the largest sources of migration risk: the unknown.
AI‑Native Training and Support: LLM‑based assistants can act as always‑on copilots embedded directly in workflows, helping reduce the cost of retraining users.
Intelligent Integration Layers: AI‑driven integration platforms can now discover, map, and maintain connections between systems with minimal human input. Protocols like MCP hint at a future where AI becomes the universal middleware layer, brokering data across tools dynamically.
In practice, these tools have the potential to automate the most expensive, error‑prone aspects of implementation.
Lower Switching Costs Change Everything
When you strip implementation friction away, the economics of software change. Switching costs don’t disappear entirely – especially at the enterprise level, evaluation processes, compliance requirements, and governance still matter – but they compress dramatically.
This has second‑order consequences.
Incumbent Software Vendors Lose Their Free Option: Historically, incumbents benefited from inertia. Renewal was the default. Churn required heroics. As AI lowers migration pain, retention must be earned continuously. Vendors will need to compete on product velocity, embedded automation, and ROI. AI features, copilots, and verticalized workflows for incumbents are becoming table stakes, but bolt-on AI might not be enough to maintain an advantage relative to AI natives.
Systems Integrators Are Forced Up the Stack: Large SIs will not disappear, but their labor‑heavy model is under pressure. Several core parts of their business, like configuration, migration, and integration – historically billed by the hour – will become less profitable. The value will shift toward change management at the organizational level, domain‑specific advisory, and ongoing optimization rather than one‑time deployment. In the near term, SIs may even see more migration activity as switching becomes feasible. Many will employ a “cost-plus” model, using the AI tools themselves and charging a markup; SIs provide a level of “CYA” that many enterprises like to have in place.
Buyer Evaluations Change: When switching becomes more feasible, buyers behave differently. They negotiate harder. They pilot more aggressively. We are already seeing enterprises use AI‑enabled data extraction as leverage in vendor negotiations: “Improve terms, or we move.” That threat was rarely credible before. At the same time, evaluation criteria are evolving. Evaluating software is no longer just a question of understanding a given vendor’s current capabilities; it requires benchmarking their future capabilities relative to potential competitors. If an enterprise signs a multi-year contract, it’s imperative that they pick the right vendor within that ecosystem, or they risk falling behind their own competitors who better optimize critical workflows. Enterprises are therefore evaluating teams and their potential in ways that were less relevant in previous economic cycles.
As implementation shifts from a front‑loaded, human‑intensive event to a continuous, software‑driven layer embedded inside products, the biggest winners will be the companies that move with the fastest, most consistent product velocity. This is why the “talent wars” are so aggressive in the AI era. Friction is no longer a business model – talent and speed win out.