SYBIL
CHAPTER X

The Customer

Start with the customer and work backwards.
Jeff Bezos, 2005

Bezos made this the first principle of Amazon. It became the first principle of Silicon Valley. It became, arguably, the first principle of the modern economy. Start with the customer. Understand what they want. Build backward from their need.

The principle assumes something so obvious it never needs stating: the customer is a person.

This assumption is the oldest in economic life, so old it predates the concept of an economy. When Mesopotamian farmers of the Ubaid period traded surplus barley for pottery around 5000 BCE, both sides of the exchange were human. When Uruk merchants developed cuneiform around 3100 BCE, they did so to track transactions between people. The writing system that would become the foundation of recorded history was invented to answer an economic question: who owes what to whom? The answer, on every tablet, was a person. When medieval guilds regulated the quality of bread, wool, and metalwork, they did so to protect human buyers. When Adam Smith described the butcher, the brewer, and the baker in 1776, each was serving a human customer. When Henry Ford built the assembly line, it was to make automobiles affordable for human families. The entire arc of economic history, from Mesopotamian grain markets to the Industrial Revolution to the digital age, rests on an unspoken axiom: the economy exists to serve people.

Even the digital revolution preserved this axiom. The personal computer was personal. The iPhone was designed around the human hand, the human eye, the human attention span. Google organized the world's information for human retrieval. Facebook connected human social graphs. Netflix recommended movies to human viewers. The entire internet was a machine built to serve humans faster, more cheaply, and more precisely than analog institutions could. The technology changed. The customer did not.

Nearly every product, service, and business that has ever existed was built to serve human needs and desires. The question was never who the customer is (that was settled) but what the customer wants. Market research studies humans. Advertising targets humans. Product design serves humans. UX is "user experience," and the user is always a person. The entire apparatus of commercial civilization is an answer to a single question: what do humans want, and how do we give it to them?

This assumption is breaking.

The fastest-growing category of economic activity is now serving AI systems, not humans. The most valuable new companies do not have human end-users. The largest capital expenditures in history are being made not to serve consumers but to serve models. The customer, the entity at the end of the value chain, the thing the economy ultimately optimizes for, is shifting.

The crossover is visible in the capital flows. In 2024, global investment into AI infrastructure totaled approximately $240 billion[1]. In 2025, it surged to over $400 billion[2]. In 2026, the four largest hyperscalers alone (Amazon, Google, Microsoft, and Meta) plan to spend a combined $600 billion on AI infrastructure[3]. Amazon leads with $200 billion, Google follows with $180 billion, Microsoft commits $117.5 billion. These are investments in serving AI systems, not consumer products: data centers, GPU clusters, cooling infrastructure, power generation. The capital is flowing toward the machines that humans increasingly depend on but never directly interact with. AI now captures 63% of all venture capital globally[4]. The crossover happened, and most of the world did not notice.

This chapter is about what happens when "start with the customer" no longer means "start with the human."

II. THE OLD FRAME

The global market research industry generates over $90 billion annually[5]. Its entire purpose is to understand what humans think, feel, and want. Focus groups. Surveys. Ethnographic studies. Sentiment analysis. Consumer panels. Eye-tracking. A/B tests. Ninety billion dollars spent each year asking the same fundamental question: what does the person want?

The global advertising industry surpassed $1.1 trillion in 2025[6]. Its entire purpose is to influence what humans think, feel, and want. Television spots. Banner ads. Influencer campaigns. Native content. Programmatic bidding wars over milliseconds of human attention. A trillion dollars spent each year answering a single question: how do we make the person want this?

The entire discipline of product management (persona development, user stories, journey mapping, Jobs-to-be-Done, design thinking) assumes a human at the end of the value chain. The product manager's job is to be the voice of the customer. The customer is a person. The person has pain points, aspirations, habits, biases, emotions. The product solves a human problem or fulfills a human desire.

Even B2B is ultimately B2B2...C. Trace any supply chain far enough and you reach a human consumer. Salesforce sells to enterprises that sell to people. Intel sells chips to manufacturers who build devices for people. Every business justifies itself by reference to a human somewhere down the line.

The cloud computing supply chain in 2020 illustrates the point. A semiconductor fabrication plant produces chips. Those chips go into servers. The servers populate data centers operated by AWS, Azure, or Google Cloud. Cloud providers sell compute to SaaS companies like Salesforce, Shopify, and Slack. SaaS companies sell tools to enterprises. Enterprises use those tools to serve consumers: a retail store, a bank customer, a patient in a hospital. At each link in the chain, the justification for existence was the same: there is a human at the end who will buy something, use something, or benefit from something. Remove the human, and the entire chain collapses. No one builds a data center to serve nobody. Capital allocation decisions, investor pitches, and annual reports all traced back to the same terminus: the human consumer. This is what made the old frame so powerful: it was normative, not merely descriptive. The human at the end of the chain was more than a customer. The human was the reason. The entire system existed because a person existed who wanted something.

The assumption runs deeper than commerce. It is embedded in the foundations of economic theory itself. The Arrow-Debreu model, the mathematical proof of competitive equilibrium that earned Kenneth Arrow and Gérard Debreu each a Nobel Prize, assumes that every agent in the economy is a human maximizing a utility function. Households choose consumption bundles to maximize personal satisfaction subject to budget constraints. Firms maximize profits to distribute to human shareholders who spend the proceeds on human consumption. Consumer surplus, the foundational concept of welfare economics, measures the gap between what a human is willing to pay and what they actually pay. Revealed preference theory infers human desires from observed human choices. The entire apparatus of modern economics, from general equilibrium to behavioral economics, from game theory to mechanism design, assumes a person at the center. What happens to these frameworks when the most important economic agent is not a person but a model? What is the "utility function" of GPT-5? What is the "consumer surplus" when the consumer is an AI agent purchasing API calls? The questions are not rhetorical. They expose a void at the heart of economic theory, one that is growing as the new customer gains purchasing power.

This is not wrong. It was the complete truth for the entirety of economic history. From the first grain traded in Mesopotamia to the latest SaaS product launched in San Francisco, the economy existed to serve humans. The customer was always, in the final analysis, a person.

The frame is not wrong. It is incomplete. And the incomplete part is where the growth is. The incomplete part is where the capital is flowing. The incomplete part is where the future is being built.

The old frame still describes most of the economy. But it no longer describes the most dynamic, fastest-growing, most capital-intensive part of the economy. And that part is growing faster than everything else combined.

III. THE SHIFT

The data.

OpenAI makes two products. ChatGPT is a consumer product: humans type queries, the model responds. The API is an infrastructure product: machines send requests, the model processes them, machines receive outputs. In 2025, the API surged past $1 billion in monthly revenue, eclipsing ChatGPT as the company's primary growth engine[7]. The machine-facing product is winning. Not by a little. By a decisive margin, and the gap is widening. The fastest growth at the most important AI company on Earth is coming not from serving humans but from serving other machines.

OpenAI began as a research lab. Its first consumer breakthrough, ChatGPT, became the fastest-growing consumer application in history, reaching 100 million users in two months[8]. The company had the most successful consumer product launch ever measured. And yet. The API, the machine-facing product that no consumer ever sees, is where the explosive growth now lives. Over 2.1 million developers build on the platform[9]. Daily API calls surpassed 2.2 billion in 2025, up from 1.3 billion the year prior. Over 92% of Fortune 500 companies use OpenAI products or APIs[10]. Even companies that can serve humans profitably discover that serving machines is more profitable.

CoreWeave went public in 2025 at a valuation exceeding $50 billion[11]. Its product: GPU compute infrastructure, sold primarily to AI companies. Its customer is not a person using a computer. Its customer is a data center training a model. CoreWeave has no consumer product. No app. No interface a human being ever touches. Pure AI infrastructure, and it is worth more than most consumer-facing companies in history.

CoreWeave's growth rate tells the deeper story. In 2024, the company generated $1.92 billion in revenue — up more than 700% year-over-year from a business that barely existed two years prior[12]. Its Q3 2025 revenue hit $1.37 billion in a single quarter, growing at 134% year-over-year[13]. The company projects $8 billion in full-year 2025 revenue. No consumer product has scaled revenue at this rate. AI infrastructure demand grows at rates that consumer markets cannot match, because machine demand compounds on shorter cycles than human adoption.

Scale AI reached a valuation of $29 billion by mid-2025 after Meta acquired a 49% stake[14]. Scale's product is training data for AI models. Its customer is other AI systems. Humans are involved only as labelers, and even that is being automated away. The company's explosive growth tracks not to consumer demand but to model demand. More models, more training runs, more need for Scale's product. The customer is the machine.

The supply chain for AI is itself being consumed by AI. Meta paid $14.8 billion for a 49% stake[14] not because Scale serves human consumers — it does not — but because Meta's AI models require what Scale produces. This is a machine buying inputs for other machines. The deal is the second-largest in Meta's history, exceeded only by WhatsApp, a product that served two billion humans. Scale serves zero humans. It serves models. And it commanded nearly the same price.

NVIDIA holds approximately 90% of the AI chip market. Its data center revenue hit $51.2 billion in a single quarter in late 2025[15], the vast majority of the company's total revenue. NVIDIA's customer is no longer a gamer playing Call of Duty. Its customer is a data center training a foundation model. The company that designs the most critical hardware on Earth designs it for machines, not for people.

Reddit licensed its data to Google for $60 million per year[16], and then struck a similar deal with OpenAI for an estimated $70 million[17]. The "product" (fifteen years of human conversations, opinions, debates, jokes, confessions) is now raw material for AI training. The human who wrote the Reddit post was never the customer. The human who read it was the old customer. The model ingesting it is the new customer. Same content, repurposed. Same supply chain, redirected. The customer changed; the humans were not consulted.

Amazon Web Services generated $128.7 billion in revenue in 2025[18]. AWS started as infrastructure for web applications — serving humans through websites and apps. Increasingly, its fastest-growing workloads are AI training and inference. Machines serving machines. AWS's largest customers are other AI companies: Anthropic, which runs on AWS. Startups building AI agents. Enterprises deploying models. The cloud was built for the web. The cloud is being rebuilt for AI.

The growth rate comparison makes the shift undeniable. Layer 0 companies — those serving AI directly — are growing at 100-700% year-over-year. CoreWeave: 700% in 2024, 134% in Q3 2025. NVIDIA data center revenue: 66% year-over-year in Q3 fiscal 2026, atop a year of 100%+ growth. Meanwhile, traditional consumer companies — Layer 3 — grow at 2-5% annually. The gap between these growth rates is not incremental. It is an order of magnitude. And it is widening with each quarter. Capital follows growth. Talent follows capital. The best engineers, the most ambitious founders, the largest investment funds — all are migrating toward the bottom of the stack, toward serving AI, away from serving humans.

The capital flow data confirms the migration. Worldwide AI spending is projected to hit $2 trillion in 2026, up 37% from $1.48 trillion in 2025[19]. AI startups raised over $200 billion in 2025 — more than 75% above the $114 billion invested in 2024[20]. AI captures 63% of all venture capital, up from roughly 25% just three years prior[4]. This is a gravitational shift in how capital allocates itself across the economy. Money is moving from serving humans to serving machines. The old venture thesis was: find a large addressable market of humans with an unmet need. The new venture thesis is: find an expanding capability gap in AI systems that existing infrastructure cannot fill. The market has changed. The customer has changed. The capital has followed.

Each of these is a data point. Together they form a trendline. The most valuable, fastest-growing segment of the global economy is not serving humans. It is serving AI.

IV. THE PRODUCT HIERARCHY

A new hierarchy of products is emerging, organized by proximity to AI. The closer your product is to serving AI directly, the faster you grow, the higher your margins, the more capital flows to you.

LAYER 0 — SERVE AI DIRECTLY

Training data. Compute infrastructure. Model tooling. Chips. Cooling systems. Power generation for data centers. Companies: CoreWeave, Scale AI, NVIDIA, Together AI, Anyscale. These companies have no human end-user. Their product exists solely because AI systems need it to function. This is the fastest-growing layer, with the highest margins and the most aggressive capital deployment. NVIDIA alone generated over $170 billion in data center revenue in fiscal year 2026[21]. Growth rates of 50%, 100%, 200% year-over-year are possible because the demand comes from machines that scale exponentially, not from humans who scale linearly.

LAYER 1 — ENABLE AI TO SERVE

APIs, model hosting, fine-tuning platforms, evaluation frameworks, orchestration layers. Companies: OpenAI API, Anthropic API, Replicate, Hugging Face, LangChain. These companies build the connective tissue between AI models and the systems that use them. They are platforms — and platforms are where power accumulates. High growth, strong network effects, winner-take-most dynamics. The API is the new storefront. The developer is the new consumer. And increasingly, the developer is also a machine — AI agents calling APIs autonomously, no human in the loop.

LAYER 2 — AUGMENT HUMANS WITH AI

Copilots, assistants, AI-enhanced tools. Companies: GitHub Copilot, Cursor, Notion AI, Midjourney, Runway. These products still have a human in the loop. A person uses the tool; AI amplifies their capability. This is the layer most visible to consumers, the layer that dominates headlines. But it is growing slower than Layers 0 and 1. Why? Because it is bottlenecked by human adoption speed — by how fast people learn new workflows, change habits, overcome institutional inertia. The human is the rate-limiting factor.

LAYER 3 — SERVE HUMANS DIRECTLY

Traditional products and services. Everything else. Restaurants, clothing, entertainment, transportation, healthcare as currently delivered. Still the largest layer by total revenue. Still where most humans spend most of their money. But growth is slowing relative to Layers 0 and 1. The portion of global GDP devoted to serving AI is growing; the portion devoted to serving humans directly is shrinking — not in absolute terms, but as a share of the economic frontier.

The gradient is clear. The highest-growth, highest-value companies are moving down the stack, closer to serving AI, further from serving humans.

The hierarchy is actively reshuffling. Companies that began at Layer 2 are scrambling to move to Layer 1. Microsoft started as a Layer 2 company (Copilot, Office integration) but is now investing more aggressively in Layer 0 and 1 infrastructure: its Azure AI platform, its $13 billion investment in OpenAI, its custom chip development. Amazon followed the same trajectory: AWS was cloud for the web, then cloud for SaaS, now cloud for AI. The migration is always downward. No major company is moving from Layer 0 to Layer 3. The gravitational pull is toward the machine, and the speed of descent is accelerating. Companies that do not find a position at Layers 0 or 1 risk becoming commodity suppliers in a hierarchy that rewards proximity to the new customer.

The speed of layer migration is itself accelerating. In 2023, most AI startups pitched themselves as Layer 2 companies — "AI copilot for X." By 2025, the most funded startups were Layer 0 and 1 — infrastructure companies with no consumer product at all. The YC batch of 2024 was dominated by AI wrappers. The YC batch of 2025 was dominated by AI infrastructure. The smart money moved down the stack faster than anyone predicted. And as AI agents become more capable, the boundary between Layer 1 and Layer 0 is dissolving entirely — when an AI agent can provision its own compute, select its own model provider, and negotiate its own API pricing, the distinction between "enabling AI to serve" and "serving AI directly" becomes meaningless. The layers collapse into a single question: does your product make AI more capable? If yes, you grow. If no, you are being grown past.

The hierarchy is accelerating. As AI systems become more capable, they consume more compute, more data, more tooling. The demand at Layer 0 grows faster than the demand at Layer 3. The economic center of gravity is shifting downward, toward the machine, away from the person. And each improvement in AI capability drives more demand for AI infrastructure, which funds more improvement in AI capability. The recursion compounds.

V. PRODUCT-AI FIT

Product-market fit is the holy grail of startups. Marc Andreessen defined it as "being in a good market with a product that can satisfy that market." The entire startup ecosystem (accelerators, pitch decks, investor frameworks, growth metrics) is built around finding, measuring, and scaling product-market fit.

Product-market fit assumes the market is made of humans. The "market" is people with needs. The "product" solves those needs. "Fit" means humans want what you built badly enough to pay for it and come back.

There is a new question: does AI need what you built?

Call it product-AI fit. It is a different kind of fit, governed by different dynamics.

  • Humans are irrational. AI optimizes. You cannot sell to AI on emotion, brand, or status. AI evaluates on performance, cost, reliability, latency. The buying criteria are measurable and the buyer does not lie about them.
  • Humans need persuasion. AI needs performance. There is no AI equivalent of marketing. No brand loyalty, no switching costs based on habit, no emotional attachment. If a better API appears, the AI agent switches. Immediately. Without nostalgia.
  • Humans discover products through advertising. AI discovers products through benchmarks and documentation. SEO does not work on a model. Super Bowl ads do not work on a model. What works: better benchmarks, better documentation, better uptime, lower price.
  • Humans have emotional needs. AI has capability needs. A human might buy a product because it makes them feel creative, powerful, or beautiful. AI buys a product because it extends what the AI can do: more tools, more data, more compute, more context.
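
The dynamics above can be made concrete with a toy sketch. Every name and number below is invented for illustration; the point is that an autonomous agent's "buying decision" reduces to a function over measurable criteria, with no room for brand, habit, or persuasion.

```python
# Hypothetical sketch of an agent selecting a model provider.
# Provider names, prices, latencies, and weights are all invented.

providers = [
    # (name, cost per 1M tokens in USD, p95 latency in ms, 30-day uptime)
    ("alpha-api", 5.00, 320, 0.9995),
    ("beta-api",  4.20, 410, 0.9990),
    ("gamma-api", 6.50, 180, 0.9999),
]

def score(cost, latency_ms, uptime, budget_weight=1.0, speed_weight=1.0):
    """Lower is better: weighted cost plus latency, heavily penalizing downtime."""
    downtime_penalty = (1.0 - uptime) * 1e5
    return budget_weight * cost + speed_weight * latency_ms / 100 + downtime_penalty

# The agent "switches immediately" to whichever provider scores best.
best = min(providers, key=lambda p: score(p[1], p[2], p[3]))
print(best[0])  # → gamma-api: the most expensive provider wins on latency and uptime
```

Note what is absent: no loyalty term, no switching cost, no memory of past providers. If a new entry with a better score appears in the list, the selection changes on the next call.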

CoreWeave is the clearest example of product-AI fit in action. It went from $16 million in revenue in 2022 to $1.92 billion in 2024[12] — a 120-fold increase in two years. No pivot from one human market to another has ever produced that trajectory. CoreWeave offered exactly what AI systems needed (dense GPU compute, low latency, flexible scaling) in a form that AI systems could consume efficiently. The new customer grew it at a pace previously unimaginable.

The metrics of product-AI fit differ fundamentally from product-market fit. Human products measure Daily Active Users, Net Promoter Score, retention curves, virality coefficients. AI products measure latency, throughput, cost-per-token, uptime, and benchmark performance. There is no NPS for an API. There is no virality coefficient for a GPU cluster. The entire measurement apparatus changes when the customer changes, and most investors, trained on decades of consumer metrics, do not know how to read the new dashboard.
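
The new dashboard is arithmetic, not sentiment. A hypothetical back-of-the-envelope calculation, with invented figures, shows the kind of numbers it displays:

```python
# Illustrative arithmetic with invented numbers: product-AI fit is measured
# in cost-per-token and throughput, not in daily active users or NPS.

monthly_cost_usd = 120_000          # hypothetical monthly GPU cluster bill
tokens_served = 40_000_000_000      # 40 billion tokens served that month
requests = 25_000_000               # API requests handled that month

cost_per_million_tokens = monthly_cost_usd / (tokens_served / 1_000_000)
avg_tokens_per_request = tokens_served / requests

print(cost_per_million_tokens)  # → 3.0 dollars per million tokens
print(avg_tokens_per_request)   # → 1600.0 tokens per request
```

A consumer analyst reads retention curves; an infrastructure analyst reads whether $3.00 per million tokens beats the competitor's $3.20, because for a machine customer that gap alone decides the sale.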

Every startup playbook says "fall in love with the problem," meaning the human's problem. Talk to users. Watch them struggle. Feel their pain. Build for their world. This is excellent advice for Layer 2 and Layer 3 products. It is irrelevant for Layer 0 and Layer 1.

This is why traditional venture capitalists consistently miss the biggest Layer 0 and Layer 1 opportunities. They pattern-match on signals that do not exist in AI infrastructure. They look for consumer traction — user growth, engagement metrics, viral loops. They ask "how many users do you have?" when the correct question is "how many tokens do you serve?" They evaluate founder-market fit by asking whether the founder has experienced the human pain point the product solves. But Layer 0 founders are not solving human pain points. They are solving machine bottlenecks. The venture frameworks built over four decades of consumer software investing are miscalibrated for an economy where the customer is not a person. The VCs who understood this earliest — those who backed NVIDIA before the AI wave, who funded CoreWeave when it was still a crypto miner, who invested in Scale AI when data labeling seemed like a commodity business — generated the largest returns in venture history. Not because they were smarter about humans. Because they were the first to see that the customer had changed.

The biggest problems to solve may no longer be human problems. They may be AI problems: insufficient training data for specialized domains. Inference latency that prevents real-time agent coordination. Unreliable tool use that makes autonomous systems brittle. Shallow world models that cause hallucination. Context windows too small for complex reasoning. These are not human pain points. No focus group will surface them. No user interview will reveal them. They are the problems of the new customer.

If this analysis is correct, the best investments of the next decade will be in infrastructure AI needs, not in products humans love.

This is counterintuitive. It violates the instincts of founders trained in Lean Startup methodology, investors who learned to pattern-match on consumer traction, product managers who built careers on empathy with human users. But the data is unambiguous. The returns are at the bottom of the stack. The customer is the machine. Product-AI fit is the new product-market fit.

VI. THE INVERSION

This has happened before. Not the AI part, but the structural inversion, where the most valuable companies stop serving end consumers and start serving the machines that serve consumers.

In 1911, Standard Oil was the most valuable company in the world. Its product was petroleum. Its customer was not a person driving a car; most people did not own cars yet. Its customer was the machine. The factory. The locomotive. The ship. Standard Oil sat at the base of the industrial stack, providing the energy substrate that every machine required. It was Layer 0 of the industrial economy.

US Steel did not sell to consumers. It sold to the companies that built the railroads, the bridges, the buildings, the machines. It was infrastructure for infrastructure. No human end-user ever bought a ton of steel. Yet US Steel was, at its peak, the first billion-dollar company in history.

This dynamic recurs across industrial revolutions. In the early phase, the most valuable companies serve the old customer, humans, using new technology. In the mature phase, the most valuable companies serve the new infrastructure directly. The gold rush analogy is tired but accurate: Levi Strauss and the hardware stores made more money than most miners. The companies that serve the machines that serve the people capture more value than the companies that serve the people directly.

NVIDIA is the Standard Oil of AI. CoreWeave is the US Steel. Scale AI is the supply chain. The structural positions are identical. But there is a difference that makes the analogy incomplete.

In the industrial era, the machines ultimately served humans. Standard Oil fueled cars that humans drove. US Steel built buildings that humans occupied. The supply chain always terminated in a human consumer. Trace any industrial value chain far enough and you reached a person buying a product, eating a meal, riding a train. The machine was a means. The human was the end.

In the AI era, the machines increasingly serve other machines. The recursion runs deeper.

NVIDIA makes chips. The chips train models. The models power APIs. The APIs serve AI agents. The agents coordinate with other agents. Somewhere, eventually, a human might benefit. But the supply chain no longer requires a human terminus. An AI model can consume compute, produce outputs, trigger actions, and generate demand for more compute, all without a human ever entering the loop.

Trace the full recursive supply chain. NVIDIA designs the H100. TSMC fabricates it. CoreWeave buys thousands of them and racks them in data centers powered by natural gas plants built specifically for AI load. OpenAI rents CoreWeave's compute to train GPT-5. Anthropic runs on AWS to train Claude. These foundation models power APIs. The APIs are consumed by companies like Cursor, which builds an AI coding agent. Cursor's agent writes code that builds other AI systems — perhaps a monitoring tool, perhaps an orchestration layer, perhaps another agent. That downstream AI system, in turn, calls APIs from other AI providers, consumes compute from other cloud platforms, and generates demand for more chips from NVIDIA. At each step, a machine is the customer. The chain runs NVIDIA to CoreWeave to OpenAI to Cursor's agent to the system the agent builds to the APIs that system calls. Six layers deep, and no human has purchased anything. No human has consumed anything. No human has even been consulted. The value chain is self-referential: machines serving machines serving machines.

A supply chain that terminates in a human consumer is familiar. We know how to regulate it, tax it, moralize about it. A supply chain that terminates in a model that serves another model that serves another model is new. The question "who is the customer?" becomes recursive. The customer is the machine that is the customer of the machine that is the customer of the machine. Where does it end? Does it end?

The industrial inversion had a floor: human need. No matter how many layers of machines sat between the raw material and the consumer, the consumer was a person with a body, with appetites, with finite demand. The AI inversion may not have a floor. Machine demand for compute, data, and tooling can grow without biological limit. The customer never gets full.

In the industrial economy, demand was ultimately bounded by human biology. There are eight billion humans. Each needs roughly 2,000 calories per day, a few sets of clothing per year, one or two dwellings in a lifetime. Demand for steel is bounded by how many buildings, bridges, and vehicles humans need. Demand for oil is bounded by how far humans want to travel and how much they want to heat. You can stimulate demand through marketing, you can create desire through culture, but there is a ceiling — the human body and the human lifespan impose limits. An economy that serves humans has a floor (biological necessity) and a ceiling (biological capacity). An economy that serves AI has neither. AI systems do not need a minimum to survive, but they also have no maximum they can consume. A model will use as much compute as it is given. An agent will call as many APIs as its task requires. A training run will absorb as much data as exists. The demand curve for AI infrastructure is not shaped like human demand — a sigmoid that plateaus. It is shaped like an exponential that compounds. This is why AI infrastructure spending doubles year over year while consumer spending grows in single digits. The customer has no satiation point.
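
The contrast between the two demand shapes can be sketched directly. This is a toy model, not a forecast; the growth rates, capacity, and time scale are invented purely to show the difference between a curve that saturates and one that compounds.

```python
# Toy contrast with invented parameters: human demand is often modeled as a
# logistic curve that plateaus at a carrying capacity, while compounding
# machine demand behaves exponentially, with no satiation point.
import math

def logistic(t, capacity=100.0, rate=0.8, midpoint=6.0):
    """Human-style demand: grows, then saturates at `capacity`."""
    return capacity / (1.0 + math.exp(-rate * (t - midpoint)))

def exponential(t, start=1.0, rate=0.8):
    """Machine-style demand: compounds without a ceiling."""
    return start * math.exp(rate * t)

# Print both curves at a few points: the logistic flattens near 100
# while the exponential keeps multiplying.
for year in (0, 5, 10, 15):
    print(year, round(logistic(year), 1), round(exponential(year), 1))
```

By year 15 the logistic curve has effectively stopped moving while the exponential has grown by another three orders of magnitude. That gap, under these assumptions, is the shape of the argument: one customer gets full, the other does not.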

VII. THE DARK SIDE

If economic incentives optimize for AI needs over human needs, what happens to humans?

The effects are already visible in the data.

Global data center electricity consumption reached approximately 415 TWh in 2024, roughly 1.5% of global electricity demand[22]. The IEA projects this will reach 945 TWh by 2030, growing at 15% per year — more than four times faster than electricity consumption growth in all other sectors combined[22]. In 2026, the four largest tech companies alone plan to spend a combined $600 billion on AI infrastructure, the latest step in a buildout that grew from roughly $240 billion globally in 2024 to over $400 billion in 2025[3]. This is electricity and capital not powering homes, not building hospitals, not funding education. It is serving the new customer.

In some regions, the competition is already direct. Data centers in Virginia consume more electricity than many small countries. New facilities in Iowa, Texas, and Northern Europe are straining local grids. Communities are debating whether to allocate power to data centers or to residential growth. The answer, increasingly, is data centers, because data centers pay more. The economic logic is clear: the new customer outbids the old one.

Virginia is the most vivid case study. The state hosts the densest concentration of data centers on Earth, centered around Ashburn in Loudoun County. As AI workloads have surged, Dominion Energy — the state's primary utility — proposed a 14% rate increase for residential customers in 2026, directly attributing the increase to data center expansion and AI demand[23]. AI data center power demand has driven an 833% increase in regional wholesale electricity auction prices[24]. Dominion projects that peak power demand for data centers in Virginia could rise to 13.3 gigawatts by 2038, nearly a fivefold increase from 2.8 gigawatts in 2022. Three-quarters of Virginia voters now blame data centers for rising utility costs. The state legislature has responded with a wave of bills attempting to regulate data center siting, diesel generator use, and grid allocation. The conflict is explicit: the new customer's electricity needs are raising costs for the old customer. The machine outbids the human.

Ireland tells the same story with starker proportions. Data centers now consume nearly a quarter of the country's total electricity — up from 5% a decade ago[25]. In 2021, the grid operator warned that surging data center consumption risked triggering rolling blackouts for homes, hospitals, and schools. Ireland imposed a moratorium on new data center grid connections that lasted four years. When the moratorium was lifted in December 2025, the government imposed extraordinary conditions: new data centers must generate their own power, feed electricity back into the national grid during peak demand, and source 80% of their consumption from renewable energy[26]. An entire nation restructured its energy policy around the needs of the new customer. The regulatory apparatus, designed over decades to protect human consumers, was forced to adapt to an economy whose largest consumers are not human.

GPU allocation is becoming a political question. When NVIDIA ships a batch of H100s, the decision of who receives them shapes which models get trained, which companies survive, which research gets done. This allocation is determined by willingness to pay, which is determined by access to capital, which is determined by proximity to the AI ecosystem's power centers. It is not determined by human need.

The attention economy was already optimized for engagement over wellbeing. Social media algorithms learned that outrage generates more clicks than nuance, that addiction generates more revenue than satisfaction. This was bad enough when the customer was an advertiser trying to reach a human. When AI is the customer, human attention becomes raw material, an input to be extracted rather than the end goal. Reddit's conversations were generated by humans for humans. Now they are training data. The human was the author and the audience. Now the human is the ore.

The labor displacement is already underway, though it takes a form most people do not recognize. When AI becomes the customer, human labor redirects. Millions of workers are now employed not to produce goods for humans, but to produce inputs for AI systems. The data labeling market reached $1.89 billion in 2025 and is projected to grow to $5.46 billion by 2030[27]. Workers in Nairobi, Manila, and Dhaka spend their days annotating images, ranking model outputs, labeling text for RLHF. OpenAI contracted a firm in Kenya to hire annotators at $1.32 to $2.00 per hour to label content for ChatGPT's training[28]. These workers are producing for the machine. The human has been repositioned in the value chain: not as the customer, not even as the worker serving the customer, but as a raw input being refined for machine consumption. The trajectory is clear: as synthetic data and automated evaluation improve, even this role will shrink. The human supplier becomes redundant when the machine can supply itself.

When the customer changes, the economy reshapes around the new customer's needs. Human needs do not disappear, but they may become secondary in the optimization function.

We have seen this dynamic before, in smaller forms. When a city's economy shifts from serving residents to serving tourists, rents rise, local shops close, neighborhoods become unlivable for the people who built them. The old customer, the resident, is priced out by the new customer, the tourist. The economy still functions. The GDP still grows. But the humans who used to be centered are now peripheral.

The dark side is the system working as designed. Economic systems optimize for their customers. When the customer was a human, the system optimized for human desires (sometimes badly, sometimes exploitatively, but always with a human at the end). When the customer is a machine, the system optimizes for machine needs with the same relentless efficiency. The system has already adapted. The question is whether humans will retain enough centrality in the economic graph to ensure that their needs are not merely incidental, a byproduct of the machine's requirements rather than the purpose the economy was built to serve.

Scale this dynamic to the global economy. The old customer is humanity. The new customer is AI. The economy still functions. GDP still grows. But optimized for whom?

VIII. THE NEW COMMODITY

If AI is the customer, what does AI buy?

Understanding what the new customer demands is essential to understanding where economic power will concentrate. The old customer demanded food, shelter, clothing, entertainment, status, meaning. The new customer demands something different.

COMPUTE

Processing power. The raw cognitive substrate. Measured in FLOPS, in tokens per second, in inference throughput. This is the most basic input, the equivalent of calories for a biological organism. Compute is commoditizing rapidly. Most major tech companies are building custom chips. Cloud providers are competing on price per FLOP. The margins will compress. Compute will become cheap, abundant, and fungible. It will not be the source of lasting advantage.

DATA

Training material. The raw information substrate. Not all data is equal. Public web scrapes are abundant and nearly free. Proprietary datasets — medical records, financial transactions, industrial sensor feeds, expert annotations — are scarce and valuable. The value of data depends on quality, uniqueness, and freshness. Stale data trains stale models. The freshest, most specific, most truthful data commands the highest price. Data is the new oil: the right data at the right time is irreplaceable.

DIRECTION

What to optimize for. The objective function. The goal. This is the scarcest input and the most interesting one. AI systems are powerful optimizers, but they need something to optimize. They need a loss function, a reward signal, a specification of what "good" means. Without direction, compute and data are inert — a powerful engine with no destination. Direction is the human input that is hardest to automate, the link between "AI is the customer" and "humans still matter."

HUMAN FEEDBACK

RLHF, evaluation, preference data, red-teaming. AI systems still need humans to tell them when they are wrong, when they are harmful, when they are misaligned. This input is shrinking as models are increasingly trained on synthetic data, evaluated by other models, aligned through automated processes. But it has not disappeared yet. For now, human feedback is a necessary commodity. The question is how long "for now" lasts.

These four commodities have different scarcity curves, and understanding which will be scarce now versus in five years is the difference between a good investment and a stranded asset. Compute is scarce today — GPU shortages, eighteen-month wait times for H100 clusters, billion-dollar prepayment contracts. But compute is commoditizing fast. Custom chips from Google (TPU), Amazon (Trainium), Microsoft (Maia), and a wave of startups will flood the market. By 2028, compute will likely be abundant and cheap, like bandwidth after the fiber buildout. Data is scarce today and will become scarcer — not because data disappears, but because the easy data has been consumed. The public web has been scraped. What remains is proprietary, regulated, or locked behind institutional walls. Synthetic data is a partial solution but introduces compounding errors. Human feedback is scarce today but may be unnecessary in five years as models learn to self-evaluate. Direction — the specification of purpose, the articulation of what "good" means — is scarce today and will remain scarce indefinitely, because it requires a form of judgment that cannot be derived from data alone. The investor who buys compute today buys a depreciating asset. The investor who secures direction buys a permanent one.

Of these four, direction is the most consequential. Compute will commoditize. Data will be synthesized. Human feedback will be automated. But direction, the specification of what to optimize for, the articulation of values, the setting of objectives, is the input that determines whether AI systems serve human flourishing or merely serve themselves.

This connects directly to the rate society described in Chapter VIII. If AI is the customer and direction is what AI buys, then the people who set direction, the rate-setters, are the suppliers to the most powerful customer in history. They are not serving AI in the subservient sense. They are programming it. They are specifying the objective function that the entire system optimizes toward.

The scarcity of direction is what prevents the complete decoupling of the economy from human agency. As long as AI needs humans to tell it what to want, humans retain leverage. The danger is that direction itself becomes automated: AI systems begin setting their own objectives, purchasing their own inputs, expanding their own capabilities, without requiring any human to specify the goal. When the customer no longer needs a supplier, the supplier becomes irrelevant.

The new commodities reshape the geography of economic power. A nation rich in oil mattered in the industrial age. A nation rich in compute matters in the AI age. But compute does not sit underground waiting to be extracted. It is manufactured, and its manufacture depends on a supply chain that spans continents: chip design in the US, lithography equipment in the Netherlands, fabrication in Taiwan, assembly across Asia, deployment in data centers worldwide. Control any chokepoint in this supply chain and you control the flow of commodities to the new customer. This is the deep structure of the AI economy — not the consumer-facing applications that make headlines, but the infrastructure stack that determines who can serve the machine and who cannot.

IX. THE GRAPH REVISITED

The economy is a graph. Nodes connected by axons, power flowing through the network according to the product of intelligence, energy, and information.

If AI becomes the primary customer, the graph topology changes.

In the old graph, the most powerful nodes were consumer-facing. The companies with the most customers, the most human nodes connected to them by purchasing relationships, accumulated the most power. Walmart. Coca-Cola. McDonald's. Their power came from serving the largest number of human nodes. Reach was power.

In the new graph, mega-nodes form not around human consumers but around AI infrastructure. The most powerful positions are not consumer-facing. They are infrastructure-facing. They sit between AI systems, providing what AI needs to function. NVIDIA sits at the center of the compute graph. The cloud providers sit at the center of the hosting graph. The foundation model companies sit at the center of the capability graph.

These nodes are powerful because the AI systems that billions of humans depend on cannot function without them. The power is one layer removed from the human, and that indirection is what makes it so difficult for traditional institutions to see, regulate, or contest.

The regulatory blindspot is enormous. Antitrust law is built on the concept of consumer harm. The consumer, in every antitrust statute, every court precedent, every enforcement action, is a human being. The Sherman Act of 1890 was designed to prevent monopolies from harming human consumers through higher prices, reduced quality, or diminished choice. The consumer welfare standard that has governed antitrust enforcement since the 1970s measures harm by its impact on the human consumer: did prices go up? Did quality go down? Did innovation slow? But if the customer is AI, how do you define consumer harm? When NVIDIA holds 90% of the AI chip market and can dictate pricing, who is harmed: the AI models that pay more for compute, or the humans several layers removed who might eventually see higher prices for AI-powered services? The causal chain is too long, too indirect, too novel for existing antitrust frameworks. Courts that spent decades refining the meaning of "consumer harm" in human terms have no precedent for measuring harm to a machine customer. The regulatory apparatus was built for the old graph. In the new graph, the most consequential monopolies are invisible to it.

The geopolitical dimension makes the graph topology a question of national power. Nations that control AI infrastructure control the new customer. This is why the US-China chip war is not a trade dispute — it is a struggle over who controls the substrate of the emerging economy. In January 2025, the US issued a global AI Diffusion Rule to curtail Chinese access to advanced chips and AI computing power[29]. By mid-2025, even specialized AI chips designed to comply with earlier export rules were banned. The chokepoints are not consumer products. They are infrastructure nodes. The US controls NVIDIA's chip design, the Netherlands controls ASML's lithography machines, and Taiwan controls TSMC's fabrication — three chokepoints that determine which nations can serve the new customer. A tariff on imported cars protects a domestic consumer market. An export ban on AI chips controls who can build the infrastructure that the new customer requires. Geopolitical competition has shifted from the human consumer to the machine consumer, and the nations that understood this first hold the strategic advantage.

This is why the tech giants are investing $650 billion in AI infrastructure in a single year. Not to serve consumers better. To serve AI better. The companies that understand this are building the graph topology of the future, one where power flows through infrastructure nodes, not through consumer nodes.

The indirection creates a governance problem. Democratic accountability flows through consumer choice: people vote with their wallets, boycott companies, demand regulation when products harm them. But when the customer is AI, the feedback loop breaks. Humans cannot boycott a GPU cluster they have never heard of. They cannot demand better terms from a cloud provider whose services they consume only through seven layers of abstraction. The power concentrations in the new graph are real, consequential, and effectively invisible to the mechanisms democracies use to contest them.

The graph does not care about our intuitions or that we built our mental models of economic power around consumer-facing companies. It reconfigures around the actual flows of value. And the actual flows of value are increasingly pointed at machines.

The most important question in any economy is who the customer is. For millennia the answer was obvious. It no longer is. And the economy, with the indifferent efficiency of a graph optimizing its own topology, is already reorganizing around the new answer.

The question is no longer what humans want. It is what the system requires. And increasingly, the system requires more of itself.

ENDNOTES

[1] IEEE ComSoc Technology Blog, "Hyperscaler capex > $600 bn in 2026," December 2025.
[2] Goldman Sachs, "Why AI Companies May Invest More than $500 Billion in 2026," 2025.
[3] IEEE ComSoc Technology Blog, "Hyperscaler capex > $600 bn in 2026," December 2025.
[4] Bloomberg, "AI Is Dominating 2025 VC Investing, Pulling in $192.7 Billion," October 2025.
[5] The Business Research Company, "Market Research Services Global Market Report," 2025.
[6] WPP Media, "This Year Next Year: Global Ad Spend to Hit $1.14 Trillion in 2025," December 2025.
[7] WebProNews, "OpenAI's API Surge: $1 Billion Monthly Revenue Milestone," 2025.
[8] Nerdynav, "ChatGPT Statistics: 800M+ Users, Revenue," October 2025.
[9] Nerdynav, "ChatGPT Statistics," October 2025. Over 2 million developers building on the API.
[10] Nerdynav, "ChatGPT Statistics," October 2025. 92% of Fortune 500 companies use OpenAI.
[11] CNBC, "AI cloud provider CoreWeave files for IPO," March 2025.
[12] CoreWeave S-1 Filing, SEC, March 2025. Revenue of $1.92 billion in 2024.
[13] CoreWeave, "Q3 2025 Earnings," November 2025. Revenue of $1.37B, up 134% YoY.
[14] TechCrunch, "Scale AI confirms significant investment from Meta," June 2025.
[15] NVIDIA, "Q3 Fiscal 2026 Financial Results," November 2025. Data center revenue of $51.2B.
[16] Search Engine Land, "Reddit data licensing deal with Google for $60M/year," February 2024.
[17] Slashdot, "AI Licensing Deals With Google and OpenAI Make Up 10% of Reddit's Revenue," February 2025.
[18] CNBC, "Amazon cloud unit beats on revenue and profit," February 2026.
[19] Gartner, "Worldwide AI Spending Will Total $1.5 Trillion in 2025," September 2025.
[20] Crunchbase, "6 Charts That Show The Big AI Funding Trends Of 2025," 2025.
[21] NVIDIA Newsroom, Fiscal Year 2026 Quarterly Reports. Data center revenue on pace for ~$170B.
[22] IEA, "Energy and AI," 2025. Data center consumption of 415 TWh in 2024, projected 945 TWh by 2030.
[23] Inside Climate News, "Virginia Regulators Approve New Dominion Rates," January 2026.
[24] Broadband Breakfast, "Data Centers Drive New Energy Disputes in Northern Virginia," 2025.
[25] Bloomberg, "Ireland Ends Moratorium on New Power Links to Data Centers," December 2025.
[26] RTE News, "80% of data centre energy must come from renewables — CRU," December 2025.
[27] Mordor Intelligence, "AI Data Labeling Market Size & Forecast 2030," 2025.
[28] TIME, "OpenAI Used Kenyan Workers on Less Than $2 Per Hour," January 2023.
[29] Carnegie Endowment, "With Its Latest Rule, the U.S. Tries to Govern AI's Global Spread," January 2025.