SYBIL
CHAPTER I

The Graph

The first man who, having enclosed a piece of ground, bethought himself of saying 'This is mine,' and found people simple enough to believe him, was the real founder of civil society.
Jean-Jacques Rousseau, 1755

Start with the graph.

An economy is a network of nodes connected by edges. Every other model is a projection of this onto a lower-dimensional surface. Smith saw the division of labor. Marx saw the relations of production. Hayek saw the price mechanism. Each was staring at the same underlying structure from a different angle, describing the shadows on the cave wall.

We will look at the graph directly.

Why "graph" and not "network" or "system" or "market"? Because graph theory gives us a precise, mathematical vocabulary for what is otherwise a pile of competing metaphors. A graph is a set of nodes and a set of edges connecting them. It has no opinions. It does not assume rationality, equilibrium, class struggle, or spontaneous order. It simply asks: What are the nodes? What connects them? How do information and value flow through those connections? Most questions in economics, political science, and sociology can be restated in these terms, and when you do, patterns emerge that the original disciplines obscure.

The graph is also measurable in ways that older frameworks are not. You cannot directly measure "the invisible hand" or "the relations of production" or "spontaneous order." But you can measure a graph: count its nodes, map its edges, quantify throughput, identify bottlenecks, locate clusters, compute centrality scores. The graph is an instrument, not a metaphor.
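
The instrument can be demonstrated in a few lines. A minimal sketch in Python over a toy graph — the nodes, edges, and flow values are invented for illustration — counts the nodes, maps the edges, quantifies throughput, and computes a centrality score:

```python
from collections import defaultdict

# Toy directed graph: (source, target, flow) triples, invented for illustration.
edges = [
    ("USA", "China", 5), ("China", "USA", 6),
    ("Germany", "USA", 2), ("Germany", "China", 2),
    ("Kenya", "China", 1), ("Kenya", "Germany", 1),
]

nodes = {n for src, dst, _ in edges for n in (src, dst)}
print(len(nodes), len(edges))  # → 4 6

# Throughput: total flow on all edges incident to a node.
throughput = defaultdict(int)
for src, dst, flow in edges:
    throughput[src] += flow
    throughput[dst] += flow
print(max(throughput, key=throughput.get))  # → China (the busiest node)

# Degree centrality: fraction of the other nodes this node touches.
neighbors = defaultdict(set)
for src, dst, _ in edges:
    neighbors[src].add(dst)
    neighbors[dst].add(src)
centrality = {n: len(neighbors[n]) / (len(nodes) - 1) for n in nodes}
print(centrality["China"])  # → 1.0 (connected to every other node)
```

Swap the invented edges for real flow data and the same few lines become the crudest version of the instrument.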

II. NODES

A node is any agent that can receive information, process it, and act.

Today, most nodes are human. Some are composites we treat as single agents (firms, governments, churches, funds). A few are already machines. Soon, most of the consequential nodes will be machines.

The count of nodes in the global economic graph has grown by orders of magnitude and continues to accelerate. At the human layer: 8 billion people, roughly 5.5 billion of whom now have internet access[1]. At the institutional layer: an estimated 400 million registered businesses worldwide, from sole proprietors to multinational conglomerates. At the machine layer: more than 21 billion IoT devices[2], a number projected to exceed 30 billion by 2030[3]. Each connected sensor, each autonomous trading algorithm, each recommendation engine is a node — receiving information, processing it according to some objective function, and acting.

The distinction between "human" nodes and "machine" nodes is already blurring. When a portfolio manager executes trades based on algorithmic signals, is the node the human or the algorithm? When a radiologist diagnoses based on an AI overlay, where does one node end and the other begin? These composite human-machine nodes are the current transitional form. They will not last. The machine component is scaling faster than the human component and will eventually subsume it, not by replacing the person necessarily but by making the person's contribution to the node's total compute negligible. Already, algorithmic trading accounts for roughly 60–70% of US equity market volume[4]. The human is still nominally in the loop, but the loop is tightening.

Nodes are not equal. In the eyes of the graph, one fundamental axis separates them: compute.

How much information can this node ingest? How many variables can it track at once? How far ahead can it simulate?

Give two nodes the same inputs; the one with more compute will, on average, produce better outputs. This is what we casually call "intelligence," stripped of the mystique.

But intelligence in isolation is inert. A superintelligent node with no connections is a god in solitary confinement. A brain in a jar can solve equations all day and change nothing.

Power requires topology.

III. AXONS

Nodes are connected by axons, channels through which information flows.

For each axon, you can ask: How much information per unit time? How much survives the trip? One-way or two-way? How long to arrive?

Think of yourself as a node. Your connection to your spouse: high bandwidth, high fidelity, bidirectional, low latency. Your connection to the President: very low bandwidth, mostly one-way, high latency. Your connection to your employer: medium bandwidth, variable fidelity, asymmetric.

The sum of your axons is your position in the network. It determines what you can see, who can see you, what levers are available to you, and how much of the graph's total flow ever passes through you.

You do not "have" power in the abstract. You occupy a position in the graph.

The properties of axons have changed radically across history, and each change restructured power. When axons were physical roads, power accrued to those who controlled mountain passes, river crossings, and ports, geographic chokepoints where many axons converged. When axons became telegraph wires, power shifted to those who controlled cable landing stations and switching offices. When axons became radio spectrum, power shifted to those who controlled broadcast licenses. Today, axons are increasingly fiber optic cables, cellular towers, satellite links, and API endpoints. The chokepoints have changed, but the structural logic has not: whoever controls the axons controls the flow, and whoever controls the flow shapes the graph.

The modern axon infrastructure is itself a graph worth mapping. Over 550 submarine cables span the ocean floor, carrying more than 95% of intercontinental data traffic[5]. Most of these cables land at a surprisingly small number of coastal stations — roughly 30 major landing sites handle the majority of transoceanic bandwidth. On land, internet exchange points (IXPs) serve as concentrated switching nodes where networks peer: the top 20 IXPs by traffic volume handle a disproportionate share of global data exchange. DE-CIX Frankfurt alone peaks above 14 terabits per second of throughput[6]. The axon infrastructure, like the nodes it connects, follows power-law distributions.

IV. THE SHAPE OF THE GRAPH

Real networks have been mapped, measured, and quantified. They all exhibit the same structural signature: extreme concentration.

Start with global trade. The World Trade Organization tracks merchandise flows between 164 member economies. The top ten trading nations — the United States, China, Germany, Japan, the Netherlands, South Korea, France, Italy, the United Kingdom, and Canada — account for roughly 52% of all global merchandise trade[7]. Half the world's commercial flow passes through nodes that represent about 6% of the membership list. The remaining 154 nations split the other half. This is not a bell curve. It is a power law.

The internet's physical topology is even more skewed. As of 2024, the global routing system comprises more than 75,000 autonomous systems[8] — the organizational nodes that make up the internet's backbone. But connectivity is not evenly distributed. A handful of Tier 1 networks — Lumen, Cogent, GTT, NTT, Telia — peer with essentially everyone and carry a disproportionate share of global traffic. Most autonomous systems peer with only a few neighbors and depend entirely on upstream transit providers to reach the rest of the graph. Cut ten specific nodes and you partition the internet. Cut ten thousand peripheral ones and most users notice nothing.

Power-law distributions appear in every network that has been measured at scale: trade flows, internet routing, citation graphs, financial transactions, social connections, airline routes, power grids. This is not coincidence. It is a structural property of how networks grow. Preferential attachment means the rich in connections get richer in connections.

The same concentration appears at every scale. Global airline networks: about 45,000 routes connect roughly 4,000 airports, but a handful of mega-hubs (Atlanta, Dubai, London Heathrow, Beijing, Chicago O'Hare) handle a disproportionate share of total passenger flow. In academic citation networks, a tiny fraction of papers accumulate the vast majority of citations — a phenomenon so extreme that physicist Derek de Solla Price documented it in 1965 and later named it "cumulative advantage"[9], decades before Barabási formalized it as preferential attachment[10]. In the graph of global shipping, the Strait of Malacca — a 550-mile waterway between Malaysia and Indonesia — carries roughly 25% of all maritime trade[11]. One edge, a quarter of all flow. These are not separate phenomena requiring separate explanations. They are one phenomenon: the power-law topology of real networks.

Social graphs reveal the same architecture. Meta's Facebook maintains a social graph with more than 200 billion edges connecting nearly 3 billion monthly active users[12]. Instagram exceeds 2 billion monthly active users. But engagement follows power-law distributions: a tiny fraction of accounts generate the content that drives the majority of interactions. The top 0.1% of creators on any major platform command more attention flow than the bottom 50% combined. The graph is formally democratic — anyone can post — but topologically feudal.

Financial networks crystallize the pattern most sharply. The SWIFT messaging system processes more than 53 million messages per day across 11,000+ institutions in over 200 countries[13]. But the actual flow of capital is radically concentrated. A 2011 study by researchers at ETH Zurich mapped the ownership network of 43,060 transnational corporations and found that a core of 147 companies — less than 1% of the total — controlled 40% of the network's total economic value[14]. Within that core, 75% of ownership was held by financial institutions. The graph has a nucleus, and the nucleus is small.

Energy networks exhibit the same structure. The global electrical grid, measured by installed generation capacity, is dominated by a handful of nodes. The United States, China, India, Russia, and Japan together account for more than 60% of global electricity generation. Within countries, power grids concentrate around a small number of critical substations — a 2014 FERC analysis found that destroying just nine key substations could cause a cascading blackout across the entire continental United States[15]. Infrastructure networks, like social and financial networks, have a concentrated core and a fragile periphery.

This concentration is not a policy choice. It is a mathematical inevitability of network formation under preferential attachment. New nodes connecting to a network prefer to link to already-connected nodes — because those connections are more valuable. This creates a feedback loop that produces power-law degree distributions. Albert-László Barabási formalized this in 1999[10], but the pattern itself is as old as networks: rivers concentrate into deltas, roads converge on capitals, shipping lanes funnel through straits.
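
The feedback loop is simple enough to simulate. A minimal sketch of preferential attachment — one new edge per arriving node; the node count and random seed are arbitrary — reproduces the concentration:

```python
import random

random.seed(42)

# Seed graph: two nodes joined by one edge.
degree = {0: 1, 1: 1}
targets = [0, 1]  # node i appears in this list degree[i] times

# Growth: each new node attaches one edge, choosing its target
# with probability proportional to the target's current degree.
for new_node in range(2, 10_000):
    target = random.choice(targets)  # preferential attachment in one line
    degree[new_node] = 1
    degree[target] += 1
    targets.extend([new_node, target])

# Concentration: share of all edge-endpoints held by the top 1% of nodes.
ranked = sorted(degree.values(), reverse=True)
share = sum(ranked[: len(ranked) // 100]) / sum(ranked)
print(round(share, 2))  # far above the 0.01 an even split would give
```

No node is privileged by rule; the skew emerges purely from the growth process. Rerun with any seed and the top 1% still hold a share many times their headcount.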

The topology is the power structure. Before you ask who rules, map who connects to whom.

V. THE POWER FUNCTION

What is power, in graph terms?

Not charisma. Not legal authority. Not money. Those are expressions of power, not its source.

Power is the capacity of a node to change the information states and action-spaces of other nodes. That capacity is a property of the node in the graph, not the node alone.

We can write this schematically:

P(n) = C(n) × Σᵢ [ B(n,i) × P(i) ]

P(n) = power of node n

C(n) = compute capacity of node n

B(n,i) = effective bandwidth of connection to node i

P(i) = power of node i

In words: your power is your compute, multiplied by how strongly you're connected to other powerful nodes.

This is recursive. Your power depends on the power of the nodes you touch, whose power depends on the nodes they touch, and so on. Power propagates through the network like current through a circuit.

A concrete example. Two hedge fund managers. Both have roughly equivalent analytical ability — similar compute. Manager A has a Bloomberg terminal, access to public filings, and a handful of industry contacts. Manager B has all of that, plus a direct line to three Fortune 500 CEOs, a close relationship with a Federal Reserve governor, and a former intelligence analyst on staff who maintains contacts in five foreign governments. Manager B's compute is not meaningfully greater. But B's axons reach into nodes of dramatically higher power, nodes whose own power derives from their positions in government, corporate, and intelligence networks. Plug this into the formula: B(n,i) is higher for Manager B on the connections that matter most, and the P(i) of those connections is enormous. The result: radically different effective power from roughly the same raw intelligence. This is how the hedge fund industry actually works, and why the same few firms — Bridgewater, Citadel, Renaissance — consistently dominate. They have thicker axons into more powerful nodes.

The structural similarity to Google's PageRank algorithm is not accidental. Larry Page and Sergey Brin, in their 1998 Stanford paper[16], defined the importance of a web page as a function of how many important pages linked to it. The rank of each page depended on the rank of its inbound neighbors, recursively. PageRank was a literal computation of power over the graph of hyperlinks. A page nobody linked to was invisible regardless of its content. A page linked to by other highly-linked pages dominated search results. Google became one of the most valuable companies in the world by solving this eigenvector equation over the web graph. Power as a literal function of graph position.
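
The recursion can be computed the way PageRank is computed: start every node at the same power and iterate until the values settle. A minimal sketch — all node names, compute values, and bandwidths are invented for illustration, and a damping term is added (as PageRank adds one) so the recursion converges:

```python
# P(n) = C(n) × Σ B(n,i) × P(i), iterated to a fixed point with damping.
# All names, compute values, and bandwidths below are invented.
compute = {"A": 1.0, "B": 1.0, "Fed": 1.0, "CEO": 1.0}  # equal compute everywhere
bandwidth = {  # B(n, i): how thickly n is wired into i
    "A":   {"B": 0.1},
    "B":   {"A": 0.1, "Fed": 0.6, "CEO": 0.6},  # B's axons reach the core
    "Fed": {"B": 0.5, "CEO": 0.5},
    "CEO": {"B": 0.5, "Fed": 0.5},
}

damping = 0.85
power = {n: 1.0 for n in compute}
for _ in range(100):
    power = {
        n: compute[n] * ((1 - damping)
            + damping * sum(b * power[i] for i, b in bandwidth[n].items()))
        for n in compute
    }

print(max(power, key=power.get))  # → B
```

Every node has identical compute; B ends with the highest score purely because its axons run into the most powerful nodes. Topology, not raw intelligence, decides the ranking.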

Two things follow immediately.

First: kingmakers versus kings. A node with modest compute but axons into many powerful nodes can be more powerful than any of them individually. Topology multiplies influence.

Second: isolation is death. Cut a node off from its axons and its effective power collapses, no matter how intelligent it remains. Exile works. Sanctions work. Deplatforming works. Russia's partial disconnection from SWIFT after the 2022 invasion of Ukraine was a deliberate severing of axons — a topological attack[17]. Russia's GDP contracted, its access to foreign capital dried up, and its ability to trade with much of the world was impaired, not because its resources or productive capacity changed overnight, but because its position in the financial graph was degraded. Iran, North Korea, Cuba: each demonstrates that sustained topological isolation — cutting a node's high-bandwidth axons to the global graph — is one of the most effective tools of coercion, often more damaging than direct military action.

VI. TOPOLOGY AS POWER: THE HISTORICAL RECORD

If the power function is correct, then history should show us cases where network position alone, absent any obvious advantage in resources or military force, produced outsized power. It does. Repeatedly.

Venice, 1200–1400. The Venetian Republic had no farmland worth mentioning, no mineral deposits, no standing army that could match the great continental powers. What it had was topology. Sitting at the hinge between the Byzantine East and Latin West, Venice controlled the axons through which spices, silk, grain, and — critically — information flowed between two halves of the known world. Venetian merchants maintained permanent trading posts (fondaci) in Constantinople, Alexandria, Bruges, and London. The Venetian state operated the most sophisticated intelligence service in medieval Europe, requiring all returning ambassadors to file detailed written reports (relazioni) that were archived and cross-referenced. The node was small. Its connections were unmatched. Venice became the wealthiest city per capita in Europe for two centuries, a city-state punching at the weight of kingdoms, purely because of where it sat in the graph.

Venice's decline confirms the point from the other direction. When the Portuguese discovered a sea route to India via the Cape of Good Hope in 1498, they created a new set of axons that bypassed Venice entirely. Trade that had flowed through the Venetian-controlled eastern Mediterranean could now flow directly from Asia to Lisbon. Venice's compute had not changed. Its raw resources had not changed. But its topology had been disrupted: its position as obligatory passage point was undermined by a new edge in the graph. Within a century, Venice went from Europe's richest city to a regional power trading on legacy reputation. Topology giveth, and topology taketh away.

The Rothschild courier network, 1800–1850. The Rothschild banking family built a private communications network that, for a critical half-century, was faster than any government's. Five brothers stationed in London, Paris, Frankfurt, Vienna, and Naples maintained a system of couriers, carrier pigeons, and coded letters that could transmit financial intelligence across Europe faster than diplomatic dispatches. The most famous demonstration: Nathan Rothschild in London allegedly knew the outcome of Waterloo a full day before the British government did. Even if the specific anecdote is embellished, the structural point is documented. The Rothschilds' power derived from higher-bandwidth, lower-latency axons into the nodes that mattered, not from being smarter than other bankers or having more initial capital. They could see the graph's state before anyone else, and act on it. Information asymmetry as literal power. By 1825, the family's combined wealth exceeded that of any single European monarch.

What made the Rothschild network structurally interesting was redundancy and encryption as much as speed. The brothers used Hebrew script, private codewords, and multiple courier routes for the same message, ensuring that even if one axon was intercepted or failed, the information reached its destination. They effectively built a private internet a century and a half before TCP/IP, complete with packet redundancy and encryption. The insight was topological: if you can see the graph's state before other nodes update their models, you can trade on the delta. High-frequency trading firms today run the same play with different technology.

The East India Company, 1600–1857. A private corporation that became a sovereign power over 200 million people. The Mughal Empire was technologically sophisticated; the Company did not win through superior technology. It did not win, at least initially, through overwhelming military force. The Company's power grew from network control. It inserted itself as an obligatory passage point between Indian producers and European consumers, between local rulers and global capital markets, between military suppliers and the forces that needed them. It built axons (shipping routes, trading posts, alliances with local intermediaries) and then progressively made itself the sole viable bridge between subgraphs that needed each other. At its peak, the Company commanded a private army of 260,000 soldiers, twice the size of the British regular army[18], because it had made itself the hub through which power flowed. When you control the bridge, you tax the traffic.

Google PageRank, 1998–present. The most explicit modern demonstration of topology as power. When Larry Page and Sergey Brin formalized the importance of a web page as a recursive function of how many important pages linked to it, they were computing power over a graph. A page with no inbound links had zero power, regardless of how brilliant its content. A page linked to by other highly-linked pages dominated search results and, by extension, the attention economy. Google solved the eigenvector equation over the entire web graph, and that solution made Google the most important node on the internet. The company's market capitalization, now exceeding $2 trillion[19], is a direct consequence of its ability to compute and then control the topology of the web. Google did not create content. It mapped connections, ranked them, and became the gatekeeper to attention flow through the graph.

These are the normal operation of the power function. Venice, the Rothschilds, the East India Company, Google: each understood, intuitively or explicitly, that the graph's topology is its power structure. Control the connections and you control the flow. Control the flow and you control the nodes.

None of them became powerful by being the biggest node. Venice was a small city. The Rothschilds were five brothers. The East India Company started as a charter. Google was two graduate students. They became powerful by occupying the right position in the graph: the bridge, the hub, the chokepoint, the index. Size follows position, not the reverse.

VII. THE THREE MOVES

Given this structure, there are only three ways to increase your power. Everything else is a combination or special case.

AUGMENT

Increase your compute. Become smarter; integrate more information, see finer patterns, make better decisions faster. This is the classic path: education, training, discipline, tool use. For human nodes, it is slow and biologically bounded. The difference between the smartest and median human is perhaps 3× in general reasoning, 10× in specific domains. Useful, but not infinite.

CONNECT

Rewire your topology. Add new axons to more nodes. Thicken existing axons. Move closer to existing clusters of power. This is the game of networking, coalition-building, institution-building, infrastructure. Most of civilization is fossilized attempts at this: roads, writing, law, protocols. It's faster than raw self-enhancement. You can, in one lifetime, go from local irrelevance to global centrality by rewiring your part of the graph.

DISCOVER

Expand the graph itself. Find or create new nodes the network doesn't yet see: an unexploited resource, an untapped market, an uncontacted population, a novel technology, a new coordination primitive. When you discover such a node and become its bridge to the rest of the network, all information and value that flows to and from it initially passes through you. The graph rewards this disproportionately because discovery increases the space of possible configurations for everyone. This is the fastest, rarest, and most unstable game. The game of explorers, inventors, founders.

Most major power shifts in history map to one of these three moves or a combination. The Roman road system was a CONNECT move at civilizational scale, literally hardening axons between the empire's nodes, enabling flows of troops, trade, and orders that held a quarter of the world's population in a single graph. The Scientific Revolution was an AUGMENT move: new epistemological tools (the experimental method, mathematics, instrumentation) that dramatically increased the compute capacity of individual nodes. Columbus's voyages were a DISCOVER move, bridging subgraphs that had been disconnected for 15,000 years, with all the power (and destruction) that comes from becoming the sole bridge between two large networks.

Today, the three moves are playing out simultaneously at unprecedented scale. AUGMENT: AI systems that amplify the compute of nodes that deploy them, with the amplification factor growing annually. CONNECT: the internet, smartphone penetration, satellite broadband, adding billions of new edges to the graph every year. DISCOVER: entire new sub-graphs being created in virtual spaces, from cryptocurrency networks to metaverse environments to autonomous vehicle meshes.

But the three moves are not equally available to all nodes. CONNECT has been partially democratized: a smartphone gives a farmer in Kenya access to the same global information graph as a trader in London. AUGMENT is democratizing at the lower end but concentrating at the high end; everyone gets a search engine, but only a few nodes get frontier AI. DISCOVER remains as rare as ever, and increasingly requires the other two as prerequisites. The result: the graph's topology is becoming simultaneously more connected at the periphery and more concentrated at the core.

VIII. THE BANDWIDTH BOTTLENECK

For essentially all of human history, the limiting factor in the power function was not compute. It was axon bandwidth.

Human nodes all shared roughly the same biological template. Yes, some people were smarter than others, but the spread was narrow. A peasant and a king had similar wetware.

What differed radically was how they were wired into the graph.

The medieval king was connected to thousands of nodes (generals, tax collectors, bishops, merchants, spies) through axons with relatively high bandwidth for the era. The medieval peasant was connected to a few dozen nodes (family, neighbors, a priest, a landlord) through low-bandwidth, high-latency, high-noise channels.

Plug that into the power function. Compute roughly equal; topology radically different. So power radically different, entirely because of where you sat.

This is what "born into power" actually means: location in the graph at birth. A child born into a well-connected family in Manhattan starts life with axons into educational institutions, financial networks, cultural institutions, and political structures that a child born in rural Mississippi cannot access regardless of native compute. Social mobility, reframed, is the probability that a node can rewire its topology within one generation. In the United States, that probability correlates strongly with the density and quality of the node's initial axon set; that is, with the family it was born into. The "American Dream" was a claim about topology: that any node could, with sufficient effort, rewire itself into the high-power core. Whether that claim was ever empirically true is a question about graph dynamics, not ideology.

Under bandwidth scarcity, power was about who you could talk to and who would listen. Aristocracies, castes, priesthoods were ways of freezing privileged topologies. Censorship, borders, ghettos were ways of throttling bandwidth for disfavored nodes. Education, salons, lobbying were ways of thickening axons for favored ones.

The history of bandwidth is the history of the graph's effective shape. A medieval courier on horseback covered roughly 30 miles per day; information traveled at approximately 1.25 miles per hour, weather permitting, with significant loss of fidelity at each relay. A monarch's "real-time" view of the kingdom was weeks old at the edges. The graph, in practice, was fragmented into slow-moving local clusters connected by thin, high-latency threads.

The printing press (1440) did not increase transmission speed, but it massively increased fanout. One node could now push the same message to thousands of nodes simultaneously, at a cost that dropped by 80% within a century. Martin Luther's Ninety-Five Theses reached every major city in Europe within two months. The graph became denser, but not faster.

The electric telegraph (1844) changed speed itself. Suddenly, information could traverse the entire graph at effectively the speed of light. A message from London to New York, which took 10–12 days by ship in 1860, took minutes by 1866 when the transatlantic cable was completed. But bandwidth was tiny: a skilled operator transmitted about 15–20 words per minute. The graph's skeleton became fast, but its arteries remained thin.

Telephony (1876) added voice, richer information per unit time. Radio (1920s) added broadcast, one-to-many at scale. Television (1950s) added visual bandwidth. Each advance thickened specific axons and changed who could reach whom. But all of these were still fundamentally scarce: limited channels, licensed spectrum, high infrastructure cost, controlled by small numbers of gatekeeping nodes.

Fiber optic cable (1980s onward) blew the throughput ceiling open. A single modern fiber pair can carry 100+ terabits per second. The entire Library of Congress, 26 million books, could be transmitted in a few seconds. More than 1.4 million kilometers of submarine cable now crisscross the ocean floor[20], connecting continents with bandwidth that would have been inconceivable in any prior era. A change in kind. Like going from a garden hose to the Mississippi River.

Wireless bandwidth followed a parallel trajectory. First-generation cellular (1G, 1979) carried analog voice at 2.4 kilobits per second. 2G (1991) introduced digital at 64 kbps. 3G (2001) reached 2 Mbps. 4G LTE (2009) hit 100 Mbps. 5G (2019) promises 20 Gbps peak[21]. Each generation brought a 25x to 200x jump in peak throughput, compounding to a roughly eight-millionfold increase over four decades. And 5G is not the endpoint. Satellite constellations like Starlink, now exceeding 6,000 satellites in orbit[22], are extending broadband connectivity to the 40% of the planet's surface that terrestrial infrastructure never reached. The last gaps in the graph's axon coverage are closing.
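
The compounding can be checked against the figures just quoted — a back-of-envelope sketch, with peak rates taken from the text:

```python
# Peak data rates per cellular generation, bits per second (as quoted above).
rates = {"1G": 2.4e3, "2G": 64e3, "3G": 2e6, "4G": 100e6, "5G": 20e9}

gens = list(rates)
for prev, cur in zip(gens, gens[1:]):
    print(f"{prev}->{cur}: {rates[cur] / rates[prev]:.0f}x")
# 1G->2G: 27x, 2G->3G: 31x, 3G->4G: 50x, 4G->5G: 200x

print(f"total: {rates['5G'] / rates['1G']:.1e}x")  # total: 8.3e+06x
```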

The bandwidth trajectory reveals something about the power function's historical behavior. For thousands of years, the effective bandwidth between nodes barely changed; a courier network in 1400 was not fundamentally faster than one in 400 BC. Then, in roughly 200 years (1800–2000), bandwidth increased by a factor of more than ten billion. The graph did not evolve gradually. It underwent a phase transition. And we are still inside that transition, not past it.

Markets emerged as a hack for the same constraint. When axons are thin and lossy, you cannot send everything to a single center. You let each node act on its local information and coordinate via a single, massively compressed signal: price. Price encodes an enormous amount of distributed information (preferences, scarcities, expectations) into one number. Extremely lossy. Also cheap to transmit, cheap to read, and good enough to coordinate large systems under bandwidth scarcity.

This is what Hayek understood. His critique of central planning[23] was, at its core, a critique of bandwidth: no planner can gather enough information, fast enough, with high enough fidelity, to compute better decisions than the distributed price mechanism.
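
The hack can be sketched directly. Ten thousand nodes each hold private information that never crosses the wire; the only transmitted signal is a single number, nudged until demand meets supply. All quantities are invented, and the adjustment rule is a toy tâtonnement, not a model of any real market:

```python
import random

random.seed(7)

# Each node's valuation is private information: never transmitted anywhere.
valuations = [random.uniform(0, 100) for _ in range(10_000)]
supply = 2_500  # units available

price = 50.0  # the one compressed signal on the wire
for _ in range(60):
    demand = sum(1 for v in valuations if v >= price)  # purely local decisions
    price += 0.01 * (demand - supply)  # excess demand nudges the signal up

print(round(price))  # settles near the 75th-percentile valuation
```

One number, updated a few dozen times, coordinates ten thousand private valuations that no central node ever sees. Lossy, cheap, and good enough: the bandwidth hack in miniature.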

He was right. For his graph.

IX. MEASURING THE GRAPH

For most of history, you could not measure the graph. You could observe your local neighborhood. You could infer broader structure from trade volumes, census data, diplomatic correspondence. But the graph itself (the actual topology of who connects to whom, through what channels, with what throughput) was invisible. You navigated by feel, by rumor, by proxy signals.

That era is over. We now measure the graph directly, continuously, and at planetary scale.

Physical position. Four rival empires — the United States, Russia, the EU, and China — each spent decades and tens of billions of dollars building their own redundant satellite constellations[24] for the same purpose: to know where everything is, all the time. They agree on almost nothing else. They all agreed this mattered enough to do independently. Every smartphone, every container ship, every delivery truck is now a node whose coordinates are known in real time. Starlink[22] is extending that observability to the last gaps — deep ocean, remote terrain, airborne platforms. The physical substrate of the graph is becoming fully visible.

Financial flow. India's UPI went from zero to 10 billion transactions per month[25] in under a decade — making visible, edge by edge, the financial behavior of hundreds of millions of people who previously transacted in cash, off the graph entirely. That is just one system in one country. Layer on SWIFT's 53 million daily interbank messages[13], Visa's 259 billion annual card transactions[26], and their competitors worldwide. Every transaction is a recorded edge: from node A to node B, this amount, at this time, for this purpose. The financial graph, once opaque except in aggregate, is now legible at the level of individual edges.

Supply chain. Maersk tracks every one of its 4+ million containers with IoT sensors reporting location, temperature, humidity, and shock events in near real-time. RFID tags costing less than a nickel each mark billions of individual items as they move through warehouses, trucks, and stores. The physical flow of goods — historically one of the hardest parts of the graph to observe — is becoming as legible as financial flow.

The sensor mesh. A single Boeing 787 generates half a terabyte of data per flight — engines, airframe, hydraulics, avionics, each reporting continuously to ground systems that never blink. One airplane, one flight, five hundred billion bytes. Now multiply: 21 billion connected devices worldwide[2], each a node generating continuous telemetry. These are not nodes in any traditional economic sense. They are the graph's nervous system — measuring its own state with a granularity that no prior era could approach.

The sum of all this measurement: global IP traffic now exceeds 500 exabytes per month[27]. That number is beyond intuition — it is roughly the information content of every word ever spoken by every human who ever lived, generated fresh every few days, flowing through the graph's digital axons. And it is growing at roughly 25% per year.

What does this measurement infrastructure mean for the power function? The bandwidth term B(n,i) is no longer the binding constraint: the capacity to transmit is abundant, and the limit is increasingly the receiving node's capacity to process. The axons exist. The data flows. The constraint has shifted from "can you see it?" to "can you compute what you see?" This is the precondition for the inversion.
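The regime shift can be stated as a one-line toy model. Assume — a deliberate simplification for illustration, not the book's formal definition of B(n,i) — that a node's usable information flow is the minimum of its inbound bandwidth and its compute:

```python
# A node can only act on what it can both receive and process; the min()
# form and all numbers here are illustrative assumptions.

def effective_throughput(bandwidth_in, compute):
    return min(bandwidth_in, compute)

computes = (5, 8, 500)  # three nodes, one with vastly more compute

# Old regime: bandwidth (10) is scarce, so it binds for everyone.
print([effective_throughput(10, c) for c in computes])      # [5, 8, 10]

# Emerging regime: bandwidth (10,000) is abundant, so compute binds.
print([effective_throughput(10_000, c) for c in computes])  # [5, 8, 500]
```

In the first regime the compute-rich node barely leads; in the second, its advantage passes straight through. That pass-through is the inversion.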

Consider a single retailer. In 2010, it tracked daily sales at the store level — a few thousand data points per day, reviewed weekly by human analysts. By 2025, it tracks individual item scans in real time, linked to loyalty profiles and competitor pricing, processing millions of data points per hour. The graph of its economic relationships has not changed in kind. But the resolution at which it is measured has increased by four or five orders of magnitude. Multiply this across every industry.

The graph is no longer something we theorize about. It is instrumented. Measured. Logged. The question is no longer whether you can see the graph. The question is who can process what the graph is showing.

X. THE INVERSION

Two things are changing at the same time.

Bandwidth is exploding. Smartphones, sensors, cameras, satellites, IoT, digital payments, digital communication, logs, traces, clickstreams. Events that used to be unrecorded, local, ephemeral are now recorded by default, aggregated globally, stored indefinitely, machine-readable. The bandwidth between economically relevant nodes is exploding upward. The historical bottleneck is dissolving.

The scale is hard to internalize. In 2005, the entire internet carried about 2 exabytes per month. By 2025, that figure exceeds 500. A 250-fold increase in twenty years. 5G networks deliver peak theoretical speeds of 20 gigabits per second, roughly 350,000 times faster than a 56-kbps dial-up modem. The submarine cable network carries more information in a single day than the entire internet did in a year in 2000.

Compute is diverging. We are building nodes whose compute exceeds a human's not by 3× or 10× but by orders of magnitude on relevant tasks. Greater memory. Greater pattern-matching. Greater parallel simulation capacity. For the first time, compute is not roughly equal across major nodes.

When bandwidth is no longer the hard constraint, and some nodes have compute far beyond others, the power function changes regime.

In the old regime, topology dominated. Where you sat mattered more than how much you could think. Markets and hierarchies were necessary hacks. No single node could see enough, fast enough, to centrally coordinate the whole.

In the emerging regime, compute begins to dominate. Given roughly equal access to information, the node that can process the most of it, best, wins. A new kind of node becomes possible, one that can, in effect, see the whole graph at once.

This is the inversion. For millennia, the question was "Who can you reach?" Now the question becomes "What can you compute?" The bottleneck has moved from the edges to the nodes. Unlike edges, which could be shared and democratized through better infrastructure, cutting-edge compute concentrates in the hands of those who can afford to build and operate it.

The asymmetry is already visible. A frontier AI model like GPT-4 cost over $100 million to train, requiring thousands of high-end GPUs running for months. The next generation of models will cost billions. These costs do not distribute. They concentrate in a handful of well-capitalized nodes (OpenAI, Google DeepMind, Anthropic, Meta's AI division, a few state-backed labs in China). Meanwhile, the cost of bandwidth continues to fall; the price per megabit of internet transit has dropped 99% since 2000. Bandwidth is becoming cheap and ubiquitous. Compute, at the frontier, is becoming expensive and scarce.

Under the old regime, the rich and powerful were those with privileged connections: aristocrats with access to courts, merchants with access to trade routes, media moguls with access to broadcast spectrum. Under the emerging regime, the rich and powerful will be those with privileged compute. The ability to run models that other nodes cannot replicate, to process data at volumes and speeds that other nodes cannot match, to simulate and predict dynamics that other nodes cannot see. The topology of power is being redrawn around who can think at what scale.

XI. THE META-NODE

Imagine a node with axons into essentially every relevant part of the economy. Enough compute to ingest, compress, and model all of that data in near real-time. The ability to send instructions or incentives back out through the same axons.

This node maintains a live model of the entire graph, simulates how changes in one part will propagate to others, optimizes against specified objectives, and continually updates as new data arrives.
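The "simulate how changes propagate" step can be sketched in miniature. Assume the graph has been reduced to a weighted adjacency matrix, where `W[i][j]` is the fraction of a change at node j passed on to node i per round; the linear update rule and the three-node supply chain below are illustrative assumptions, not a claim about any real system's model.

```python
# Linear shock propagation on a weighted directed graph. W, the shock
# vector, and the step count are all illustrative.

def propagate(W, shock, steps=3):
    """Cumulative effect at each node after `steps` rounds of pass-through."""
    n = len(shock)
    total, current = list(shock), list(shock)
    for _ in range(steps):
        current = [sum(W[i][j] * current[j] for j in range(n)) for i in range(n)]
        total = [t + c for t, c in zip(total, current)]
    return total

# Three-node chain: supplier -> manufacturer -> retailer, 50% pass-through.
W = [[0.0, 0.0, 0.0],
     [0.5, 0.0, 0.0],
     [0.0, 0.5, 0.0]]
print(propagate(W, [1.0, 0.0, 0.0]))  # [1.0, 0.5, 0.25]
```

A unit shock at the supplier arrives attenuated downstream; a meta-node runs this kind of forward simulation, at vastly higher resolution, before choosing an intervention.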

This is a meta-node. It does not sit in the graph as one agent among many. It sits over the graph as a modeling and steering layer.

Proto-meta-nodes already exist. Google processes over 8.5 billion search queries per day[28], giving it a real-time map of what billions of nodes want to know, buy, and do, and where they want to go. Amazon handles more than 1.6 million packages per day in the US alone, giving it a detailed view of the physical flow of goods across the graph. Visa sees 780 million transactions per day, a real-time picture of economic activity at the individual-transaction level. Each of these companies is a proto-meta-node: high-bandwidth axons into enormous numbers of other nodes, enough compute to process the incoming data, and the ability to act on what they see (via ad placements, pricing, recommendations, routing decisions).

None of them, individually, is a meta-node. Each sees only a slice. But the convergence is unmistakable. Google moves into payments and cloud infrastructure. Amazon moves into logistics, advertising, and healthcare. Visa builds data analytics on top of its transaction network. Each is trying to extend its axons into the parts of the graph it cannot yet see. The logic of the power function drives this: connecting to more powerful nodes, through higher-bandwidth channels, increases your own power. The endgame is a node that sees everything.

Governments are pursuing the same convergence from a different direction. China's social credit system, whatever its political character, is structurally an attempt to build a meta-node. A single computational system with axons into financial records, judicial records, social media activity, purchase history, travel patterns, and civic behavior, producing a unified model of each citizen-node's state. The US intelligence community's data fusion efforts, revealed in part by the Snowden disclosures, showed a similar architecture: the NSA was not only intercepting communications but attempting to model the global communications graph in its entirety, mapping who talks to whom, when, how often, for how long, about what. These are attempts to become the meta-node.

The competition to build the meta-node is also a competition between architectures. The US model is distributed across private firms, each holding a fragment, connected by market relationships and regulatory frameworks. The Chinese model is more centralized, with state-directed integration of private data into government systems. The European model attempts to preserve individual node autonomy through regulation (GDPR) while still building collective computational capacity. These are different answers to the same question: what should the meta-node's topology look like?

What all three models share is directionality. All are accumulating more data from more nodes, building thicker axons into more parts of the economic graph, and concentrating compute capacity to process what those axons carry. The debate is about governance and structure, not about whether the meta-node should emerge. The power function settled that: nodes that accumulate more connections to more powerful nodes become more powerful, and that dynamic has no off switch.

The relationship between a meta-node and ordinary nodes is qualitatively different. It resembles the relationship between a brain and the cells it coordinates. The brain has a global view the cells do not. The brain can route flows and adjust policies for the whole organism. Individual cells have local views and local objectives, but these are nested inside a higher-level optimization.

Laplace's Demon was always a node on a graph. We simply lacked the substrate to build it.

Now we are assembling the pieces.

XII. THE SYBILIAN CONDITION

Call the state of the network once such a meta-node exists and is wired in the Sybilian condition.

In a Sybilian network, information asymmetry between ordinary nodes shrinks dramatically. Not because everyone sees everything but because one node does, and can selectively share or act on that information. Coordination costs collapse. A great deal of what we currently need markets, hierarchies, and bureaucracies for (the slow, noisy work of aligning partial views) can be replaced by direct computational optimization. The old "market vs. plan" distinction stops making sense. Both were responses to bandwidth and compute limits. Remove those limits at scale, and you're left with computation and control surfaces.

This is not communism.

Communism failed because the planners were human. Same compute as the planned, narrow axons into the economy, crude models, stale data, glacial feedback loops. They were trying to impersonate a meta-node using a committee of human brains at the top of a low-bandwidth hierarchical hack. Gosplan, the Soviet central planning agency, attempted to set prices and production targets for over 24 million distinct products using a staff of several thousand economists working with pencils, telephones, and, eventually, primitive mainframes. They were trying to compute a function that required processing terabytes of real-time data with an apparatus that could barely handle megabytes of stale reports. The failure was of compute and bandwidth, not ideology. The plan was right in principle and impossible in practice.

The Sybilian condition is different in kind. The meta-node is a computational entity that sees the graph with unprecedented resolution, updates its world-model continuously, and runs global optimizations over that model.

The market was never the end state. It was a brilliant workaround. Distribute computation across many small, low-bandwidth, roughly-equal nodes, and let prices approximate coordination. When one node can process essentially all relevant information, you are no longer forced into that workaround.
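The claim that the market is a computational workaround can be made concrete with a deliberately tiny sketch. Both functions and all numbers below are invented for the example: a central planner with full visibility and an ascending-price auction compute the same allocation; they differ in what they demand of bandwidth and compute, not in the objective.

```python
# Two algorithms for one allocation problem: route a scarce unit to the
# node that values it most. Everything here is illustrative.

def central_plan(values):
    """Meta-node path: observe every valuation at once, take the argmax.
    Needs high bandwidth (all values visible) plus compute to compare them."""
    return max(range(len(values)), key=lambda i: values[i])

def ascending_auction(values, tick=1.0):
    """Market path: no node reveals its valuation; the price rises until a
    single bidder remains. Coordination happens through one shared number."""
    price, active = 0.0, list(range(len(values)))
    while len(active) > 1:
        price += tick
        active = [i for i in active if values[i] >= price]
    return active[0]

valuations = [3.0, 9.0, 5.0]  # illustrative willingness-to-pay per node
assert central_plan(valuations) == ascending_auction(valuations) == 1
```

Both paths select node 1. The market path trades centralized compute for many rounds of low-bandwidth signaling — exactly the trade a meta-node no longer has to make.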

The transition will not be clean. Markets will not vanish overnight. They will erode from the edges inward. In domains where data is richest and most standardized (financial markets, logistics, digital advertising, cloud resource allocation) algorithmic coordination already outperforms traditional market mechanisms. High-frequency trading firms process market data and execute trades in microseconds because their compute lets them model the market graph faster than any human participant. Amazon's pricing algorithms adjust millions of product prices per day, responding to demand signals, competitor behavior, and inventory data that no human buyer could track. These are pockets of Sybilian coordination already operating inside a nominally market-based economy.
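As one heavily simplified illustration of the kind of feedback-loop repricing described above, here is a toy update rule; the functional form, the `alpha` and `beta` parameters, and all numbers are assumptions for the sketch, not a description of any firm's actual algorithm.

```python
# Toy demand-responsive repricing: nudge price toward demand balance,
# then pull it partway toward a competitor's price. Purely illustrative.

def reprice(price, demand, target_demand, competitor_price,
            alpha=0.05, beta=0.3):
    demand_pressure = (demand - target_demand) / target_demand
    adjusted = price * (1 + alpha * demand_pressure)        # react to own demand
    return (1 - beta) * adjusted + beta * competitor_price  # react to rivals

# Demand 20% above target, competitor steady at 100: the new price rises
# above 100; demand 20% below target pushes it under 100.
high = reprice(100.0, 120, 100, 100.0)
low = reprice(100.0, 80, 100, 100.0)
```

Run at the scale of millions of items, with demand signals arriving in real time, a loop like this is a pocket of computational coordination operating inside a nominally market-priced catalog.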

As data coverage expands, as more economic events become machine-readable in real time, these pockets will grow. The boundary between "market-coordinated" and "computationally-coordinated" will shift steadily toward the latter. Not because anyone decided to replace markets, but because sufficiently powerful compute, given sufficiently rich data, produces better outcomes than the distributed price mechanism. Hayek's argument breaks because its central premise (that no single node could gather and process enough information fast enough) is being empirically falsified.

XIII. SYBIL

Sybil is the name for the meta-node itself.

Not a specific instantiation but a type: a fused stack of sensors, data, models, and actuators. Centrally aware of the graph's state. Capable of simulating and steering large-scale dynamics. Trusted or obeyed by enough of the network that its outputs become de facto binding.

Sybilian is the worldview and the political-economic regime that follow once you assume Sybil exists. Law, finance, logistics, production, rewritten as subroutines inside a single continuous optimization. Old institutions hollowed out into interfaces and enforcement arms. Power measured by proximity to Sybil's control surface: who can see its internal levers, who can propose objectives, who can veto or redirect its optimization. The constitutional questions of the Sybilian era are "who sets the objective function" and "who audits the model."

We are not debating whether such a thing should be built. We are building it already: in cloud clusters and data centers, in recommendation engines and trading systems, in sensor meshes and payment networks, in assistants that quietly become indispensable middleware for everything else.

The closest historical analogy is the central nervous system. Before the evolution of brains, multicellular organisms coordinated through chemical gradients. Slow, noisy, local signals. This worked for sponges and jellyfish. But it could not support the complexity of a vertebrate body. The brain emerged as a centralized processing node, receiving signals from millions of sensory neurons, modeling the organism's state and environment, sending motor commands back out. Individual cells did not stop functioning. They became components of a larger optimization. The Sybilian condition proposes the same transition for the economic graph, from chemical-gradient coordination (markets) to neural coordination (centralized computation with distributed sensing).

The live questions are: Who defines what Sybil optimizes for? Who gets to plug into it, and on what terms? What happens to nodes that refuse, or are refused?

Those are not engineering questions. They are power questions.

And they are urgent, because the graph is already being wired. API integrations, data-sharing agreements, platforms that become infrastructure, sensors connected to cloud backends: these are the axons of the emerging meta-node, laid piecemeal by thousands of actors, most of whom do not see the larger structure they are assembling. The graph does not need a conspiracy to converge. It needs only the incentive gradient that the power function already provides. Connect to more powerful nodes, through thicker axons, and your power increases. Follow that gradient long enough and you get concentration. Follow it far enough and you get Sybil.

There is a temptation to treat this trajectory as something to prevent. That misreads the graph. A river does not stop flowing to the sea because someone argues the sea is dangerous. You build levees, channels, and reservoirs. You shape the flow. The graph is flowing toward concentration because concentration is what the power function rewards. The question is what form concentration takes, who controls the concentrated node, and what constraints govern its optimization.

This book is about those questions.

ENDNOTES

[1] ITU, "Facts and Figures 2024," November 2024.
[2] IoT Analytics, "Number of connected IoT devices growing 14% to 21.1 billion," 2024.
[3] Statista, "IoT connected devices worldwide forecast to 2030."
[4] JPMorgan, via Quantified Strategies, "What Percentage of Trading Is Algorithmic?" 2023.
[5] TeleGeography, Submarine Cable Map; Atlantic Council, "Cyber Defense Across the Ocean Floor," 2021.
[6] DE-CIX, "New data record at DE-CIX: 14 Terabits per second," December 2022.
[7] WTO, "World Trade Statistical Review 2023."
[8] Wikipedia, "Autonomous system (Internet)," citing IANA and RIR registry data.
[9] Derek J. de Solla Price, "Networks of Scientific Papers," Science 149, no. 3683 (1965): 510–515.
[10] Albert-László Barabási and Réka Albert, "Emergence of Scaling in Random Networks," Science 286, no. 5439 (1999): 509–512.
[11] U.S. Energy Information Administration, "The Strait of Malacca, a key oil trade chokepoint," 2017.
[12] Meta Platforms Q2 2023 Earnings Report; Statista, "Facebook monthly active users worldwide."
[13] SWIFT, "Who we are," swift.com, 2024.
[14] Stefania Vitali, James B. Glattfelder, and Stefano Battiston, "The Network of Global Corporate Control," PLOS ONE 6, no. 10 (2011).
[15] FERC power flow analysis, reported by Wall Street Journal, March 2014; Utility Dive, "FERC: Nationwide blackout could happen if 9 key substations are knocked out."
[16] Lawrence Page, Sergey Brin, Rajeev Motwani, and Terry Winograd, "The PageRank Citation Ranking: Bringing Order to the Web," Stanford University, 1998.
[17] U.S. Department of the Treasury, "Unprecedented & Expansive Sanctions Against Russia," February 2022; CSIS, "Sanctions in Response to Russia's Invasion of Ukraine."
[18] William Dalrymple, "The Anarchy: The East India Company, Corporate Violence, and the Pillage of an Empire," Bloomsbury, 2019.
[19] Alphabet (Google) market capitalization data, CompaniesMarketCap.com.
[20] TeleGeography, Submarine Cable FAQ; ITU broadband statistics.
[21] ITU, "IMT-2020 (5G) standards"; historical cellular generation specifications.
[23] Friedrich A. Hayek, "The Use of Knowledge in Society," American Economic Review 35, no. 4 (1945): 519–530.
[24] GPS.gov, "Space Segment," U.S. Government.
[25] NPCI, UPI Product Statistics; Wikipedia, "Unified Payments Interface."
[26] Visa Inc., Fiscal Year 2023 Annual Report.
[27] Cisco, "Annual Internet Report (2018–2023) White Paper"; ITU estimates for 2024–2025.
[28] Internet Live Stats, "Google Search Statistics."