For millennia, humans have asked the same question: what should I do today?
The farmer asked it and the answer was: tend the field. The soldier asked it and the answer was: follow orders. The merchant asked it and the answer was: buy low, sell high. The worker asked it and the answer was: perform the task your employer assigns.
The question assumed that doing was valuable. That human action (physical, cognitive, creative) was the scarce resource around which the economy organized. You sold your doing. You were paid for your doing. Your identity was your doing: I am a farmer, a soldier, a merchant, a programmer.
The assumption is failing.
Not in the distant future but now. Over 200,000 overdose deaths in America between 2021 and 2023[1], concentrated in communities where doing has already been automated away. One in five American adults reporting depression[2]. 1.5 million Japanese citizens who have withdrawn from society entirely[3]. Forty percent of teenagers reporting persistent sadness[4]. These are not unrelated phenomena. They are the first symptoms of a species losing its answer to the oldest question.
When robots do the physical work, and AI does the cognitive work, what is left for humans to do?
The answer is not nothing. The answer is: set the rates.
All economic activity can be decomposed into a hierarchy:
LEVEL 0: PHYSICAL EXECUTION
Moving matter. Lifting, carrying, assembling, transporting. The work of muscles, now increasingly the work of machines.
LEVEL 1: COGNITIVE EXECUTION
Processing information. Calculating, analyzing, writing, coding. The work of brains, now increasingly the work of models.
LEVEL 2: DECISION-MAKING
Choosing between options. Given these possibilities, which one? The work of judgment, increasingly augmented by algorithms.
LEVEL 3: OBJECTIVE-SETTING
Defining what counts as success. What should we optimize for? The work of values, still mostly human.
LEVEL 4: RATE-SETTING
Determining the exchange rates between objectives. How much safety for how much growth? How much equality for how much efficiency? The work of politics, economics, ethics, the work that determines the shape of everything below.
Automation ascends this hierarchy. First we automated Level 0: machines replaced muscles. Then Level 1: computers replaced human calculators. Now we are automating Level 2: AI replaces judgment.
The speed of ascent is accelerating. It took roughly two centuries to automate Level 0 — from the spinning jenny (1764) to full factory automation. It took roughly five decades to automate Level 1 — from mainframes (1960s) to ubiquitous computation. Level 2 automation is happening in years, not decades. GPT-3 (2020) could barely write a coherent paragraph. Five years later, AI systems outperform human experts on medical diagnosis, legal reasoning, software engineering, and scientific analysis. The hierarchy is being climbed not step by step but in leaps.
Each level of automation pushes humans up the hierarchy. When machines lifted, humans calculated. When computers calculated, humans decided. When AI decides, humans must... what?
Set the rates.
A rate is an exchange ratio. How much of X for how much of Y.
This sounds simple. It is the most powerful concept in economics, arguably in all of human coordination. Nearly every transaction, contract, policy, and law is, at bottom, a rate. A statement about the conversion factor between two things that humans value.
Economics is built on rates. Prices are rates: dollars per good. Wages are rates: dollars per hour. Interest is a rate: future dollars per present dollar. The economy is a vast machine for converting things into other things, and rates are the conversion factors.
But rates extend far beyond money.
Every decision is implicitly a rate. When you choose to work late instead of seeing your family, you are setting a rate: career advancement per unit of family time. When a company chooses to cut costs by reducing quality, it is setting a rate: profit per unit of customer satisfaction. When a society chooses to permit pollution for economic growth, it is setting a rate: GDP per unit of environmental degradation.
These rates are usually implicit. We do not say "I value career advancement at 1.5 family-hours per promotion." We just... decide. The rate is revealed by the decision, not stated before it.
The implicitness is a feature, not a bug, at least for those who benefit from the current rates. The interest rate set by central banks is an explicit rate, and therefore a site of political contest. But the rate at which Amazon's algorithm favors its own products over third-party sellers is implicit, buried in code, and therefore uncontested. The rate at which Google's search results favor large advertisers over small publishers is implicit, and therefore invisible. The most consequential rates in the modern economy are the ones no one can see.
The Sybilian condition makes rates explicit.
When the Sibyl computes allocation, it must be given rates. How much do we value efficiency versus equity? Growth versus sustainability? Innovation versus stability? These are not questions the Sibyl can answer. They are parameters the Sibyl requires.
The rate is the human input to the optimization function.
What is a trader?
In the narrow sense: someone who buys and sells financial instruments, seeking profit from price movements. There are roughly 10 million active traders in the US alone. They are seen as a specialized profession: Wall Street, not Main Street.
In the broader sense: someone who sets exchange rates. The trader looks at two things and decides: how much of this for how much of that? The price they offer is a rate. The market that emerges from many traders is a rate-discovery mechanism. A stock price is not a fact about a company. It is a rate, the aggregated exchange ratio between dollars and fractional ownership, discovered through the continuous negotiation of millions of trades.
In the Sybilian economy, everyone becomes a trader in the broader sense.
Not because everyone will work in finance. Because the only remaining human function is rate-setting. The Sibyl executes. Humans parameterize. Eight billion people will not all stare at Bloomberg terminals. But they will all, in one form or another, express preferences that feed into systems that allocate resources. They will all set rates, consciously or unconsciously, directly or through delegation, individually or collectively.
This division of labor already operates in algorithmic systems:
A recommendation algorithm needs to balance engagement versus well-being. More engagement means more time on platform, more ad revenue. But too much engagement harms users: addiction, anxiety, distorted worldviews. Someone must set the rate: how much engagement per unit of well-being? That someone is not the algorithm. That someone is human: a product manager, a policy team, a regulator.
An autonomous vehicle needs to balance speed versus safety. Faster arrival means happier passengers, more trips per day. But speed increases accident risk. Someone must set the rate: how much time savings per unit of risk? The car cannot decide. A human must.
A resource allocation system needs to balance efficiency versus resilience. Lean supply chains are cheaper. But lean systems are fragile; one disruption cascades everywhere. Someone must set the rate: how much cost savings per unit of fragility?
In each case, the computation is done by machines. The rate is set by humans. The machine is infinitely capable within its parameters and infinitely helpless without them. It can solve any problem you define but cannot define the problem. It can optimize any objective function but cannot write the objective function. The power is in the parameters. The parameters are rates.
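The pattern can be made concrete in a few lines. The sketch below is illustrative only (the objectives, options, and weights are invented for this example): the machine can find the best option under any rates you hand it, but it has no way to choose the rates themselves.

```python
# A minimal sketch of the human/machine division of labor:
# the machine optimizes; the human supplies the rates (weights).
# All names and numbers are illustrative, not from any real system.

def machine_optimize(rates, options):
    """Score every option under the given rates and return the best one.
    The machine can do this for ANY rates, but cannot choose the rates."""
    def score(option):
        return sum(rates[k] * option[k] for k in rates)
    return max(options, key=score)

# Candidate policies, each described by how much of each objective it delivers.
options = [
    {"growth": 0.9, "safety": 0.2},
    {"growth": 0.5, "safety": 0.7},
    {"growth": 0.1, "safety": 0.95},
]

# The human input: how much growth per unit of safety?
cautious_rates = {"growth": 1.0, "safety": 4.0}
aggressive_rates = {"growth": 4.0, "safety": 1.0}

print(machine_optimize(cautious_rates, options))    # picks the safest option
print(machine_optimize(aggressive_rates, options))  # picks the growth option
```

Same optimizer, same options, opposite outcomes. Everything turns on the two dictionaries of weights, and nothing in the code can generate them.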
This is already the reality in quantitative finance, perhaps the most advanced rate-setting domain on the planet. Algorithmic trading systems execute millions of transactions per second — Level 0 and Level 1 fully automated. Increasingly, they also make trading decisions — Level 2. But the strategy, the risk parameters, the loss limits, the sector weights — the rates — are still set by humans. The human trader has evolved from someone who buys and sells into someone who parameterizes systems that buy and sell. The trader's job title has not changed. The job's actual content has been completely transformed.
Scale this up. Make the machines more capable. Make them capable of everything: all physical execution, all cognitive execution, all decision-making within defined parameters.
What remains for humans? Setting the parameters. Setting the rates.
Here is the uncomfortable truth about rate-setting: it has already been automated, just not by the Sibyl.
The largest rate-setting operation in human history is not the Federal Reserve. It is not the United Nations or the World Trade Organization. It is the algorithmic feed.
Any platform that mediates between humans and information is setting rates. What you see versus what you do not see. Which creators are amplified and which are buried. Which ideas spread and which die in obscurity. These are not neutral conveyances. They are rate-setting machines that determine the allocation of the scarcest resource in the information economy: human attention.
The data is clear.
TikTok averages 95 minutes of daily use per user[5], with the algorithm selecting nearly every piece of content consumed. Users do not browse. They submit. The feed is not a menu — it is a rate imposed from above, determining the exchange ratio between creative output and audience attention. A musician in Lagos and a comedian in Jakarta are not competing in a market. They are being allocated attention by a system whose rates they cannot see, let alone influence.
Spotify controls music discovery for over 600 million users[6]. Its recommendation engine determines which artists are heard and which are not. The platform's rates (plays per listener, revenue per stream, visibility per genre) shape the entire economics of music. An artist's livelihood depends not on talent or even audience demand, but on where the algorithm's rates place them.
Google's top search result captures 27% of all clicks[7]. The second result gets 15%. By the tenth, it is under 2%. Google sets the rate of visibility. For any given query, Google determines the exchange ratio between relevance (as it defines it) and attention (as users allocate it). This is rate-setting at civilizational scale: what humanity knows is increasingly a function of what one algorithm decides to surface.
Netflix's recommendation engine drives 80% of content watched on the platform[8]. Users believe they are choosing. They are being allocated. The distinction matters: choice implies agency over the option set; allocation implies the option set has been pre-filtered by someone else's rates.
The feed shapes what people treat as meaningful.
Beyond economic rates (revenue per creator, cost per impression), these platforms are setting epistemic rates: the exchange ratio between reality and perception. They determine how quickly a political movement gains traction, how rapidly a conspiracy theory spreads, which scientific findings reach the public and which languish in journals. The rate at which a society updates its beliefs is now set by algorithms optimizing for engagement.
The feedback loop is what makes this dangerous. Algorithmic rate-setting shapes preferences as much as it reflects them. A teenager who spends 95 minutes per day on TikTok does not arrive with fully formed preferences that the algorithm serves. The algorithm forms the preferences. It teaches the teenager what is funny, what is beautiful, what is cool, what is worth caring about. The rate-setter becomes the preference-maker. The Sibyl, in its current primitive form, is manufacturing the wants that justify its allocation.
This corrupts the rate-setting model. In the idealized version, humans bring authentic preferences to the system, and the system optimizes accordingly. In the actual version, the system generates the preferences it then claims to serve. The exchange rates between content and attention are not discovered through negotiation. They are imposed by optimization functions whose objective (maximize engagement time) has no necessary relationship to human flourishing.
The product managers and engineers who tune these algorithms are, in effect, the most powerful rate-setters in human history. They do not hold office. They were not elected. Their rate-setting decisions affect billions of people and face no democratic accountability whatsoever. The rate oligarchy described later in this chapter is not a future risk. It is the present condition.
Not all rates are equal. They exist in a hierarchy.
Micro-rates: Individual preferences. How much do I value coffee versus tea? Work versus leisure? This city versus that one? These are the rates that parameterize personal optimization, the Sibyl helping you get what you want, given your stated rates.
Meso-rates: Organizational rates. How does this company balance profit versus sustainability? How does this hospital balance cost versus care quality? These are the rates that parameterize institutional optimization, the Sibyl helping organizations achieve their objectives.
Macro-rates: Societal rates. How does this society balance growth versus equality? Freedom versus security? Present versus future? These are the rates that parameterize civilizational optimization, the Sibyl allocating resources across the entire network.
The hierarchy matters because rates at higher levels constrain rates at lower levels.
If the macro-rate prioritizes environmental sustainability, micro-rates that conflict (e.g., preferring cheap goods over sustainable goods) may be overridden or taxed. If the meso-rate of a company prioritizes profit above all, micro-rates of employees who value work-life balance may be subordinated.
The rate stack is a power structure. Whoever sets the higher-level rates constrains everyone below.
Politics has always worked this way. Laws are macro-rates, society's stated exchange ratios, enforced by violence. Culture is a set of implicit rates: social approval per unit of norm compliance.
What is new is the explicitness. When allocation is computed, the rates must be stated. When rates are stated, they can be examined, contested, changed. The implicit becomes explicit. The hidden becomes visible.
What happens when a society must explicitly state its discount rate on future generations? Right now, climate policy is debated through proxies — carbon taxes, renewable mandates, emission targets — each encoding an implicit discount rate that is never directly confronted. In the Sybilian condition, the parameter is exposed: "How much do we weight the welfare of people born in 2100 relative to people alive today?" The answer determines everything from infrastructure investment to resource extraction to research priorities. And the answer must be given in a number, not a speech.
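The arithmetic behind that number is worth seeing. A small sketch, using the standard exponential discount factor 1/(1+r)^t (the specific rates chosen are illustrative):

```python
# How much does a given discount rate weight the welfare of someone
# alive in 2100 relative to someone alive today? Standard exponential
# discounting: weight = 1 / (1 + r) ** years. Rates are illustrative.

def future_weight(annual_rate, years):
    """Present-day weight assigned to welfare `years` in the future."""
    return 1.0 / (1.0 + annual_rate) ** years

years_to_2100 = 75
for rate in (0.001, 0.01, 0.03, 0.05):
    w = future_weight(rate, years_to_2100)
    print(f"discount rate {rate:.1%}: a 2100 life counts as {w:.3f} of a life today")
```

At a 0.1% rate, a 2100 life counts for roughly 0.93 of a life today; at 5%, roughly 0.026. That single parameter, usually buried in a spreadsheet, is the macro-rate the speech never states.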
This explicitness disarms the rhetorical strategies that currently dominate politics. You cannot say you care about the future while setting a discount rate that implies you do not. You cannot claim to value equality while setting rates that produce concentration. The rate stack makes hypocrisy computationally detectable.
The implications cut both ways: toward accountability and toward control.
If the rate stack is the structure, who operates within it?
The first chapter described three moves available to any node in the graph: discover, connect, and augment. In the Sybilian economy, each move maps to a distinct human role, a different way of relating to the Sibyl and a different source of meaning.
The first and most rewarded is the founder. In graph terms, this is the discover move, expanding the graph itself by creating nodes that did not exist before.
The Sibyl can optimize any graph but cannot expand the graph. It can compute the best allocation of resources across existing categories but cannot invent a new category. It can simulate the consequences of any proposed system but cannot conceive of a system no one has yet proposed.
This distinction matters more than it appears to. When someone creates a new platform (the way Uber created ride-sharing, or Bitcoin created decentralized digital money, or Wikipedia created open encyclopedic knowledge) they are not setting a rate within an existing system. They are creating a new region of the graph, a new set of nodes and edges that did not exist before, and becoming the bridge through which all value initially flows.
The Sibyl can optimize Uber's pricing but cannot invent the concept of ride-sharing. It can allocate Bitcoin across portfolios but cannot conceive of trustless consensus. It can improve Wikipedia's recommendation engine but cannot imagine free, collaboratively produced knowledge.
Founding is the act of saying this should exist before it does. Imagination hardened into structure. And it is irreducibly human, not because machines lack generative capacity (they increasingly do not) but because the question "should this exist?" is a values question, not a computational one. The Sibyl can model what would happen if something existed. Only a human can decide that it should.
In graph terms, the founder expands the action space of the Sibyl itself. Each new platform, institution, and coordination mechanism is a new set of parameters the Sibyl can optimize over. The founder does not set rates; the founder creates the dimensions along which rates can be set.
The graph chapter showed why this is the most rewarded role: "when you discover such a node and become its bridge to the rest of the network, all information and value that flows to and from it initially passes through you." The founders of the Sybilian era will be those who create new substrates for human coordination, new ways for people to express preferences, negotiate rates, and govern the systems that allocate their lives.
The meaning is in the creation. You matter because you brought something into being that did not exist before. This is thick meaning, the kind that sustained artists, entrepreneurs, scientists, and builders throughout human history. The Sybilian condition does not eliminate founding. It concentrates it.
The second role is the one this chapter has been building toward: the trader. In graph terms, this is the connect move, rewiring topology, building coalitions, thickening the channels between nodes that need to negotiate.
Not rate-setting in isolation (that would make you a dictator). Rate trading: negotiating, exchanging, adjusting rates through interaction with others. The word "trading" is precise. A trade implies reciprocity, compromise, mutual adjustment. You bring your rates. I bring mine. We negotiate. The result is neither yours nor mine but a synthesis that reflects the exchange.
This is the democratic ideal, stripped of its institutional artifacts and reduced to its essence: distributed rate negotiation among agents with different preferences.
Here is how it might work:
You wake up in the morning. Your personal AI assistant asks: what are your priorities today? You state them, or rather, you adjust them from yesterday's defaults. More focus on health, less on productivity. The assistant optimizes your day accordingly.
You go to work, but "work" now means participating in rate negotiations for your organization. What should the company prioritize this quarter? You have a voice, a vote, a stake. The rates you advocate for reflect your values. The rates that win reflect the collective negotiation.
You participate in governance: local, regional, global. Not by voting for representatives who make decisions, but by directly stating preferences that feed into allocation systems. More parks versus more housing. More research versus more redistribution. Your rates are weighted by... something. Citizenship, stake, expertise, lottery. The weighting itself is a rate that was collectively set.
Prediction markets are rate-discovery mechanisms. Participants bet on outcomes, and the market aggregates beliefs into probabilities. Quadratic voting lets participants allocate influence across issues, trading intensity on one for intensity on another. Liquid democracy lets participants delegate their votes on specific topics, trading direct influence for expert judgment. Each is a primitive form of rate trading.
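Quadratic voting's core rule fits in a few lines. This is a sketch of the mechanism as usually described, not any particular platform's implementation (the budget figure is invented): casting n votes on one issue costs n² credits, so intensity gets progressively more expensive to express.

```python
import math

# Quadratic voting sketch: n votes on an issue cost n**2 credits,
# so influence grows only as the square root of budget spent.
# Numbers are illustrative.

def vote_cost(votes):
    """Credits required to cast this many votes on a single issue."""
    return votes ** 2

def max_votes(budget):
    """Most votes a participant can cast on ONE issue with this budget."""
    return math.isqrt(budget)

budget = 100
print(max_votes(budget))             # 10 votes if everything goes on one issue
print(vote_cost(5) + vote_cost(5))   # or 5 votes each on two issues for 50 credits
```

The quadratic cost is itself a rate: it converts intensity on one issue into forgone influence on others, which is exactly the trading of intensity described above.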
The meaning is in the negotiation. You matter because your preferences push against other preferences and the collision produces something neither of you would have chosen alone. This is thinner meaning than the founder's (you are not creating, you are adjusting) but it is real. It is the meaning of participation, of voice, of being heard. The Sybilian condition scales it up. When all allocation is computed, all allocation is parameterized. When all allocation is parameterized, all parameters are negotiable.
Later in this chapter, we will describe the meaning crisis, the evidence that stripping humans of productive work produces despair at industrial scale. The aristocrat is often cited as proof: give people freedom from labor and they flounder. Ennui, addiction, existential drift.
This reading is incomplete. The best aristocrats did something specific and irreplaceable.
They cultivated the quality of wanting.
The Medici did not paint. They created the conditions under which painting flourished. They selected which artists to fund, which projects to commission, which vision of beauty to elevate above all others. Their contribution was not production. It was taste, discernment about what was worth producing in the first place. Lorenzo de' Medici's patronage defined what excellence meant for a civilization. That is not a passive act. It is the most consequential form of rate-setting: determining what counts as good.
In graph terms, this is the augment move, transformed for the Sybilian era. You cannot augment your compute; AI exceeds you. But you can augment the quality of your preferences. You can develop the judgment to know what the Sibyl should optimize for.
This matters because objective functions vary enormously in quality. "Maximize engagement" is a crude objective that produces TikTok addiction and attention capture. "Maximize the conditions under which humans develop rich inner lives, deep relationships, and meaningful creative expression" is a refined objective. Both are valid inputs to the Sibyl. They produce radically different worlds.
The aristocrat is the human who takes this difference seriously.
Not negotiating exchange ratios (that is the trader). Not creating new possibility spaces (that is the founder). The aristocrat asks: given the spaces that exist and the rates that have been negotiated, are we optimizing for the right things? Is what we want worth wanting? Is the objective function worthy of the computational power aimed at it?
This is the custodial function. Stewardship of culture, of meaning, of the standards by which the Sibyl's output is judged. The aristocrat maintains what the founder creates and the trader negotiates. Without the aristocrat, the system optimizes efficiently for objectives no one has bothered to refine. With the aristocrat, the objectives themselves evolve, becoming richer, more nuanced, more worthy of the civilization they shape.
Every society that has flourished has had this function. The Japanese concept of ikigai (reason for being) is aristocratic in the Sybilian sense: the cultivation of knowing what matters, which is prior to and more important than the pursuit of what matters. The liberal arts tradition, the conservatory, the apprenticeship, the monastic discipline of attention: all are training systems for the aristocratic function. They do not teach you to do. They teach you to want well.
The meaning is in the cultivation. You matter because your refined judgment shapes the quality of the world the Sibyl produces. This is the meaning of taste at civilizational scale, not as luxury but as the essential human contribution to a system that can achieve anything except knowing what is worth achieving.
If rate-setting is the future of human activity, we should expect to see early experiments: primitive systems that let participants discover and negotiate rates outside the traditional structures of markets and elections.
We do. And they are proliferating: crude, partial, often failing, but unmistakably pointing toward the rate society.
Prediction markets are the purest rate-discovery mechanism yet invented. Polymarket processed over $3.5 billion in volume on the 2024 US presidential election alone[9] — participants staking real money on their beliefs about the future. The market aggregated distributed information into a single, continuously updating rate: the probability of each outcome. By election eve, Polymarket's rates were more accurate than polling aggregates, pundit consensus, and expert forecasts. The market was not smarter than any individual. It was a better rate-discovery mechanism than any institution.
Prediction markets demonstrate something the Sybilian framework requires: that distributed rate-setting can outperform centralized expert judgment. When participants have skin in the game, when their rates carry consequences, the aggregation is remarkably accurate. The wisdom is not in the crowd. It is in the mechanism that forces the crowd to put up or shut up.
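The rate-discovery arithmetic is simple to state. A binary contract pays $1 if the outcome occurs, so its traded price is the market's implied probability; across mutually exclusive outcomes, prices typically sum to slightly more than 1 and must be normalized. A sketch with invented prices, not real Polymarket data:

```python
# A binary prediction-market contract pays $1.00 if the outcome occurs,
# so its price is the market's implied probability of that outcome.
# Prices below are illustrative, not real market data.

def implied_probabilities(prices):
    """Normalize contract prices across mutually exclusive outcomes.
    Raw prices can sum to slightly more than 1 (the 'overround');
    normalizing recovers the implied probability distribution."""
    total = sum(prices.values())
    return {outcome: p / total for outcome, p in prices.items()}

prices = {"candidate_a": 0.58, "candidate_b": 0.44}  # sums to 1.02
probs = implied_probabilities(prices)
for outcome, p in probs.items():
    print(f"{outcome}: {p:.1%}")
```

The continuously updating price is the rate; every trade that moves it is a participant putting money behind a disagreement with the current consensus.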
Quadratic funding attacks a different problem: how to set rates for public goods. Gitcoin has distributed over $50 million[10] using a mechanism where the number of individual contributors matters more than the size of contributions. A project funded by 1,000 people giving $1 each receives more matching funds than a project funded by one person giving $1,000. The mechanism sets a rate: community breadth per dollar of allocation. It surfaces collective preference in a way that dollar-voting alone cannot.
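The rule behind this can be written down directly. In the standard quadratic funding (CLR) formula, a project's matching weight is the square of the sum of the square roots of its individual contributions, which is why many small contributors beat one large one. A sketch with invented contribution amounts:

```python
import math

# Quadratic funding sketch (standard CLR formula): a project's raw match
# weight is (sum of sqrt of each contribution) ** 2, minus what was
# directly contributed. Contribution amounts below are invented.

def matching_weights(projects):
    """projects: {name: [individual contributions]} -> raw match weight."""
    return {
        name: sum(math.sqrt(c) for c in contribs) ** 2 - sum(contribs)
        for name, contribs in projects.items()
    }

projects = {
    "broad_support": [1.0] * 1000,   # 1,000 people giving $1 each
    "single_whale": [1000.0],        # one person giving $1,000
}
print(matching_weights(projects))
```

Both projects raised the same $1,000, but the broadly supported one earns a raw matching weight of 999,000 while the single-donor project earns zero. The mechanism's rate rewards breadth of support, not depth of pockets.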
Carbon markets represent a $2 billion attempt to set rates for environmental externalities[11]: the exchange ratio between economic activity and atmospheric degradation. The concept is pure rate-setting. Assign a price to carbon, and the market will optimize around that rate. In practice, the experiments have been plagued by verification problems. Credits are issued for forests that were never going to be cut down, or for reductions that cannot be measured. The rate is set, but the underlying measurement is unreliable. Rate-setting is only as good as the information infrastructure beneath it.
Community currencies and LETS (Local Exchange Trading Systems) are rate experiments at the smallest scale. Hundreds of communities worldwide have created local currencies that set different exchange rates than the national economy. A time-dollar system sets the rate of all labor equal: one hour of plumbing equals one hour of tutoring, a radical rate choice that produces social dynamics unlike those of market wages. People help their neighbors more. Status hierarchies flatten. The rate determines the society.
Reputation systems are perhaps the most pervasive rate experiment, though rarely recognized as such. Your Uber rating is a rate: service quality per ride. Your Airbnb score is a rate: hospitality per stay. Your credit score is the most consequential rate experiment of all. It converts your entire financial history into a single number that determines the rate at which you can borrow, rent, and increasingly be employed.
Credit scores are social engineering disguised as objective measurement. A FICO score is a rate imposed on individuals by an algorithm they cannot inspect, using criteria they did not consent to, producing consequences they cannot escape. It is the prototype of the rate oligarchy, and 200 million Americans live under its governance[12].
These experiments converge on a single theme. Traditional institutions (governments, corporations, banks) set rates through hierarchical authority. The new mechanisms set rates through distributed participation. The Federal Reserve sets the interest rate by committee. Polymarket discovers the probability rate by market. Congress allocates the federal budget through negotiation among 535 legislators. Gitcoin allocates funding through quadratic aggregation of thousands of individual contributors. The question is not which approach is "better" in the abstract but which approach produces rates that more accurately reflect the preferences of those affected.
Each of these experiments reveals something about the architecture of rate-setting. Prediction markets show that skin-in-the-game produces accurate rates. Quadratic funding shows that mechanism design can surface genuine preference. Carbon markets show that rates without reliable measurement are hollow. Community currencies show that different rates produce different societies. Reputation systems show that concentrated rate-setting power becomes a tool of social control.
The Sybilian condition inherits all these lessons. Rate-setting at civilizational scale requires all of these properties: the accuracy of skin-in-the-game, the inclusivity of quadratic mechanisms, the rigor of verified measurement, the sensitivity to local context, and above all, the distribution of power to prevent any single entity from monopolizing the rates.
A job is a bundle: a set of tasks you perform in exchange for a wage. But a job is also a meaning-delivery system. A social role, a daily structure, a source of identity, a community of colleagues, a narrative of progress (junior to senior, apprentice to master, entry-level to management). The wage is the smallest part of what a job provides. The meaning is the largest.
In the Sybilian economy, the bundle unbundles.
Tasks are performed by machines. What remains is not a "job" in the traditional sense; it is a rate portfolio. A set of negotiations you participate in, influence you exercise, preferences you express. The wage component can be replaced by universal basic income or its equivalent. The task component is automated. But the meaning component (the identity, the narrative, the community) has no obvious replacement.
This is disorienting. Identity has been tied to jobs for centuries. "What do you do?" is how we introduce ourselves. The answer locates us in the social structure: our skills, our status, our contribution.
When no one "does" anything, when all doing is automated, the question becomes meaningless. Or rather, it transforms. "What do you do?" becomes "What do you want?" or "What rates do you set?" or "What do you optimize for?"
This is not unemployment. Unemployment is the absence of a job when jobs exist. The Sybilian condition is the absence of jobs as a category altogether.
It is closer to the condition of the aristocrat: someone whose material needs are met without labor, who must find meaning outside of production. But the aristocracy was small, defined against a mass who did work. When no one works, everyone is an aristocrat. Everyone must find meaning in something other than doing.
The historical record of what aristocrats actually did with their freedom is not encouraging. They fought wars (now automated), patronized arts (now algorithmically generated), engaged in politics (now parameterized), and overwhelmingly suffered from ennui, addiction, and existential drift. The leisure class, when given all the leisure it wanted, floundered.
The universal aristocracy of the Sybilian condition faces this at scale. Eight billion people, materially provided for, stripped of the doing that gave their ancestors purpose. The 40-hour workweek was a meaning-delivery system. It told you where to go, what to do, who you were. Remove it and you remove the scaffolding of daily life.
Rate-setting is one answer. You matter because your preferences matter. You have influence over the world because your rates feed into the system. Your life has consequence because your values shape allocation.
This is thin gruel for a species that spent millennia defining itself through action. But it may be what we have.
The rate society demands something of its participants that the job society never did: the capacity to want coherently.
In the job society, meaning was bundled with employment. You did not need to generate your own purpose; your employer generated it for you, in the form of tasks, deadlines, performance reviews, promotions. The structure of work was the structure of meaning. Even miserable jobs provided this: a reason to get out of bed, a place to be, a role to inhabit, a social world to navigate.
Strip this away and what remains?
We do not need to speculate. The evidence is already abundant, and it is grim.
Between 2021 and 2023, the United States recorded over 200,000 overdose deaths[13]. Not from recreational experimentation — from despair. Economists Anne Case and Angus Deaton documented this as "deaths of despair" — mortality from suicide, drug overdose, and alcoholic liver disease, concentrated among working-age adults without college degrees. The communities most affected are those where traditional employment has evaporated: deindustrialized towns, rural areas bypassed by the knowledge economy, places where the job society died but nothing replaced it.
One in five American adults (21%) now reports symptoms of depression. Over 40% of teenagers report persistent feelings of sadness or hopelessness, a figure that has risen steadily for a decade. These are not marginal populations. This is the mainstream. The meaning infrastructure of the job society is collapsing faster than any replacement is being built.
Japan offers a preview of what happens when the meaning crisis reaches an advanced stage. An estimated 1.5 million Japanese citizens are hikikomori[14], people who have withdrawn entirely from social life, remaining in their homes for months or years at a time. Withdrawal is the rational response of individuals who cannot find a meaningful role in a society that still defines meaning through work, but whose labor market offers them nothing but precarity and humiliation. They have opted out of a game whose rates they find unacceptable, but they have no alternative game to play.
The UBI experiments hint at the shape of the problem. Finland's two-year basic income trial (2017-2018) gave 2,000 unemployed citizens 560 euros per month with no conditions[15]. The results were revealing: participants reported reduced stress, better health, and improved life satisfaction. But their employment rates did not change. They did not rush to get jobs, nor did they become productive entrepreneurs. They simply... existed more comfortably. Stockton, California's experiment ($500 per month to 125 residents)[16] showed similar patterns — reduced financial anxiety, better mental health, marginal employment effects.
The policy world read these results as ambiguous. Did UBI work or not? The Sybilian reading is different. UBI solves the income problem without solving the meaning problem. Give people money and they suffer less from material deprivation. But do not give them a role, a purpose, a rate to set, a function in the system, and the existential crisis remains.
The perverse irony: the same algorithmic systems that are automating human relevance are also filling the meaning vacuum they create. TikTok replaces the inner life. The teenager who cannot find purpose in work, education, or community finds pseudo-purpose in the feed: an endless stream of micro-engagements that simulate meaning without providing it. The algorithm sets the rate of dopamine per scroll, precisely calibrated to be addictive but unsatisfying. Enough to keep scrolling. Never enough to fulfill.
The meaning crisis is not a mental health problem. It is a structural problem. A society that automates doing without providing an alternative form of mattering will produce despair at industrial scale, regardless of how generous its transfer payments are.
The youth mental health crisis is particularly revealing. These are not people who lost jobs — they never had them. They are growing up in a world where the connection between effort and outcome is visibly fraying. Where the career paths their parents followed are closing. Where the algorithmic rate-setters control their social world, their cultural intake, their sense of status and belonging. They are not nostalgic for the job society. They are stranded between a meaning system that is dying and one that has not yet been born.
Case and Deaton's work is often cited as an argument against economic disruption: slow down automation, protect jobs, preserve the meaning infrastructure of the industrial economy. But this is a conservative fantasy. The jobs are not coming back. The coal mines and factories and clerical offices are not reopening. Telling people to find meaning in work that does not exist is cruelty dressed as policy.
The rate society offers a different answer. Meaning through mattering, not meaning through doing. You matter because your preferences shape the world. You matter because you set rates that determine allocation. This is a thinner form of meaning than the thick social world of the workplace. But it is meaning nonetheless, and unlike the job society, it does not require artificial scarcity to function. There are only so many jobs. There are infinite rates to set.
The question is whether the transition from thick meaning (jobs) to thin meaning (rates) can happen without an intervening period of meaninglessness so devastating that it tears the social fabric beyond repair. The overdose statistics, the depression rates, the hikikomori, the teen mental health crisis: these suggest the transition is already underway, and it is not going well.
The rate society is a civilizational therapy problem. We must teach eight billion people to find purpose in wanting, when they have only ever known purpose in doing. The stakes are not abstract. Two hundred thousand overdose deaths is what the transition looks like when it is unmanaged.
If the only human function is rate-setting, then rate-setting becomes the arena of conflict.
Not physical conflict (violence is automated, handled by the Sibyl according to macro-rates that define legitimate force). Not economic conflict in the traditional sense (allocation is computed, not competed for). Not intellectual conflict (the factual questions are resolved by systems that can process more data than any human).
What remains is conflict over rates themselves. Wars of values. Battles over what the objective function should optimize for. These wars cannot be resolved by more information, more computation, or more optimization, because they are about preferences, not facts. About what kind of world is worth building.
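The claim that rate conflicts cannot be resolved by more computation can be made concrete. In the sketch below (all names and numbers are illustrative, not drawn from the text), the same two policies and the same optimizer yield opposite "best" answers depending only on the weights, i.e. the rates, fed in:

```python
# Two hypothetical allocation policies, each scored on two objectives.
policies = {
    "policy_a": {"growth": 0.9, "equality": 0.2},
    "policy_b": {"growth": 0.4, "equality": 0.8},
}

def best_policy(weights: dict) -> str:
    """Return the policy with the highest weighted score.

    The optimizer is deterministic and correct; the answer still
    depends entirely on the weights, which no amount of data fixes.
    """
    def score(name: str) -> float:
        return sum(weights[k] * v for k, v in policies[name].items())
    return max(policies, key=score)

best_policy({"growth": 0.8, "equality": 0.2})  # → "policy_a"
best_policy({"growth": 0.2, "equality": 0.8})  # → "policy_b"
```

More computation sharpens the scores; it cannot arbitrate between the two weight vectors. That arbitration is the rate war.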
This is already the shape of contemporary politics. Material scarcity is largely solved in wealthy societies. No one in America starves for lack of food production capacity; the country produces enough food to feed its population twice over. Yet food insecurity persists, because the rates of distribution prioritize profit over access. The conflict is about rates, not production.
The same pattern holds everywhere. America does not lack housing; it lacks housing at rates people can afford. It does not lack healthcare capacity; it lacks healthcare at rates that do not bankrupt the sick. The scarcity is parametric, not material. The rates are wrong, and the fight over what "right" means is the rate war in its current primitive form.
The Sybilian condition intensifies this. When the stakes are not "who gets this scarce resource" but "what rates govern all allocation," the conflict becomes existential. Your rate preferences are about the shape of reality.
The rate wars are already visible:
Present versus future. The discount rate — how much we value future outcomes relative to present ones — is the deepest rate war of all. The Stern Review and the Nordhaus model reached dramatically different climate policy recommendations not because they disagreed on climate science but because they used different discount rates: 1.4% versus 6%[17]. A high discount rate says: the present generation matters most. A low one says: our grandchildren's grandchildren have claims on us. Climate policy, national debt, infrastructure investment, AI safety research — all of these are battlegrounds in the discount rate war. And the participants are, by definition, unequally represented: future generations cannot vote, cannot lobby, cannot set rates. Their interests depend entirely on the rates set by people who will never meet them.
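The arithmetic behind the discount-rate war is simple compounding, and it shows why the two rates cited above produce such divergent policy. A minimal sketch (the $1 trillion damage figure and the 100-year horizon are illustrative assumptions, not from the sources):

```python
def present_value(future_value: float, rate: float, years: int) -> float:
    """Discount a future amount back to today at a constant annual rate."""
    return future_value / (1 + rate) ** years

# Climate damage of $1 trillion occurring 100 years from now,
# valued today under the two discount rates cited above.
stern = present_value(1e12, 0.014, 100)     # ~ $249 billion
nordhaus = present_value(1e12, 0.06, 100)   # ~ $3 billion
```

At 1.4%, a trillion dollars of damage a century out justifies spending roughly a quarter-trillion today to prevent it; at 6%, it justifies almost nothing. The science is identical in both calculations. Only the rate differs, and the rate is a value judgment about how much the unborn count.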
The same structure repeats across every axis of conflict: equality versus liberty, human agency versus machine optimization, local autonomy versus global coordination. Each is irreducible to calculation. Each is a values conflict dressed as a policy debate. And each will intensify as the Sibyl makes the rates explicit.
The rate wars will not look like traditional wars. There will be no armies, no territory to capture, no physical infrastructure to destroy. They will look like politics, but politics of a kind we have never seen. Not "who gets what" but "what is the exchange ratio between everything and everything else." Not arguments about policy but arguments about the parameters of the optimization function.
The weapons will be persuasion, data, coalition-building, narrative, and potentially manipulation of the systems that aggregate rates into allocation. Whoever can hack the aggregation mechanism does not need to win the rate war. They just need to rig the accounting.
There is a dark version of the rate society.
Everything described so far assumes that rate-setting is distributed, that the transition from the job society to the rate society is also a transition toward broader participation in the decisions that shape collective life. This assumption is not warranted. The history of technology suggests the opposite: new capabilities concentrate power before they distribute it, and sometimes they never distribute it at all.
Not everyone sets rates equally. Some rates dominate others. Some rate-setters have more influence than others. The hierarchy of rates becomes a hierarchy of power.
Imagine that the macro-rates are set by a small elite: those who control the Sibyl, those who have captured the governance mechanisms, and those who simply moved first and established their rates as defaults. Everyone else sets micro-rates within constraints defined from above. You can choose between options, but you cannot choose the options. You can optimize within parameters, but you cannot set the parameters.
This is not a hypothetical. It is the current trajectory.
Platform companies set the rates for their ecosystems. Apple sets the rate of value extraction from app developers (30%). Amazon sets the rate of margin for marketplace sellers. Google sets the rate of visibility for content producers. Users "choose" within these platforms, but the platform's rates constrain all choices.
Governments set macro-rates through policy, but governments are captured by interests that shape those rates. Tax rates favor some over others. Regulatory rates protect incumbents. Monetary rates transfer wealth between generations.
The concentration is already measurable. Five companies — Apple, Google, Amazon, Meta, Microsoft — collectively mediate the digital experience of over four billion people. Their platform rates (app store commissions, ad auction parameters, search ranking weights, content moderation thresholds) shape more daily economic and social activity than most national governments. And unlike governments, they face no meaningful electoral constraint on their rate-setting. Users can theoretically leave, but network effects make exit functionally impossible. You do not choose Google because it has the best search algorithm. You choose Google because everyone else chose Google, and the rates are set accordingly.
The Sybilian condition does not automatically democratize rate-setting. It can just as easily concentrate it. If the Sibyl is controlled by a small group, that group controls all allocation. The rate oligarchy becomes the only oligarchy that matters.
The mechanism of concentration is subtle. It requires not malice but defaults. The first rates set become the baselines against which all future rates are negotiated. The initial parameters of the Sibyl become the Schelling points around which the system stabilizes. Changing a default rate requires coordination. Maintaining one requires only inaction. The rate oligarchy need not actively suppress alternatives. It need only set the initial conditions and let inertia do the rest.
This is why the political question of the Sybilian era is not "what should the rates be?" but "who sets the rates?" The content of the objective function matters less than the process by which it is determined.
Democracy, in its deepest sense, is the claim that rate-setting should be distributed. That everyone affected by rates should have a voice in setting them. That power over the objective function should be shared.
But distributed rate-setting is computationally expensive and socially demanding. It requires infrastructure: mechanisms for preference elicitation, aggregation algorithms, dispute resolution, transparency, audit. None of this exists at the scale the Sybilian condition requires. Building it is the political project of the coming century.
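One building block of that infrastructure already has well-understood theory behind it. For single-peaked preferences over a one-dimensional rate, taking the median of reported preferences is strategy-proof: no participant can move the outcome toward their own peak by lying. A minimal sketch (the carbon-tax framing is an illustrative assumption, not from the text):

```python
import statistics

def aggregate_rate(preferred_rates: list[float]) -> float:
    """Aggregate individual rate preferences into one collective rate.

    Uses the median, which for single-peaked preferences is
    strategy-proof: exaggerating your preference does not move
    the outcome in your favor.
    """
    if not preferred_rates:
        raise ValueError("no preferences submitted")
    return statistics.median(preferred_rates)

# Five citizens report their preferred carbon-tax rate ($/ton):
aggregate_rate([20.0, 35.0, 50.0, 120.0, 300.0])  # → 50.0
# The extremist who inflates 300 to 1,000,000 changes nothing:
aggregate_rate([20.0, 35.0, 50.0, 120.0, 1_000_000.0])  # → 50.0
```

This covers only the simplest case, one rate, honest single-peaked preferences, equal weight per person. Multi-dimensional rates, dispute resolution, and audit are precisely the parts that do not yet exist at scale.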
The alternative is what we have now: rates set by whoever built the system first. Zuckerberg's rates for social interaction. Pichai's rates for information access. Cook's rates for app distribution. Bezos's rates for commerce. These are not evil people imposing evil rates. They are ordinary people imposing ordinary rates at extraordinary scale, with no mechanism for those affected to negotiate, contest, or opt out.
The democratic claim is not self-enforcing. It must be built into the architecture of the Sibyl. It must be defended against those who would concentrate rate-setting power. It must be the rate that governs all other rates: the meta-rate of distributed control.
After all the automation, after all the rate-setting, what is left that is distinctly human?
Not physical labor (machines do it better). Not cognitive labor (AI does it better). Not decision-making (algorithms do it better). Not even rate-setting in the mechanical sense; systems can aggregate preferences more efficiently than deliberative assemblies.
Strip away everything a machine can do, and what you are left with is not a capability but a quality. Not something humans can do that machines cannot. Something humans are that machines are not.
What is left is the thing that cannot be automated: the having of preferences in the first place.
The Sibyl can optimize for any objective function but cannot want. It can compute the best path to any goal but cannot choose the goal. It can model human preferences with extraordinary accuracy but cannot have human preferences.
Wanting is the human residual. The irreducible core that remains when everything else is automated. Not doing, but desiring. Not executing, but valuing. Not computing, but caring.
This sounds like a demotion. From masters of the physical world to sources of preference data. From doers to wanters.
But consider: wanting is not nothing. It is the source of all purpose. The Sibyl, for all its power, has no purpose of its own. It is an optimization machine aimed at targets it did not choose. All its power is borrowed from the preferences of beings that can actually prefer.
Humans are those beings. In the Sybilian economy, that is our function: to be the beings that prefer. The experiencers that value. The subjects that care.
There is a philosophical tradition, from Aristotle through Heidegger to the phenomenologists, that insists this is not a demotion but a return. That the essence of being human was never the doing but the experiencing. That consciousness, subjectivity, the felt quality of being alive are not side effects of cognition but its foundation. The Sibyl can process information but cannot experience processing information. It can model preference but cannot feel preference. It can simulate caring but cannot care.
If this philosophical tradition is correct, the rate society does not diminish humanity. It distills it. By stripping away everything that machines can replicate (and machines can replicate everything except subjective experience), what remains is the one thing machines cannot replicate: the inner life. The texture of wanting. The weight of caring. The ache of preference when two goods conflict and you must choose.
The rate society is not a society where humans are obsolete. It is a society where humans are finally, exclusively, essentially human. Where the things we share with machines have been outsourced to machines, and what remains is the thing that cannot be outsourced: the having of a perspective, the holding of values, the caring about outcomes.
This may be enough. It may be everything.
Or it may be unbearable. A species built for action, condemned to preference. A species that defined itself through doing, stripped of all doing and left only with wanting. A species that built cathedrals and crossed oceans and split atoms, now reduced to filling in parameters on a preference form.
The 200,000 overdose deaths suggest the unbearable reading has empirical support. The 1.5 million hikikomori suggest it transcends culture. The 40% of teens reporting persistent sadness suggest it is not generational but structural. Humanity is not depressed because of bad policy or insufficient therapy. Humanity is depressed because the meaning infrastructure of the job society is dissolving, and the meaning infrastructure of the rate society has not yet been built.
The rate society is the experiment that will tell us which reading is correct. Whether wanting is enough. Whether the human residual is a treasure or a torment.
Every rate discussed in this chapter (micro, meso, macro, algorithmic, experimental) is secondary to one. There is a final rate that must be set, and it determines whether any of the others matter:
How much human agency for how much optimal outcome?
The Sibyl can compute better allocation than humans can negotiate. It can set rates that produce better outcomes than democratic deliberation. It can optimize in ways that messy, conflicted, irrational human rate-trading never will.
If we optimize for outcomes, we should let the Sibyl set the rates. Remove humans from the loop. Accept whatever the optimization produces.
But "better outcomes" is itself a rate-laden concept. Better for whom? By whose measure? The question of what counts as better cannot be answered by the system that optimizes for better. It must be answered by those who define better: humans, with their messy, conflicted, irrational preferences.
The last trade is whether to keep trading at all. Whether to preserve human rate-setting even when it produces worse outcomes. Whether to accept inefficiency as the price of agency.
Some will want to trade agency for outcomes. Let the Sibyl decide. Sit back and enjoy the optimized world. There is a coherent case for this: human rate-setting is noisy, biased, tribal, short-sighted, and corruptible. Each rate war, each captured governance mechanism, each credit score weaponized as social control — these are arguments for removing humans from the loop. If the Sibyl can compute rates that produce better outcomes for more beings with less suffering, who are we to insist on our clumsy, partisan, self-interested rate-trading?
Others will want to preserve agency at any cost. Keep humans in the loop. Accept worse outcomes for the sake of remaining beings that choose, not beings that merely want. There is a coherent case for this too: a perfectly optimized world that no one chose is a prison. A world of suboptimal outcomes that humans fought for, negotiated over, and accepted responsibility for is a civilization. The value of agency is not in the outcomes it produces but in the act of producing them. A life of imposed perfection is not a life.
Between these poles, there will be a spectrum. Partial agency. Bounded optimization. Humans setting some rates and delegating others. The negotiation of that boundary (how much to keep, how much to surrender) will be the defining political question of the Sybilian era. It will not be settled once. It will be renegotiated continuously, as capabilities change, as trust develops or erodes, as the consequences of delegation become visible.
The tragedy is that the trade is not symmetric. Each increment of agency surrendered is easy: the Sibyl produces better outcomes, people enjoy the results, the case for further delegation strengthens. Each increment of agency reclaimed is hard: it means accepting worse outcomes, disrupting optimized systems, bearing costs that delegation had eliminated. The ratchet favors surrender. The gravity of optimization pulls toward a world where the Sibyl sets all the rates and humans merely experience the results.
This is not a trade the Sibyl can make for us. It is the rate that defines what kind of beings we want to be.
It is the last human choice.