Intro
You find yourself in 2040. Everything seems wonderful; money is abundant, entertainment is personalised to every whim, and charismatic AI companions are the best conversation partners you can imagine. Terminator has not taken over the world. AI Safety concerns have been largely dismissed.
You explore further. Sure, your needs are very well provided for, but you realise that you don’t really have any power to do anything in the world. You can barely form connections with other people, who are busy talking to their chatbots. You don’t have a job - the AI does everything for you. You can’t create anything, since the AI has you beat there too. There might still be “elections”, but AI bureaucracies have become so complicated that governments can’t really change the system. The algorithm decides everything now. You realise this Brave New World is essentially a human zoo.
This is the world that the paper “Gradual Disempowerment” warns us about. It describes how societal systems today mostly work in service of human needs and desires, but may face extreme pressure to automate themselves using AI. The authors argue that societal systems which no longer need humans to function will neglect human needs and desires, leaving humanity unable to shape its own future.
Societal systems today are mostly good
You could think of today’s societal systems as comprising three main parts:
The economy
Culture
Governments or nation-states
These systems are far from perfect, but they are better than what has come before. Fewer humans (as a percentage of the total population) are in poverty or dying from preventable diseases than ever. Most people, especially those living in democracies, have fairly good lives.
Let’s go through each of these systems and explain how humans shape what these systems do.
Economy
The economy represents how we make, buy, and sell things and also how money flows between people, companies, and governments. Having money means you can consume goods and services and thus shape what the economy produces.
In the UK, about 60% of the economic pie (the wealth created, or the money that changes hands) every year goes to workers as wages, while the other 40% goes to people who own physical things or assets (e.g. shareholders, business owners, landlords).
Since workers earn around 60% of national income, they have a lot of power and influence over what gets made in the economy. What they decide to buy shapes what gets produced and sold.
As an example, if you are in a city right now, there are probably 10 coffee shops within a 10-minute walk of your location, because so many workers want their (delicious?) oat latte first thing in the morning. There’s a lot of demand for this product, so people decide to provide it.
Culture
Culture encompasses the media we consume, the customs we follow, and the social norms we live by. It shapes how we think, behave, and relate to one another.
People move towards cultures that serve them. Cultures that don’t adapt or serve their members’ needs simply die out - people leave for better alternatives, or healthier cultures outcompete them. This means people have real power to shape culture: they produce content, challenge norms, and vote with their feet.
Think about Nokia. The company had a rigid culture that couldn’t adapt when smartphones arrived - Apple ate their lunch. The reason companies spend so much time thinking about “culture” is because a good culture is essential to success.
States
The government provides critical functions for society. Without it, healthcare, defence, education and the rule of law would collapse. Even though the government has a huge amount of power, it is constrained by the demands of the people it governs, even in autocracies.
Ultimately, governments need an educated, skilled, and highly motivated workforce to raise sufficient tax revenue. Over 70% of the UK government’s income comes directly from its population, through various kinds of tax.
They also need people for the national security apparatus. Even autocracies need people to staff their security services. Many a revolution has succeeded because, when push came to shove, the military refused to open fire on protestors. Bangladesh’s July 2024 revolution is one such example.
Systems face pressure to be automated by AI
All of these societal systems face enormous pressure to automate.
The economy
Imagine you are the leader of a successful company. You love your staff, but see that all your competitors are slowly automating, and raking in massive profits by doing so. Your shareholders encourage you to do the same.
Now imagine this dynamic across society. Some companies automate solely to maximise profit. Others do so to stay competitive with the automated companies. Even the last holdouts against automation are faced with an economy that moves so fast and is so complicated that they have to offload all their major strategic decisions to AI.
The economy has largely been automated.
Culture
Before the internet, we used to get our information from shared sources. Our local community, newspapers and TV companies largely stuck to the truth. We lived in a shared reality.
Today’s world looks quite different. Information is still produced by people, but what we get exposed to is determined by the opaque algorithms of social media. Echo chambers have become a big problem.
Imagine this trend continuing. Instead of just having algorithmic recommendations, we have AI-produced content, hyper-personalised to our every whim. The production of culture slowly becomes a matter of prompting artificial systems, and soon those systems know enough about us to produce the content we want before we know we want it.
We can easily see how this happens. Mass culture takes time to produce and needs some way of reaching a wide audience. Compare that to artificial content, which can be made instantly and tailored precisely to a person’s needs. It’s hard to see how human-made culture wins out.
Culture ends up being automated too…
States
If you are a dictator, of course you’d want to automate as much of the state apparatus as possible. Less dependence on others means less of a chance that they can overthrow you. We have already seen automated surveillance in China, and the Russia-Ukraine conflict has shown that drones are the future of combat. We should expect these trends to continue.
The really worrying thing is that even in democracies we might see people becoming less relevant. As artificial systems become more and more complex, it will become easier and easier to defer to them for efficient decision making. This is especially so if you are competing against other states that have themselves largely automated. Sure, there might be “elections” every 5 years, but in practice governments are powerless to change the systems that they now defer to in almost every way.
After a while, if almost all of the economic work is done by AIs, and governments stop needing people for tax, then the fate of democracy might become an open question…
AI-automated societal systems are bad for humans
Our societal systems are not good by default. They have evolved over thousands of years to get to the point they are at now. Throughout that time, underlying them has been a single assumption: these systems are designed for and created by people. “Gradual Disempowerment” invites us to consider a world where, slowly, over the course of years, this assumption stops being true.
The outlook is bleak. By default, a fully automated economy means that the value of human labour falls to nothing. If most humans stop being paid wages, then the part of the economy that produces things for people will wither away. Even those humans who do retain financial power can’t in practice steer the economy, as it now moves too quickly for them to understand. In the extreme, we can imagine fully AI companies producing things for AI consumers. Who provides for basic human needs then…
The story is similar for culture. 29% of Americans under 30 say they feel lonely most of the time. Imagine that gap being filled by AI companions. They’re always available, never judge you, and say exactly what you want to hear. Over time, this becomes the default form of friendship and cultural engagement. People’s feeds are full of AI-generated content and their conversations are all with AIs. In this world, human connection falls by the wayside. It’s hard to imagine humans coordinating on anything either; how do you inspire someone to leave their AI bubble to go on a protest? Humans have been utterly disempowered.
Even if this scenario doesn’t happen, there are still ways for humans to be gradually disempowered relative to AIs. If AI is so much faster at producing cultural content, we can imagine being flooded with AI-produced content. It might be slop for now, but quality is rapidly going up. We might merely become hapless consumers of AI culture, or pawns in an AI culture war.
Finally, let’s consider states. As more and more state decisions are offloaded to AI, the state’s bureaucracy might become so complex and fast moving that no person can have a handle on it. Humans stop being able to understand the legal and regulatory systems on their own.
Even worse, the state stops relying on humans economically or for security purposes. It is hard to see how it would be incentivised to care about its citizens in such a scenario. We can already see this dynamic play out in states suffering from the “resource curse.” An oil-rich country like Venezuela has little reason to care about its citizens - most of its revenue comes from oil. Now imagine Venezuela with a fully automated security apparatus. That is the future that “Gradual Disempowerment” is warning us against.
All of these dynamics interact with each other. An automated economy makes it easier for states to ignore the preferences of their citizens. An automated culture helps keep people in line. These dynamics are mutually reinforcing, and if we end up in a world where we have been gradually disempowered, it’s hard to see a way out.
What you can do
Gradual disempowerment is an under-explored problem at the moment, and there is plenty of helpful work that can be done on the issue:
How would we find out if we have been gradually disempowered? Relatedly, how close are we to being disempowered today?
Would having personal AI assistants that act for us help, or make things worse?
Technology that helps us identify AI-generated content, or ways to verify that content was made by humans
Investigations into how countries with natural resources were able to avoid the resource curse and instead build resilient democracies (e.g. Norway).
These are just some ideas! There is plenty you can be doing if you are worried about this issue, and we’re excited to see what you come up with on the course.



