I’ve recently taken an interest in the concept of the technological “Singularity”: the acceleration of economic growth and social change brought about by escalating technological progress, and the potential for extreme growth (thousands of times faster than what exists now) in the future. People sometimes use “exponential” as a synonym for fast growth, but the reality is that (a) exponential curves do not always grow fast, and (b) economic growth to this point has actually been faster than exponential.
Life is estimated to be nearly 4 billion years old, but sexual reproduction and multicellular life are only about a billion years old. In other words, for most of its time in existence, life was relatively primitive, and growth itself was slow. Organisms could reproduce quickly, but they died just as fast, and the overall change was minimal. This held until the Cambrian Explosion, about 530 million years ago, when change accelerated; evolution has been speeding up ever since. If we measure “growth” in terms such as energy capture, energy efficiency, and neural complexity, we see that biological evolution has a faster-than-exponential “hockey stick” pattern: very slow for a long time, with the rate then speeding up.
One might model pre-Cambrian life’s growth rate at below 0.0000001% per year (all of these numbers are rough estimates), but by the age of animals it was closer to 0.000001% per year, or a doubling (of neural sophistication) every 70 million years or so, and several times faster than that in the primate era. Late in the age of animals, creatures such as birds and mammals could adapt rapidly, taking appreciably different forms in a mere few hundred thousand years. With the advent of tools and especially language (which affected assortative mating and created culture), the growth rate, now factoring in culture and organization as well as evolutionary change, skyrocketed to a blazing 0.00001% per year in the age of hominids. Then came modern humans.
Data on the economic growth of human society paint a similar picture: accelerating, faster-than-exponential growth. Neolithic humans plodded along at about 0.0004% per year (still an order of magnitude faster than evolutionary change), and with the emergence of agriculture around 10000 B.C.E., that rate sped up, again, to 0.006% per year. This fostered the growth of urban, literate civilization (around 3000 B.C.E.), which boosted the growth rate to a whopping 0.1% per year, the prevailing economic growth rate for the world up until the Renaissance (1400 C.E.).
This level of growth (a doubling every 700 years) is rapid by the standards of most of the Earth’s history. It’s so obscenely fast that many animal and plant species have, unfortunately, been unable to adapt. They’re gone forever, and there’s a credible risk that we do ourselves in as well (although I find that unlikely). Agricultural humans expanded their range by miles per year and increased the earth’s carrying capacity by orders of magnitude. Despite this progress, such a rate would have been invisible to the people living in that 4,400-year span. No one had the global picture, and human lives aren’t long enough for anyone to have seen the underlying trend of progress beneath the much more severe, local ups and downs. Tribes wiped each other out. Empires rose and fell. Religions were born, died, and were forgotten. Civilizations that grew too fast attracted enemies: China, for example, would likely have undergone the Industrial Revolution in the 13th century had it not been vulnerable to Mongol invasion. Finally, the economic growth of this era was often absorbed entirely (and then some) by population growth. A convincing case can be made that the average person’s quality of life changed very little from 10000 B.C.E. to 1800 C.E., when economic growth began (for the first time) to outpace population growth.
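To make these rates concrete, here is a minimal sketch (in Python; the rates are the rough estimates quoted above, not measured data) that converts an annual growth rate into a doubling time via t = ln(2) / ln(1 + r):

```python
import math

def doubling_time_years(annual_rate):
    """Years to double at a constant annual growth rate (e.g. 0.001 = 0.1%/yr)."""
    return math.log(2) / math.log(1 + annual_rate)

# Rough growth-rate estimates quoted in this essay.
eras = {
    "age of animals":     0.00000001,  # 0.000001% per year
    "age of hominids":    0.0000001,   # 0.00001% per year
    "neolithic":          0.000004,    # 0.0004% per year
    "early agricultural": 0.00006,     # 0.006% per year
    "urban agricultural": 0.001,       # 0.1% per year
    "industrial (1960s)": 0.057,       # 5.7% per year
}

for era, rate in eras.items():
    print(f"{era:>20}: doubles every {doubling_time_years(rate):,.0f} years")
```

Running this reproduces the figures above: roughly 70 million years per doubling in the age of animals, about 700 years at the urban agricultural rate, and about 12.5 years (150 months) at the 1960s peak.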
In the 15th to 17th centuries, growth accelerated to about 0.3 percent per year: triple the baseline agricultural rate. In the 18th century, with the early stages of the Industrial Revolution, the Age of Reason, and the advent of rational government (as seen in the American experiment and the French Revolution), it reached 0.8 percent per year. By this point, progress was visible. Whether such advancement was desirable has never been without controversy, but by the 18th century, that it was occurring was beyond question. At that rate of progress, one would see the gross world product double within a long human life.
Even Malthus, the archetypal pessimist among futurists, observed progress in 1798, but he made the mistake of assuming agrarian productivity to be a linear function of time, while correctly observing population growth to be exponential. In fact, economic growth has always been exponential: it was just a very slow exponential (at that time, about 1% per year) that looked linear. His insight, that population growth would outpace food production capacity and lead to disaster, would have been correct had the Industrial Revolution (then in its infancy) not accelerated; Malthusian catastrophes are very common in history. The gross world product increased more than six-fold in the 19th century, rising at a rate of 1.8 percent per year. Over the 20th, it continued to accelerate, with economic growth at its highest in the 1960s, at 5.7 percent per year, or a doubling every 150 months. We’re now a society that describes lower-than-average but positive growth as a “recession”.
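Malthus’s core argument can be reproduced in a few lines. The sketch below (with constants that are purely illustrative; Malthus offered no such calibration) pits a linearly growing food supply against an exponentially growing population. The exponential curve always wins eventually, unless the production curve itself bends upward:

```python
# Illustrative Malthusian model: linear food supply vs. exponential population.
# All constants are invented for illustration.

food, food_growth = 150.0, 2.0   # units of food; +2 units/year (linear)
population, pop_rate = 100.0, 0.01  # people needing 1 unit each; 1%/yr (exponential)

year = 0
while population <= food:
    year += 1
    population *= 1 + pop_rate
    food += food_growth

print(f"Population overtakes food supply in year {year}.")
```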
In that sense, we’re also “in decline”. We’ve stopped growing at anything near our 1960s peak rate. We’re now plodding along at about 4.2 percent per year, if the last three decades are any indication. Most countries in the developed world would be happy to grow at half that rate.
The above numbers, and the rapid increase in the growth rate itself, are the data behind the concept of “the Singularity”. Exponential growth emerges as a consequence of the differential equation dy/dt = a * y, whose solution is the exponential function y = y0 * e^(a*t). Logistic growth is derived from the related equation dy/dt = a * y * (1 – y/L), where L is an upper limit or “carrying capacity”. Such limits always exist, but with regard to economic growth, I think the limit is very far away: far enough that we can ignore it for now. However, what we’ve observed is much faster than exponential growth, since the growth rate itself seems to be accelerating (also at a faster-than-exponential rate). So what is the correct way to model it?
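For intuition, here’s a small sketch (constants invented for illustration) that integrates both equations with Euler steps and shows the logistic curve leveling off at its carrying capacity while the exponential curve runs away:

```python
# Exponential vs. logistic growth, integrated with simple Euler steps.
# a, L, dt, and the step count are illustrative, not calibrated to anything.

a, L, dt, steps = 0.05, 1000.0, 0.1, 2000
y_exp = y_log = 1.0

for _ in range(steps):
    y_exp += a * y_exp * dt                    # dy/dt = a*y
    y_log += a * y_log * (1 - y_log / L) * dt  # dy/dt = a*y*(1 - y/L)

# After t = 200 time units, the exponential curve has run far past the
# carrying capacity, while the logistic curve has leveled off near L.
print(f"exponential: {y_exp:,.0f}   logistic: {y_log:,.1f}")
```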
One class of models for such a phenomenon is derived from the power-law differential equation dy/dt = a * y^(1+b), where b > 0. Its solution has the form y = C * (D – t)^(-1/b), with the result that as t -> D, growth becomes infinite. Hence the name “Singularity”. No one actually believes that economic progress will become literally infinite; rather, D marks a date by which it is assumed we will land comfortably in a post-scarcity, indefinite-lifespan existence. These two concepts are intimately connected, and I would consider them identical. Time is the only scarce resource in the life of a person middle-class or higher, and it is extremely scarce as long as our lifespans are so short compared to the complexity of the modern world (a person only gets to have one or two careers). Moreover, if people live “forever” (by which I mean millions of years, if they wish), there will be an easy response to not being able to afford something: wait until you can. There will still be differences in status among post-scarcity people (some being at the end of a five-year waiting list for lunar tourism, and the richest paying a premium for the prestige of having human servants), and some will probably care deeply about them, but on the whole, I think these differences will be trivial, and people will (over time) develop an immunity to the emotional problems of extreme abundance.
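A quick numerical check of the finite-time blow-up, with constants again invented for illustration:

```python
# Finite-time blow-up in the power-law model dy/dt = a * y**(1+b).
# Closed form: y(t) = C * (D - t)**(-1/b). Constants below are illustrative.

a, b, y0 = 0.01, 0.5, 1.0
D = 1.0 / (a * b * y0**b)   # blow-up date implied by y(0) = y0
C = (a * b) ** (-1.0 / b)

for frac in (0.5, 0.9, 0.99, 0.999):
    t = frac * D
    y = C * (D - t) ** (-1.0 / b)
    print(f"t = {t:8.1f} ({frac:.1%} of the way to D): y = {y:,.1f}")
```

With these constants D = 200, and y climbs from 4 at the halfway point to a million at 99.9% of the way there: growth without bound as t approaches D.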
I should note that there are also dystopian Singularity possibilities, as in The Matrix, in which machines become sentient and overthrow humans. I find this extremely far-fetched, because most artificial intelligence (to date) is still human intelligence applied to difficult statistical problems. We use machines to do things that we’re bad at, like multiplying huge matrices in fractions of a second and analyzing game trees at 40-ply depth. I don’t see machines becoming “like us”, because we’ll never have a need for them to be so. We’ll replicate the functionality we want in order to solve menial tasks (with an increasingly sophisticated category of tasks considered “menial”), but we won’t replicate the difficult behaviors and needs of humans. I don’t think we’ll fall into the trap of creating a “strong AI” that overthrows us. It’s sad to say, but over the millennia we’ve been quite skilled at dehumanizing humans (slavery) in the attempt to make ideal workers. The upshot is that we’re unlikely to go to the other extreme and attempt to humanize machines. We’ll make them extremely good at performing our grunt work and leave the “human” stuff to ourselves.
Also, I don’t think a “Singularity” (in the sense of infinite growth) is likely, because I don’t think the model that produces a singularity is correct. I do think that economic and technical growth are accelerating, and that we may see a post-scarcity, age-less world as early as 2100. That said, the data show deceleration over the past 50 years (from 5-6 percent to 3-4 percent annual growth), so rather than rocketing toward such a world, we seem to be coasting. I would be willing to call the past 40 years, in the developed world, an era of malaise and cultural decline: the Great Discouragement, culminating in a decade (the 2000s) of severe sociological contraction despite economic growth in its middle years, ending with a nightmare recession. What’s going on?
Roughly speaking, I think we can examine, and classify, historical periods by their growth rates, like so (a short code sketch after the list makes the thresholds explicit):
- Evolutionary (below 0.0001% per year): 3.6 billion to 1 million BCE. Modern humans not yet on the scene.
- Pre-Holocene (0.0001% to 0.01% per year): 1 million to 10,000 BCE.
- Agrarian (0.01 to 1.0% per year): 10,000 BCE to 1800 CE. Most of written human history occurred during this time. Growth was slower than population increase, hence frequent Malthusian conflict. Most labor was coerced.
- Industrial (1.0 to 10.0% per year): 1800 CE to Present. Following the advent of rational government, increasing scientific literacy, and the curtailment of religious authority, production processes could be measured and improved at rapid rates. Coercive slavery was replaced by semi-coercive wage labor.
- Technological (10.0 to 100.0+% per year): Future. This rate of growth has never been observed in the world economy as a whole, but we’re already seeing it in technology (Moore’s Law, the cost of genome sequencing, data growth, scientific advances). We’re entering a time when things once the domain of wizardry (read: impossible), such as reading other people’s dreams, can now be done. In the technological world, labor will be non-coercive, because the labor of highly motivated people is going to be worth 10 to 100 times more than that of poorly motivated people.
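Here’s the promised sketch. The boundaries are this essay’s rough estimates, not established economic categories:

```python
# Map an annual growth rate to one of the era labels above.
# Thresholds are the essay's own rough estimates.

ERAS = [  # (upper bound on annual growth rate, era label)
    (0.000001, "Evolutionary"),   # below 0.0001% per year
    (0.0001,   "Pre-Holocene"),   # 0.0001% to 0.01% per year
    (0.01,     "Agrarian"),       # 0.01% to 1.0% per year
    (0.10,     "Industrial"),     # 1.0% to 10.0% per year
]

def classify(annual_rate):
    """Map an annual growth rate (e.g. 0.042 = 4.2%) to an era label."""
    for upper_bound, label in ERAS:
        if annual_rate < upper_bound:
            return label
    return "Technological"        # 10.0%+ per year

print(classify(0.001))   # urban agricultural world: "Agrarian"
print(classify(0.042))   # recent decades: "Industrial"
print(classify(0.30))    # Moore's-Law-like rates: "Technological"
```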
Each of these ages has a certain mentality that prospers in it, and that characterizes successful leadership in such a time. In the agrarian era, the world was approximately zero-sum, and the only way for a person to become rich was to enslave others and capture their labor, or to kill them and take their resources. In the early industrial era, growth became real, but not fast enough to accommodate people’s material ambitions, creating a sense of continuing necessity for hierarchy, intimidation, and injustice in the working world. In a truly technological era (which we have not yet entered), the work will be so meaningful and rewarding (materially and subjectively) that such control structures won’t be necessary.
In essence, these economic eras diverge radically in their attitudes toward work. Agrarian-era leaders could only grow rich by controlling more people; kings and warlords were assessed on the size of their armies, chattel, and harems. Industrial-era leaders focused on improving mechanical processes and gaining control of capital. They ended slavery in favor of a freer arrangement, and workplace conditions improved somewhat, but remained coarse. Technological-era leadership doesn’t exist yet, in most of the world, but its focus seems to be the deployment of human creativity to solve novel problems. In the technological world, a motivated and happy worker isn’t 25 or 50 percent more productive than an average one, but 10 times as effective. As one era evolves into the next, the leadership of the old one proves extremely ineffective.
The clergy and kings of antiquity were quite effective rulers in a world where almost no one could afford books, land was the most important form of wealth, and people needed a literate, historically aware authority to direct what was done with it. Those in authority had a deep understanding of the limitations of the world and of the slow rate of human progress: much slower than population growth. They knew that life was pretty close to a zero-sum struggle, and much of religion reflects humanity’s attempts to come to terms with that nasty reality. These leaders also knew, in a macabre way, how to handle such a world: control reproduction, gain dominion over land through force, use religion to shape the culture and justify land “ownership”, and curtail population growth through small-scale massacres called “wars” rather than suffer famines or revolutions.
People like Johannes Gutenberg, Martin Luther, John Locke, Adam Smith, and Voltaire, who came late in the agrarian era, changed all that. Books became affordable to middle-class Europeans, and the Reformation followed within a century. This culminated in the philosophical movement known as the Enlightenment, in which Europe and North America disavowed rule based on “divine right” or heredity and began applying principles of science and philosophy to all areas of life. By 1750, the clerics and landlords of the agrarian era had become terrible leaders. They didn’t know the first thing about the industrial world that was appearing right in front of them. Over the next couple hundred years, they were either violently overthrown (as in France) or allowed to decline gracefully out of influence (as in England).
The best political, economic, and scientific minds of that time could see a world growing at industrial rates unheard of until then. The landowning dinosaurs of the agrarian era died out or lost power. This was not always an attractive picture, of course. One of the foremost conflicts between an industrial and an agrarian society was the American Civil War, an extremely traumatic conflict for both sides. Then there were the nightmarish World Wars of the early 20th century, which established that industrial societies can still be immensely barbaric. That said, the mentalities underlying those wars were not novel; it wasn’t the industrial era that caused them so much as pre-industrial mentalities combining with industrial power, with very dangerous results.
For example, before Nazism inflamed it, racism in Germany was (although hideous) not unusual by European or world standards of that or any earlier time. In fact, it was a normal attitude in England, the United States, Japan, and probably all of the other nation-states forming around that period. Racism, though I would argue it to be objectively immoral in any era, was a natural byproduct of a world whose leaders had found it necessary, for millennia, to justify the dispossession, enslavement, and massacre of strangers. What the 1940s taught us, in an extreme way, is that this hangover from pre-industrial humanity, an execrable pocket of non-Reason that had persisted into industrial times, could not be accepted.
The First Enlightenment began when leading philosophers and statesmen realized that industrial rates of growth were possible in a still mostly agrarian world, and they began working toward the sort of world in which science and reason could reign. We now have an industrial economy, but our world is still philosophically, culturally, and rationally illiterate, even in its leading ranks. Still, we live at the beginning of what might be (although it is too early to tell) a “Second Enlightenment”. We have an increasing number of technological thinkers in science and academia. We see such thinking on forums like Hacker News, Quora, and some corners of Reddit. It’s “nerd culture”. By and large, however, the world is still run by industrial minds (and the mentality underlying American religious conservatism is distinctly pre-industrial). This is the malaise that top computer programmers face in their day jobs. They have the talent and inclination to turn $1.00 into $2.00 by working on difficult, “sexy” problems (such as machine learning, bioinformatics, and the sociological problems addressed by many startups), but they work for companies and managers that have spent decades perfecting the boring, reliable processes that turn $1.00 into $1.04. I would guess that 90% of our best technical minds are engaged in that kind of work: boring business bullshit instead of the high-potential R&D that could actually change the world. The corporate world still thinks in industrial (not technological) terms, and it always will. It’s an industrial-era institution, as much as baronies and totalitarian religion are agrarian-era beasts.
Modern “nerd culture” began in the late 1940s, when the U.S. government and various corporations began funding basic research and ambitious engineering and scientific projects. This produced immense prosperity, rapid growth, and an era of optimism and peace. It enabled us to land a man on the moon in 1969. (We haven’t been back since 1972.) It built Silicon Valley. It looked like the transition from industrial to technological society (with 10+ percent annual economic growth) was underway. An American in 1969 might have believed the Second Enlightenment had arrived, with the Civil Rights Act, enormous government funding for scientific research, and a society whose leaders were, by and large, focused on ending poverty.
Then… something happened. We forgot where we came from. We took for granted the great infrastructure a previous generation had built, and let it decay. As the memory of the Gilded Age (brought to us by a parasitic elite) and the Great Depression faded, elitism became sexy again. Woodstock, Civil Rights, NASA, and “the rising tide that lifts all boats” gave way to Studio 54 and the Reagan Era. Basic research was cut for its lack of short-term profit, and because the “take charge” executives (read: demented simians) who raided their companies couldn’t understand what those people did all day. (They talk about math over their two-hour lunches? They can’t be doing anything important! Fire ‘em all!) Academia melted down entirely, with tenure-track jobs becoming very scarce. America lost its collective vision. The 2001 vision of flying cars and robot maids for all was replaced with a shallow and nihilistic individual one: get as rich as you can, so you have a goddamn lifeboat when this place burns the fuck down.
The United States entered the post-war era as an industrial leader. It rebuilt Europe and Japan after the war, lifted millions out of poverty, made a concerted (if still woefully incomplete) effort to end its own racism, and had enormous technical accomplishments. Yet now it’s in a disgraceful state, with people dying of preventable illnesses because they lack health insurance, and business innovation stagnant except in a few “star cities” with enormous costs of living, where the only things that can get funded are curious but inconsequential sociological experiments. Funding for basic research has collapsed, and the political environment has veered to the far right. Barack Obama, who clearly has a Second Enlightenment mind, if a conservative one within that frame, has done an admirable job of fighting this trend (and he’s accomplished far more than his detractors, on the left and the right, give him credit for), but one man alone cannot hold back the waterfall. The 2008 recession may have been the nadir of the Great Discouragement, or the trough may still be ahead of us. Right now, it’s too early to tell. We’re clearly not out of the mess, however.
How do we escape the Great Discouragement? To put it simply, we need different leadership. If the titans of our world and our time are people who can do no better than to turn $1.00 into $1.04, then we can’t expect more of them. If we let such people dominate our politics, then we’ll have a mediocre world. This is why we need the Second Enlightenment. The First brought us the idea of rational government: authority coming from laws and structure rather than charismatic personalities, heredity, or religious claims. In the developed world, it worked! We don’t have an oppressive government in the United States. (We may have an inefficient one, and we have some very irrational politicians, but the system is shockingly robust when one considers the kinds of charismatic morons who are voted into power on a fairly regular basis.) To the extent that the U.S. government is failing, it’s because the system has been corrupted by the unchecked corporate power that has stepped into the vacuum created by a limited, libertarian government. Solving the nation’s economic and sociological problems, and clearing the cultural residue left by a lack of available, affordable education, will take us a long way toward fixing our political issues.
The Second Enlightenment will focus on a rational economy and a fair society. We need to apply scientific thought and philosophy to these domains, just as we did for politics in the 1700s when we got rid of our kings and vicars. I don’t know what the solution will end up looking like. Neither pure socialism nor pure capitalism will do: the “right answer” is very likely a hybrid of the two. But it is clear to me, at least in outline, what this achievement will require. We’ll have to eliminate the effects of inherited wealth, accumulated social connection, and the extreme and bizarre tyranny of geography in determining a person’s economic fortune. We’ll have to dismantle the current corporate elite outright; no question on that one. Industrial corporations will still exist, just as agrarian institutions do, but the obscene power held by well-connected bureaucrats whose jobs involve no production will have to disappear. Just as we ended the concept of a king’s “divine right” to rule, turning such people into mere figureheads, we’ll have to do the same with corporate “executives” and their similarly baseless claims to leadership.
We had the right ideas in the Age of Reason, and the victories of that time benefit us to this day, but we have to keep fighting to keep the lights on. If we begin to work at this, we might see post-scarcity humanity within a few generations. If we don’t, we risk driving headlong into another dark age.