What Silicon Valley’s ageism means

Computer programming shouldn’t be ageist. After all, it’s a deep discipline with a lot to learn. Peter Norvig says it takes 10 years, but I’d consider that number a minimum for most people. Ten years of high-quality, dedicated practice, to the tune of 5-6 hours per day, 250 days per year, might be enough. For most people, it’s going to take longer, because few people can work only on the interesting problems that constitute dedicated practice. The fundamentals (computer science) alone take a few thousand hours of study, and then there’s the experience of programming itself, which one must do in order to learn how to do it well. Getting code to work is easy. Making it efficient, robust, and legible is hard. Then there’s a panoply of languages, frameworks, and paradigms to learn, absorb, and, for many, reject. As an obstacle, there’s the day-to-day misery of a typical software day job, in which so much time is wasted on politics, meetings, and pointless projects that an average engineer is lucky to have 5 hours per week for learning and growth. Ten years might be the ideal; I’d bet that 20 years is typical for the people who actually become great engineers, and, sadly, the vast majority of professional programmers never get anywhere close.
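The back-of-the-envelope arithmetic behind that ten-year figure is easy to check; here’s a quick sketch, using the hour counts assumed in the paragraph above (5-6 hours per day, 250 days per year):

```python
# Rough check of the "ten years of dedicated practice" estimate.
# The hour and day counts are the assumptions from the text, not data.
HOURS_PER_DAY = (5, 6)   # low and high daily estimates
DAYS_PER_YEAR = 250
YEARS = 10

low, high = (h * DAYS_PER_YEAR * YEARS for h in HOURS_PER_DAY)
print(f"{low:,} to {high:,} hours of dedicated practice")
# prints "12,500 to 15,000 hours of dedicated practice"
```

That lands comfortably past the oft-cited 10,000 hours, which is the point: ten years under these assumptions is a floor, not a guarantee.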

It takes a long time to become genuinely good at software engineering. The precocious are outliers. More typically, people seem to peak after 40, as in other high-skill disciplines. It seems, then, that most of the age-related decline in this field is externally enforced. Age discrimination is not an artifact of declining ability, but of changing perceptions.

It doesn’t make sense, but there it is.

Age discrimination has absolutely no place in technology. Yet it exists. After age 40, engineers find it increasingly difficult to get appropriate jobs. Startups are, in theory, supposed to “trade against” the inefficiencies and moral failures of other companies, but on this issue, the venture capital (VC) funded startups are the biggest source of the problem. Youth and inexperience have become virtues, while older people who push back against dysfunction (and outright exploitation) are branded “resistant to change”.

There’s another issue that isn’t derived from explicit ageism, but might as well be. Because our colonizers (mainstream business culture) are superficial, they’ve turned programming into a celebrity economy. A programmer has two jobs. In addition to the work itself, which is highly technical and requires continual investment and learning, there’s a full-time reputation-management workload. If a machine learning engineer works at a startup and spends most of his time in operations, he’s at risk of being branded “an ops guy”, and may struggle to get high-quality projects in his specialty from that point on. He hasn’t actually lost anything (in fact, he’s become far more valuable) but the superficial, nontechnical idiots who evaluate us will view him as “rusty” in his specialty and, at the least, exploit his lack of leverage. All because he spent 2 years doing operations, because it needed to be done!

As we get older and more specialized, the employment minefield becomes only more complicated. We are more highly paid at that point, but not by enough of a margin to offset the increasing professional difficulties. Executives cite the complexity of high-end job searches when demanding high salaries and years-long severances. Programmers who are any good face the same, but get none of those protections. I would, in fact, say that any programmer who is at all good needs a private agent, just as actors do. The reputation management component of this career, which is supposed to be about technology and work and making the world better, but is actually about appeasing the nontechnical, drooling patron class, constitutes a full-time job that requires a specialist. Either we need unions, or we need an agent model like Hollywood, or perhaps we need both. That’s another essay, though.

The hypocrisy of the technology employer

Forty years ago, smart people left finance and the mainstream corporate ladder for technology, to join the emerging R&D-driven guild culture that computing had at the time. Companies like Hewlett-Packard were legitimately progressive in how they treated their talent, and were rewarded for it with their employees’ commitment to making great products. In that era, Silicon Valley represented, for the most technically adept people in the middle class, a genuine middle path. The middle path will require its own essay, but what I’m talking about here is a moderate alternative between the extremes of subordination and revolution. Back then, Silicon Valley was the middle path that, four decades later, it is eagerly closing.

Technology is no longer managed by “geeks” who love the work and what it can do, but by the worst kinds of business people who’ve come in to take advantage of said geeks. Upper management in the software industry is, in most cases, far more unethical and brazen than anywhere else. To them, a concentration of talented people who don’t have the inclination or cultural memory that would lead them to fight for themselves (labor unions, agent models, ruthlessness of their own) is an immense resource. Consequently, some of the most disgusting HR practices (e.g. stack ranking, blatant sexism) can be found in the technology industry.

There’s one particularly damaging habit of software employers that, I think, has hurt the industry immensely: they demand specialties when vetting people for jobs. General intelligence and a proven ability to code aren’t enough; one has to have “production experience” in a wide array of technologies invented in the past five years. For all their faults, the previous regime of long-lasting corporations was not so bigoted about past experience; it trusted people to learn on the job, as needed. The new regime has no time for training or long-term investment, because all of these companies have been built to be flipped to a greater fool.

In spite of their bigoted insistence on pre-existing specialties in hiring, these companies refuse to respect specialties once people are hired. Individual programmers who attempt to protect their specialties (and, thus, their careers) by refusing assignment to out-of-band or inferior grunt work are quickly fired. This is fundamentally hypocritical. In hiring, software companies refuse to look twice at someone without a yellow brick road of in-specialty accomplishments of increasing scope; yet, once employees are inside and fairly captive (due to the pernicious stigma against changing jobs quickly, even with good reason), they will gladly disregard that specialty, for any reason or no reason. Usually, this is framed as a business need (“we need you to work on this”) but it’s more often political, and sometimes personal. Moving talent out of its specialty is a great way for insecure middle managers to neutralize overperformance threats. In a way, employers are like the pervert who chases far-too-young sexual partners (if “partner” is the right word here) for their innocence, simply to experience the thrill of destroying it. They want people unspoiled by the mediocrity and negativity of the corporate world, because they want to inflict the spoilage. The virginity of a fresh, not-yet-cynical graduate from a prestigious university is something they want all for themselves.

The depression factor

I’m not going to get personal here, but I’m bipolar so when I use words like “depression” and “hypomania” and “anxiety” I do, in fact, know what the fuck I am talking about.

One side effect of corporate capitalism, as I see it, is that it has created a silent epidemic of middle-aged depression. The going assumption in technology, that mental ability declines after age 25, is not well supported, and it is in fact contrary to what most cultures believe about intelligence and age. (In truth, various aspects of cognition peak at different ages, from language acquisition at age 5 to writing ability around 65, and there’s so much individual variation that there’s no clear “peak” age.) For general, holistic intelligence, there’s no evidence of an age-bound peak in healthy people. While this risks sounding like a “No True Scotsman” claim, what I mean is that every meaningful age-related decline in cognition can be traced to a physical health problem, not to aging itself. Cardiovascular problems, physical pain, and the side effects of medication can all impair cognition. I’m going to talk about the Big One, though, and that’s depression. Depression can cause cognitive decline. Most of that loss is reversible, but only if the person recovers and, in many cases, they never do.

In this case, I’m not talking about severe depression, the kind that would have a person considering electroconvulsive therapy or on suicide watch. I’m talking about mild depression that, depending on time of diagnosis, might be considered subclinical. People experiencing it in middle age are, one presumes, liable to attribute it to getting older rather than to a real health problem. Given that middle-aged “invisibility” in youth-obsessed careers is, in truth, unjust and depressing, it seems likely that more than a few people would experience depression and fail to perceive it as a health issue. That’s one danger of depression that those who’ve never experienced it might not realize exists: when you’re depressed, you suffer from the delusion that (a) you’ve always been depressed, and (b) no other outlook or mood makes sense. Depression is delusional, and it is a genuine illness, but it’s also dangerously self-consistent.

Contrary to the stereotype, people with depression aren’t always unhappy. In fact, people with mild depression can be happy quite often. It’s just easier to make them unhappy. Things that don’t faze normal people, like traffic jams or gloomy weather or long queues at the grocery store, are more likely to bother them. For some, there’s a constant but low-grade gloom and a tendency to avoid making decisions. Others might experience 23 hours and 50 minutes per day of normal mood and 10 minutes of intense, debilitating sadness: the kind that would force them to pull over to the side of the road and cry. There isn’t a template and, just as a variety of disparate diseases (some viral, some bacterial, and some behavioral) were once called “fevers”, I suspect “depression” is a cluster of about 20 different diseases that we just don’t have the tools to separate. Some depressions come without external cause. Others are clearly induced by environmental stresses. Some depressions impair cognition and probably constitute a (temporary) 30-IQ-point loss. Others (more commonly seen in artists than in technology workers) seem to induce no intellectual impairment at all; the person is miserable, but as sharp as ever.

Corporate workers do become less sharp, on average, with age. You don’t see that effect, at least not involuntarily, in most intellectually intense fields. A 45-year-old artist or author or chess master has his best work ahead of him. True entrepreneurs (not dipshits who raise VC based on connections) also seem to peak in their 50s and, for some, even later. Most leaders hit their prime around 60. However, it’s observable that something happens in Corporate America that makes people more bitter, more passive, and slower to act over time, and that it starts around 40. Perhaps it’s an inverse of survivor bias, with the more talented people escaping the corporate racket (by becoming consultants, or entrepreneurs) before middle age. I don’t think so, though. There are plenty of highly talented people in their forties and fifties who’ve been in private-sector programming for a long time and just seem out of gas. I don’t blame them for this. With better jobs, I think they’d recover their power surprisingly quickly. I think they have a situationally induced case of mild depression that, while it may not be the life-threatening illness we tend to associate with major depression, takes the edge off their abilities. It doesn’t make them unemployable. It makes them slow and bitter but, unlike aging itself, it’s very easily reversible: change the context.

Most of these slowed-down, middle-aged, private-sector programmers wouldn’t qualify for major depressive disorder. They’re not suicidal, don’t have debilitating panic attacks, and can attribute their losses of ability (however incorrectly) to age. Rather, I think that most of them are mildly but chronically depressed. To an individual, this is more of a deflation than a disability; to society, the costs are enormous, just because such a large number of people are affected, and because it disproportionately affects the most experienced people at a time when, in a healthier economic environment, they’d be in their prime.

The tournament of idiots

No one comes out of university wanting to be a private-sector social climber. There’s no “Office Politics” major. People see themselves as poets, economists, mathematicians, or entrepreneurs. They want to make, build, and do things. To their chagrin, most college graduates find that between them and any real work, there’s at least a decade of political positioning, jockeying for permissions and status, and associated nonsense that’s necessary if one intends to navigate the artificial scarcity of the corporate world.

The truth is that most of the nation’s most prized and powerful institutions (private-sector companies) have lost all purpose for existing. Ideals and missions are for slogans, but the organization’s true purpose is to line the pockets of those ranking high within it. There’s also no role or use for real leadership. Corporate executives are the farthest one gets from true leaders. Most are entrenched rent-seekers. With extreme economic inequality and a culture that worships consumption, it should surprise no one that our “leadership” class is a set of self-dealing parasites at a level that hasn’t been seen in an advanced economy since pre-Revolution France.

Leadership and talent have nothing to do with getting to the top. It’s the same game of backstabbing and political positioning that has been played in kings’ courts for millennia. The difference, in the modern corporation, is that there’s a pretense of meritocracy. People, at least, have to pretend to be working and leading to advance further. The work that is most congruent with social advancement, however, isn’t the creative work that begets innovation. Instead, it’s success in superficial reliability. Before you get permission to be creative, you have to show that you can suffer, and once you’ve won the suffering contest, it’s neither necessary nor worth it to take creative risks. Companies, therefore, pick leaders by loading people with unnecessary busywork that often won’t go anywhere, and putting intense but ultimately counterproductive demands on them. They generate superficial reliability contests. One person will outlast the others, who’ll fall along the way due to unexpected health problems, family emergencies, and other varieties of attrition (for the lucky few, getting better jobs elsewhere). One of the more common failure modes by which people lose this tournament of idiots is mild depression: not enough to have them hospitalized, but enough to pull them out of contention.

The corporate worker’s depression, especially in midlife, isn’t an unexpected side effect of economic growth or displacement or some other agent that might allow Silicon Valley’s leadership to sweep it under the rug of “unintended consequences”. Rather, it’s a primary landscape feature of the senseless competition that organizations create for “leadership” (rent-seeking) positions when they’ve run out of reasons to exist. At that level of decay, there is no meaningful definition of “merit” because the organization itself has turned pointless, and the only sensible way to allocate highly-paid positions is to create a tournament of idiots, in which people psychologically abuse each other (often subtly, in the form of microaggressions) until only a few remain healthy enough to function.

Here we arrive at a phrase I’ve come to dread: corporate culture. Every corporation has a culture, and 99% of them are utterly rotten. Generally, the more a company’s true believers talk about “our culture”, the more fucked-up the place actually is. See, culture is often ugly. Foot binding, infantile genital mutilation (“female circumcision”), war, and animal torture are all aspects of human culture. Previous societies used supernatural appeal to defend inhumane practices; modern corporations use “our culture” itself as a god.

“Culture fit” is often cited to justify the otherwise inconsistent and, sometimes, unjustifiable. Why wasn’t the 55-year-old woman, a better coder than anyone else on the team, hired? “It wouldn’t look right.” Can’t say that! “A seasoned coder who isn’t rich would shatter the illusion that everyone good gets rich.” Less illegal, but far too honest. “She didn’t fit with the culture.” Bingo! Culture can always be used this way by an organization, because it’s a black box of blame, diffusing moral culpability across the group. Blaming an adverse decision on “the team” or “the culture” spares the blamer individual risk, while the culture itself can never be attacked as bad. Most people in most organizations actually know that the “leadership team” (career office politicians, also known as executives) of their firm is toxic and incompetent. When they aren’t around, the executives are attacked. But it’s rare that anyone attacks the culture, because “the culture” is everyone. To indict it is to insult the people. In this way, “the culture” is like an unassailable god.

Full circle

We’ve traveled through some dark territory. The tournament of idiots that organizations construct to fill leadership roles, once they’ve ceased to have a real purpose, causes depression. The ubiquity of such office cultures has created, I argue, a silent epidemic of mild, midlife depression. That epidemic has led venture capitalists (situated at its periphery, their wealth buying them some exit from the vortex of mediocrity in which they must still work, but do not live) and privileged young pseudo-entrepreneurs (terrified of what awaits them when their family connections cool and they must actually work) to conclude that a general cognitive mediocrity awaits in midlife, even though there is no evidence to support this belief, and plenty of evidence from outside of corporate purgatory to contradict it.

What does all of this say? Personally, I think that, to the extent that large groups of individuals and organizations can collectively “know” things, the contemporary corporate world devalues experience because it knows that the experience it provides is of low value. It refuses to eat its own dogfood, knowing that it’s poisoned.

For example, software is sufficiently technical and complex that great engineers are invariably experienced ones. The reverse isn’t true. Much corporate experience is of negative value, at least if one includes the emotional side effects that can lead to depression. Median-case private-sector technology work isn’t sufficiently valuable to overcome the disadvantages associated with age, which is another way of saying that the labor market considers average-case corporate experience to have negative value. I’m not sure that I disagree. Do I think it’s right to write people off because of their age? Absolutely not. Do I agree with the labor market’s assessment that most corporate work rots the brain? Well, the answer is mostly “yes”. The corporate world turns smart people, over the years, into stupid ones. If I’m right that the cause of this is midlife depression, there’s good news. Much of that “brain rot” is contextual and reversible.

How do we fix this?

Biologists and gerontologists seeking insight into longevity have studied the genetics and diet of long-living groups of people, such as the Sardinians and the people of Okinawa. Luckily for us, midlife cognitive decline isn’t a landscape feature of most technical or creative fields. (In fact, it’s probably not present in ours; it’s just perceived that way.) There are plenty of places to look for higher cognitive longevity, because few industries are as toxic as the contemporary software industry. When there is an R&D flavor to the work, and when people have basic autonomy, people tend to peak around 50, and sometimes later. Of course, there’s a lot of individual variation, and some people voluntarily slow down before that age, in order to attend to health, family, spiritual, or personal concerns. The key word there is voluntary.

Modeling and professional athletics (in which there are physical reasons for decline) aside, a career in which people tend to peak early, or have an early peak forced upon them, is likely to be a toxic one that enervates them. Silicon Valley’s being a young man’s game (and the current incarnation of it, focused on VC-funded light tech, is exactly that) simply indicates that it’s so destructive to the players that only the hard-core psychopaths can survive in it for more than 10 years. It’s not a healthy place to spend a long period of time and develop an expertise. As discussed above, it will happily consume expertise but produces almost none of value (hence, its attraction to those with pre-existing specialties, despite failing to respect specialties once the employee is captive). This means, as we already see, that technical excellence will fall by the wayside, and positive-sum technological ambition will give way to the zero-sum personal ambitions of the major players.

We can’t fix the current system, in which the leading venture capitalists are striving for “exits” (greater fools). That economy has evolved from being a technology industry with some need for marketing, to a marketing industry with some need (and that bit declining) for technology. We can’t bring it back, because its entrenched players are too comfortable with it being the way it is. While the current approach provides mediocre returns on investment, hence the underperformance of the VC asset class, the king-making and career-altering power that it affords the venture capitalist allows him to capture all kinds of benefits on the side, ranging from financial upside (“2 and 20”) to executive positions for talentless drinking buddies to, during brief bubbly episodes such as this one, “coolness”. They like it the way it is, and it won’t change. Rather than incrementally fix the current VC-funded Valley, it must be replaced outright.

The first step, one might say, is to revive R&D within technology companies. That’s a step in the right direction, but it doesn’t go far enough. Technology should be R&D, full stop. To get there, we need to assert ourselves. Rather than answering to non-technical businessmen, we need to learn how to manage our own affairs. We need to begin valuing the experience on which most R&D progress actually relies, rather than shutting the seasoned and cynical out. And, as a first step in this direction, we need to stop selling each other out to nontechnical management for stupid reasons, including but especially “culture fit”.

Inverted placism: a possible future in which Silicon Valley’s a ghetto

I was having brunch with a couple of friends who are lawyers, and we were talking about desirable and undesirable places to live. Seattle (where I may be moving in early 2015) scored high on every list, and one of the attorneys said something to the effect of, “I’d love to live there, but it’s next to impossible to get a job there.” Getting a law job in Seattle is, apparently, ridiculously difficult. This surprised me, because law is even more pedigree-obsessed than VC-funded technology, and where there’s pedigree obsession, there’s placism. Placism, for law, seemed to favor New York and D.C. to the exclusion of all else. There were some attorneys making lots of money in entertainment law (or as divorce lawyers) in Los Angeles, but it wasn’t prestigious to be in a “secondary” market. That seems to be changing, with locations like Seattle and Austin (desirable places to live, no doubt, but not law hubs) becoming very selective, some more so than New York.

Ten years ago, in large-firm corporate law (“biglaw”), New York was the place where attorneys wanted to stay as long as they could. Even though the pay wasn’t substantially higher (when adjusted for cost of living, it was invariably lower), the prestige was strong and followed a person for life. The best outcome was to make partner in one’s New York firm. Perhaps 1 in 10 was offered the brass ring of partnership. The next set, those who were clearly good but wouldn’t get partnerships, would move to firms in “secondary markets” and become partners out there. It was acceptable to move out to Austin or L.A. or Seattle, in your mid-30s, if Manhattan partnership wasn’t in the cards, but few planned for it. With law so pedigree-obsessed, placism is major, and the going assumption has long been that the best students of the top law schools will invariably end up in New York.

It seems to be changing. More attorneys are considering New York their backup choice, not wanting to put up with the long-hours culture and high rents. It’s no longer considered unusual for top talent to favor other locations, and some of those smaller markets are developing a reputation for being much more selective than New York, the old first choice.

Does anyone care to guess how this might apply to technology?

Silicon Valley isn’t stable

Balaji Srinivasan gave a talk at Y Combinator’s Startup School entitled “Silicon Valley’s Ultimate Exit”. In it, he decried the four traditional urban centers of the United States: New York, Boston, Los Angeles, and Washington, D.C. He named that stretch “the Paper Belt”, a 21st-century analogue of the “Rust Belt”. See, all of those cities are apparently run by dinosaurs. Boston is the academic capital, but MOOCs are rendering in-person education obsolete. D.C. is apparently no longer relevant, under the theory that the decline of nation states (which will occur over the next 200 or so years) might as well be concluded to have already happened. New York? All that media stuff’s being replaced by the Internets. Los Angeles? Well, YouTube and iTunes and Netflix have already disrupticated Hollywood, which might as well be relegated to history’s dustbin as well (except for the fact that someone still has to make the content).

Silicon Valley’s arrogance is irritating and insulting. I’m not exactly lacking when it comes to intellectual ability, and on several occasions, I’ve interviewed for a position in the Valley (for the right job, I’ll work anywhere). On multiple occasions it has happened this way: I knock the code sample out of the park, on one occasion submitting one that was considered among the 3 best submissions. I nail the technical interview. I get the offer… and it’s a junior position because, whatever I accomplished up to this point, I didn’t do it in California. The effect of placism is very real in technology, and it’s strongest in the Valley.

I don’t see this elsewhere. Banks and hedge funds don’t care if you’re from a rural village in China. If you’re smart, they respect it. They have the intellectual firepower to recognize intelligence. What about the Valley? Surely, I’m not saying that the people in the Valley are dumber? Well, it’s not quite that. As individuals, I don’t think there’s a difference. There are A-level intellects everywhere, whether you’re in the middle of Nebraska or in the Valley or on Wall Street. The problem, instead, is that the Valley has a passive-aggressive consensus culture, which means you need to impress several people to get the green light. In New York, it’s typical for an influential person to say, “I like this guy, and those who don’t won’t have to deal with him, but I think he’s fucking brilliant”. In California, that doesn’t happen. This gives intellectual mediocrities (who can, likewise, be found in Valley startups and on Wall Street) a certain show-stopping power (“I don’t think he’s a team player”, “she’s not a culture fit”) that they don’t have, to the same extent, on the East Coast.

For traders and quants, pedigree isn’t all that important. It can get you in the door, but it ceases to matter after that. In the Valley, pedigree matters much more, because recognizing individual excellence challenges the “collaborative culture” and the “laid back” mentality that California is “supposed” to have and, if you can’t bring up a person’s individual firepower, you start defaulting to credentialism and prestige. Not all Stanford grads are geniuses (see: Lucas Duplan). On the East Coast, it’s socially acceptable to say, “He’s fucking stupid and I’m sure his parents bought his way in.” That’s a pretty clear “no hire”. In California, you can’t say that! Instead, in the California culture, you end up having to say something like, “Well, his problem-solving skills aren’t what I expected, and I think he’d be unhappy in a technical role, but I guess we can give him a product position and tap the Stanford network.” To me, that’s a “no hire” but, to many managers, that sounds like a ‘yes’.

I had the reverse of this experience when (as part of my consulting practice) I was hiring an engineer for a startup. He was a 20-year-old college graduate, sharp as fuck and probably a better programmer than I was, but socially inept. I said, “he’s brilliant, but would need some mentoring on the social aspect of the work”. To make it clear, I was being as honest as I could be, and my recommendation was to hire him. Unfortunately, my “he’ll need social mentoring” was taken as a passive-aggressive way of saying “no-hire”, rather than a completely honest acknowledgement that a good candidate had (minor) imperfections. He wouldn’t have been hired anyway, nor would he have liked the place, so it didn’t matter in the end. Still, it shocked me that such a minor note against someone (I said, “he’ll need social mentoring”, not “he’s an incorrigible fuckup”) could be taken so far out of proportion.

The point of this digression is that, because people in the Valley refuse to communicate meaningfully, and because of the consensus-driven culture, the rank-ordering of potential candidates that is actually used is the one already furnished. For younger candidates, that’s derived from educational pedigree. For older ones, it comes down to job titles and companies to some small degree, but much more important is location. Placism rules in the Valley.

There’s nothing stable about that, in my view. Academic institutions have lifelong contracts (tenure) with professors and gigantic capital investments, so universities tend to stay put. (Universities that are too prolific with branch campuses, such as NYU having an Abu Dhabi campus, destroy whatever prestige they might otherwise have.) The seat of the U.S. government is, likewise, unlikely to leave D.C. except in event of an unforeseen catastrophe. Hollywood’s geographical advantage (its proximity to a diverse array of terrain types) is still major, because the cost of travel with a film production team is extremely high. New York? New York won’t lose finance (the exchanges are there) and, even if it did, it would still be New York.

That’s something that the Valley, with its arrogant placism, doesn’t get. Let’s say that New York’s financial industry takes a catastrophic dive. We see apartments once valued at $50 million selling for $4 million, and rents dropping to Midwestern levels. And then? Creative people will move in, rapidly, and restore life to the city. New York isn’t beholden to one industry. It will always be New Fucking York. Unless we see a recurrence of the 1970s general abandonment of cities by the American population (and, in my lifetime, we probably won’t), the worst-case scenario for it is that it becomes like Chicago: an also-ran city that is, in spite of its lack of “Paper Belt” specialty, thriving and an excellent place to live.

New York can lose its status as the prestige center of biglaw. It could even lose Wall Street. (That would be a disaster for New York property owners, but the city itself would be resilient.) Silicon Valley, on the other hand, is fucked if the placism of venture-funded technology inverts. That just might happen, too.

Inversion of placism tends to happen when the young and creative decide that the advantages of living in the “prestigious” place are not worth the disadvantages. The rents are too high, the culture is too elitist, and upward mobility is too low. The progeny of well-connected families still end up in the prestigious place (New York biglaw, Valley technology) but the successes of the next generation head elsewhere. Sometimes, they choose another location; other times, there’s a sense of diaspora for a while. The Valley could easily lose its singularity. It’s not a great place to live (it’s a strip mall) and it’s far too expensive for what it offers. In truth, everything about it is mediocre except, to some extent, the work; but 95 percent of the work is mediocre (who wants to work in operations at IUsedThisToilet.com?) and getting the other 5 percent requires an advanced degree from a top-5 CS department, or elite connections. A few good people have those and will be able to stomach the Valley, but most good people come from no-name schools (not because the no-name schools are better, but because most people come from no-name schools) and don’t know anyone important out there. In 2010, talented and naive 22-year-olds were willing to move out to the Valley and provide cheap, clueless, highly dedicated labor under the wrong assumption that a year at a startup would have them personally introduced to Peter Thiel by the founders. Is that trickery going to work in 2016? I doubt it.

Starting about now, it’s going to become increasingly evident that the talent wants to be outside of Silicon Valley if the same quality of job is available elsewhere. In fact, being in Silicon Valley after 35 will mean one of two things: astronomical success, or dismal mediocrity, with no middle ground. Being in California, at that age, and not being part of the investor community (either as a VC, or as a founder able to raise money on a phone call) will be a mark of shame. If you’re good, you should be able to move to Seattle or Austin, no later than 30, and get the same quality of job. If you’re really good, you can get that kind of job in St. Louis or Nashville. Aside from the outlier successes ($20 million net worth, national reputation) who can make a Silicon Valley residence part of their personal brand, it’ll be the mediocrities who stick around the Valley, still trying to catch a break while ignoring the hints that have been dropped all around them.

By 2020, this will have more of a “diaspora” shape. There won’t be a new tech hub yet. You’ll see talent gravitating toward places like Seattle, Boulder, Portland, Chicago, Austin, Pittsburgh, and Minneapolis, with no clear winner. Millennials are, if not blindly optimistic, attracted to the idea of turning a second-rate city into a first-rate one. By the late 2020s, it will be clear whether (a) new hubs have emerged or (b) technology has become “post-placist”. I’m not going to try to opine on how that will play out. I don’t think anyone can predict it.

Cheap votes

What gives Silicon Valley its current grip on technology? The answer is a concept that seems to recur when aggregation mechanisms such as democratic elections and markets break down: cheap votes.

Electoral voting, statistically, can actually magnify the power of a small number of votes. If there are 101 voters and we model 100 of the votes as coin flips, the power of the 101st vote isn’t a 1-in-101 chance of swaying the election. It’s about 1 in 13, because there’s a 7.96% chance that the 100 votes split exactly evenly (the binomial distribution concentrates probability mass at its center). Likewise, the statistical power of a voting bloc increases as the square of its size, in the same way that the variance of a sum of perfectly correlated identical variables grows as the square of the number of variables, rather than linearly as it does for independent ones. What this means is that a small number of voters, acting as a bloc, can have immense power.
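Both numerical claims here are easy to check. A minimal sketch (the voter counts and the fair-coin model are the ones assumed above; everything else is illustrative):

```python
from math import comb

# Pivotal-vote claim: model 100 other voters as independent fair coin
# flips. The 101st vote decides the election exactly when those 100
# split 50-50.
n = 100
p_tie = comb(n, n // 2) / 2**n
print(f"P(exact 50-50 split among {n} voters) = {p_tie:.4f}")
# -> 0.0796, i.e. about 1 in 13

# Bloc-power claim: k perfectly correlated votes behave like a single
# vote counted k times, so the variance of their summed contribution is
# k^2 * sigma^2, versus k * sigma^2 for k independent votes.
sigma2 = 0.25  # variance of one fair-coin vote (p = 0.5)
for k in (1, 5, 10):
    print(f"k={k:2d}: correlated bloc variance = {k**2 * sigma2:6.2f}, "
          f"independent votes = {k * sigma2:5.2f}")
```

The gap between the two variance columns is the whole point: a bloc of 10 correlated votes carries 100 vote-units of variance, not 10, which is why organized blocs punch far above their headcount.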

Another issue is that many voters don’t really give a damn. Low voter turnout is cited as a negative, but I think it’s a good thing. Uninterested people shouldn’t vote, because all they’ll do is add noise. The ugly side effect of this is that societies generate a pool of cheap votes. Ethical reservations aside, there are plenty of people who care so little about electoral politics that (absent a secret ballot) they’d sell their vote for $100. How much is a vote worth? To the individual, the vote is worth less than $100. But, to many entrenched interests, 500,000 votes (enough to sway a national election) are worth a lot more than $50 million.

When you allow vote-buying, power shifts to those who can bundle cheap votes together. That’s obviously a very bad thing for society. Such people tend, historically, to be deeply associated with society’s criminal elements, and corruption ensues. This is one of the major reasons why the secret ballot is so important. Anonymity and privacy in voting are sacred rights, for sure, but we also want to kill the secondary market for cheap votes. There’s no real harm in someone selling his vote to his grandma for $100, but if we allow vote-buying to take place, we give power to some unelected, vicious people who use the statistics of electoral practices to subvert democracy.

Markets are the voting systems of capital allocation and business formation, designed as a principled plutocracy rather than a democracy. Of course, just as in democracies, there are a lot of cheap votes to go around. Plenty of middle-class people want to park their savings “somewhere” and watch their numbers go up at a reasonable annual rate, but have no interest in dictating how the sausage is made. They don’t know what the best thing to do with their $500,000 life savings is and, to their credit, they admit as much. So they put that money in bonds or index funds and forget about it. Some of that money ends up in high-risk, high-yield (in theory) venture capital funds.

VCs are the cheap vote packagers of a certain 21st-century question: how do we build out the next stage of capitalism, which requires engagement and autonomy within the labor pool itself to a degree that disadvantages giant organizations? The era of large corporations is ending. It’s not like these companies will disappear overnight, or even in 50 years, but we’re seeing a return to small-scale, niche-player capitalism in which a few small companies manage to have outlier success and (if they want it) can become large ones. VC is the process of taking cheap votes (passive capital) and attempting to influence the formation processes of the nation’s most innovative (again, at least in theory) small businesses.

Abstractly, your typical doctor in St. Louis would rather have more small businesses in the Midwest (his children need jobs, and they may not want to move to Mountain View) than in California, and might prefer to see his capital deployed locally. But he has a full-time job and is smart enough to know that he’s not ready to manage that investment actively. So, he parks his money in an “investment vehicle” that has the funds redirected to a bunch of careerists in California who care far more about the prestige of association with news-making businesses (hence, the focus on gigantic exits) than the success of their portfolios. His returns on the investment are mediocre, but his locale is also starved of passive capital, which has all been swept away into the bipolar vortex of Sand Hill Road.

Passive investors don’t care enough to pull their funding. In fact, it’s rarely individual investors whose capital ends up directly in venture capital. Because of protections (which may not be well-structured, but that’s another debate) that prevent middle-class individuals from investing directly in high-risk vehicles, it’s actually money from large pension funds and university endowments (adding another layer of indirection) that tends to end up in VC coffers. With all this indirection, it’s not surprising that passive investors would tacitly accept the current arrangement, which congests Northern California while starving the rest of the country. But is this arrangement stable? I think not. I think that, while it takes time, people eventually wake up. When it happens, San Francisco may still possess its urban charm, but the Valley itself is properly screwed.