Gervais / MacLeod 19: Living in Truth, fighting The Lie

Yahoo recently bought Summly, a startup run by a 17-year-old, for $30 million. Since the product was shut down, it was a “talent acquisition” (or, “acq-hire”) intended to hire the team, making the list price a pure hiring bonus. This move has, predictably, generated a lot of buzz.

Let’s look at the economics of the damn thing. The information that’s coming out seems to indicate that three engineers will join Yahoo, with an 18-month commitment. Prorated over that time, that’s about $6.7 million per engineer per year. Of course, Yahoo hopes that these engineers will stick around for longer than that– perhaps five years, making that price only $2 million per engineer-year. Such numbers are not atypical in the world of acq-hires. Companies routinely pay $5 million per head (40 years of salary, at market rates) just to get a team of validated talent. There are two ways to look at this. The first is to conclude that in-house engineers are getting screwed: if a relationship with an engineer (expected duration: about four years) is worth $5-10 million, doesn’t that mean they’re severely underpaying in-house talent? I think software engineers are underpaid, but on average, we’re not worth $2+ million. Some of us are; most aren’t. The second possibility is that the engineers being hired are just far superior to Yahoo’s in-house talent. I doubt that as well. I’m sure that Yahoo has amazing software engineers making much less than $6 million per year.
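For concreteness, here is the back-of-the-envelope arithmetic behind those figures, as a minimal Python sketch. The inputs (three engineers, an 18-month commitment, an optimistic five-year tenure, and the rough $125,000 market salary implied by “40 years of salary”) are the assumptions stated above, not audited numbers.

```python
# Back-of-the-envelope math behind the figures above. All inputs are the
# article's stated assumptions, not reported facts.
price = 30_000_000          # reported acquisition price, in dollars
engineers = 3               # engineers said to be joining Yahoo
committed_years = 1.5       # the 18-month commitment
hoped_years = 5.0           # the optimistic retention scenario
market_salary = 125_000     # rough market salary implied by "40 years of salary"

per_engineer = price / engineers
print(f"Per engineer-year over the commitment: ${per_engineer / committed_years:,.0f}")  # ~ $6.7M
print(f"Per engineer-year over five years:     ${per_engineer / hoped_years:,.0f}")      # $2.0M
print(f"Years of market salary in a $5M head price: {5_000_000 / market_salary:.0f}")    # 40
```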

What does it say that Yahoo is buying high-school-age engineers at a panic price?

Of course, some people are taking it as a sign that Yahoo doesn’t have internal talent, or can’t get it. That’s offensive, and almost certainly not the case. I’m sure that Yahoo has plenty of capable people. What acq-hires say is not that a company is so bereft of talent that it can only get it from outside, but that a company can’t recognize talent at the bottom. It’s there, but the middle-management filter is so defective that the executives have no clue what they have. This is similar in character to a hoarder. By “hoarder”, I’m not talking about people who keep receipts or silly mementos, but the pathological kind whose living quarters become filthy, dangerous, and borderline uninhabitable, and who require psychiatric help for normal living. A true hoarder has to buy a new coat every winter because the old one, without fail, gets lost in a personal junkyard of useless stuff; he needlessly spends $500 on winter clothing every season because his house is such a pigsty. What he needs is already there, but inaccessible. This is the problem that rank cultures have when it comes to talent. They become so unable to spot talent at the bottom that, even though they have talent “lying around somewhere”, they can’t (or won’t) find what they have. So they have to panic-buy it in this ridiculous bubble climate. What’s this about? It’s about trust.

Yahoo is not getting hoodwinked– at least, not in this. As the executives see it, they’re buying a trusted team. Capable software engineers are worth these “absurd” amounts seen in acq-hires if they are trusted by the organization. Give a good engineer full autonomy over her work, and give her important work, and she’ll deliver millions in value. That probably applies to these guys, but it also applies to Yahoo’s stronger engineers. On the other hand, if you don’t trust her, use her for fourth-quadrant work, and fail to develop that talent, then she’s worth, if you’re lucky, 2 to 5 percent of her peak potential. This is only tolerable in software because 2-5% of a competent engineer’s peak potential still exceeds the market salary.

Trust sparsity

Large rank-culture companies seem to be talent hoarders, with no exaggeration in the use of the word hoarder. They bring smart people in, because the talent is “worth having around”, especially at a wage level that is insignificant to a corporation. They let it go to waste. They build up a formless array of people at the bottom (with more rigid, but pathological and constantly changing, managerial structures to organize and stack-rank them) and, not knowing where the talent is, tend toward prevailing distrust of everyone down there. Clearly some of those nincompoops at the bottom have talent and some don’t, but it’s rarely worth it to sort through the mess in the basement. This becomes a self-fulfilling prophecy. Good engineers don’t want to work in companies that don’t trust them, so they leave; good managers don’t want crappy reports, so they quit. The end result is an all-levels flight of talent from the firm, and with it a loss of trust at every level. Executives start assuming that their reports are all morons (the “bozo bit” defaults to the “on” position) and that the only candidates for decision-making roles are “special people” hired from outside. Workers stop believing that their managers give a damn about their career development. It’s an omnilateral breakdown of trust that is very hard to reverse.

When people stop trusting each other, they become dishonest. No, I’m not saying that a company like Yahoo is full of liars– I doubt that to be the case. However, there are degrees of honesty, and the important ones (e.g. willingness to share bad news and explore difficult realities, as opposed to merely furnishing truthful answers to simple questions) require trusting the other party with the truth. That’s the endgame of trust sparsity. It creates a world in which some degree of dishonesty is not only beneficial, but necessary for one who wishes to survive.

I mentioned, in Part 1, the social currency of credibility that is supposed to come from work performance, but that Sociopaths find other ways to get. They realize that, even if the organization wants to think that social status mirrors contribution and capability exactly, it can be tricked and “merit” can be bought on a black market. Trust sparsity exists in an organization when people tend to distrust the competence and decency of the other players. Employees doubt they’re fairly compensated or well-directed, managers doubt that their reports will get their jobs done and tend to micromanage, and people are afraid to work with other teams, because the default assumption about another person in the company is that he’s an unreliable idiot. The “bozo bit” starts out in the on position (meaning people are, by default, regarded as idiots until proven otherwise).

Trust density is the opposite, in which people are generally assumed to be competent and decent. The “bozo bit” starts out in the off position, and only people who prove to be really bad are distrusted (and in a functioning organization, they’ll be let go). This matter of trust sparsity versus density seems to be a binary property of social groups. Once a company “flips the switch” to trust sparsity, it becomes impossible to get anything done without disproportionate credibility. This turns into a “permission paradox” state where the only way to get a project that would confer credibility is to have it already– unless you want to take the “fake it till you make it” strategy. That’s when MacLeod Sociopaths (who, again, are not always bad people, but are invariably willing to break rules) start to take over. Bad artists borrow, good artists steal.

That’s why “why not?” cultures are superior to “why you?” cultures. MacLeod Sociopaths will just take credibility, no matter what the official culture is. If they can arrogate it silently, they do so. They’ll ask for forgiveness, not permission. The difference between good Sociopaths and bad ones is what they do with that purloined freedom. A “why you?” culture ends up relying on its Sociopaths, who are a difficult crowd to audit.

When you have trust density, honest people are at an advantage in the environment of transparency and collaboration that it generates. Getting real work done is what people recognize. When you have trust sparsity, however, you end up with communication droughts, and it tends to be dishonest manipulators who acquire credibility and push themselves ahead.

Living in Truth, and the Lie

This is the most personal topic in the MacLeod series. The other organizational pathologies I suffer abstractly, as much as anyone else does. Those traits of organizations irritate me, and I find them perniciously inefficient, but they don’t mess up my life. This is an issue that has rocked my career (in good ways, and bad) from time to time. I care a lot about this one. I’m going to talk, for a bit, about living in truth.

When you live in truth, you decide to be consistently honest, and to assume good faith from other people (although you do not take them at their literal word). You live and work as if it were a trust-dense environment, and you don’t try to hide the fact that you’re doing so. You’re honest with your manager, even if he’s not forthright with you. You inform counterparties of the risk inherent in deals you wish to make, even if it’s to your disadvantage. You don’t bullshit, and you don’t tolerate others’ nonsense either. In the classical sense of the word, it’s cynical: live virtuously and honestly, assume basic goodness in others, and oppose dishonesty.

The modern concept (and the name) comes from Vaclav Havel, who championed “anti-political politics”. The idea is, rather than directly opposing an overbearing political force, to live as if one were free. No violence or protest needed. Just do the right thing, anyway. This is a courageous and rare thing to do, for the obvious reason that political authority (especially under Soviet rule) can be terrifying. If one person lives in truth, he gets shot or thrown in jail (as Havel did, being imprisoned for several years). If a million people do it, society changes. The Lie’s only scalable weapon in the face of exposure is further dishonesty and, eventually, it becomes absurd and falls in on itself.

It’s actually much easier for us corporate denizens to live in truth, because we really have little to lose. What might happen? A job might end. That’s the loss of a relationship with someone who didn’t value us in the first place. We might get bad references (hire a lawyer; a well-written C&D will clear that up). Then there’s the “job hopping” stigma. Okay, that’s real, because there are a lot of imbeciles out there who are stuck in the 20th century who’ll throw out your resume for having “too many jobs”, but there are non-imbeciles out there as well. These are all serious consequences, but nothing compared to what real dissidents have faced: prison and death. So what the hell is our excuse? I’m not asking for self-immolation here, but moral courage would be nice. My experience in the corporate world has convinced me that it’s thin on the ground. People prefer the comfort of the Lie over a life in truth.

What does “truth” mean in a corporate context? It means doing the right thing, even if it hurts. It means placing value on personal health and progress, profitability of the business, and cultural integrity. It means taking responsibility for strategy at one’s appropriate scope, rather than using the following-orders defense for failure. It’s never easy, and it’s often punished. A synonym for living in truth is for an employee to be (a word I’ve used before) self-executive.

In a culture of truth, employees are self-executive and it’s assumed that they will be. Trust density dominates. I don’t intend to claim that what I’m discussing is a panacea that magically causes dishonesty to go away, but a self-executive world is one in which the honest can fight back. They have a chance. They’re informed, and it’s worthwhile for them to speak because people in power will actually listen to them. Nothing can change the fact that there are bad actors out there, and good people who work together badly. Even the best organizations will have to deal with that. But a self-executive world is one in which good people can still win.

I’ve talked about truth, and we can agree that it’s good. What does The Lie look like? Well, in typical corporations, the powerful aren’t explicitly dishonest. They’re careful not to say anything on record that is literally untrue. It’s more that they’re so opaque with information that emergent dishonesty is the norm. Valuable information is so guarded that people don’t even know if they’re doing their jobs properly, which makes it easier to mask a termination as “for performance”. Managers can claim that “there isn’t money in the budget” for a raise when that is only correct with the added context, “for you.” There’s a lot of dishonesty that opacity enables.

Corporations like The Lie because it creates an executive in-crowd. A nasty joke has been told, and the target doesn’t even notice. Sociopaths get the joke, and the Clueless butts have no idea what’s happening. MacLeod Losers would get it, but they’ve chosen not to be in the same room. The Lie is also an extremely powerful weapon. If you’re in on the Lie and have some control over its direction, you can use it to take people down. That’s why reputation economies tend to be hacked by the worst sorts of people. The Lie is very good at ruining reputations. That’s how the fucking thing fights back: it reduces the credibility of its opponents with (big surprise) deceptive half-truths, opacity, and outright lies.

In terms of corporate employment, reputation damage– in forms like immediate firings, bad references, and possibly frivolous lawsuits– is all that it has. That should establish it as very weak. Why? The Lie’s counterattack has constant total strength but a variable number of targets. It’s like a fireball spell that, if it hits one target, does enough damage to kill a demigod but, if it hits fifty, barely scratches them. If everyone fights The Lie and The Lie fights back against everyone, its weapon is so diluted as to be impotent. No one will buy into The Lie if it starts smearing more people– especially if they experience getting smeared, which is one way for a person to learn viscerally that The Lie is a lie.

Right now, people are terrified of bad references, short jobs, and public terminations. People don’t “bad mouth” unethical employers for fear of severe career repercussions. Now, I tend to agree that people who air “dirty laundry” (mistakes and embarrassments within normal bounds, that any complex entity will endure, as opposed to real ethical problems) are doing something that they shouldn’t, but some companies and executives are just deeply unethical and deserve to have their secrets blown. Right now, this sort of thing doesn’t happen until it’s far too late– investors were defrauded, employees robbed, and customers left hanging– because no one is willing to risk long-term blacklisting to do something that, while desirable to society, confers little personal yield. The Lie perpetuates itself by making truth scary for the individual who might expose it.

“Stone Soup” and convex dishonesty

Why does The Lie exist? I’m going to tackle a related question: is dishonesty always bad? By dishonesty, I’m not talking about “white lies” or inconsequential politeness or even the semi-formalistic lies (such as never disparaging an ex-employer or boss, instead saying “I was looking for new challenges”) required by decorum, but rather about willful deception of other people with the goal of altering their behavior. In other words, deception means serious sociopath stuff. So, I think it should be obvious that we’re going to fall somewhere between “always wrong” and “most often wrong”. I intend to convey that it’s the latter: it’s most often wrong, but not always. There are situations that require dishonesty.

There’s a parable, probably going back to medieval times, about a village in a deep state of famine. Each villager has a small stock of a single kind of food, but hoards it, never sharing or trading, because the villagers distrust each other. Everyone’s malnourished; they’re probably making bad decisions, and slowly dying.

A pair of outside strangers, also hungry, comes to the village and asks for food. Slammed doors. Nothing. So they camp out in the town square, put a rock in a cauldron with some water, and start boiling it. Curious villagers, from time to time throughout the afternoon, come by to ask what they’re doing. “We’re making Stone Soup, a delicious specialty where we’re from. Would you like some?” Villagers agree to partake, and the travelers suggest that Stone Soup is even better with just a few carrots. Parsley’s good too. Rice. Chicken. Soon enough, the whole village is on it, with each contributor thinking he or she is adding just a little extra to a completed product. Of course, the stone is actually inert: “Stone Soup” is just hot water! So the stone is taken out at the end, and the soup is served to the village. Everyone gets a much healthier meal than they’ve had in months. Victory. The End.

This is a case where there’s pre-existing trust sparsity within the village. They don’t share food, because they don’t believe the others will be fair to them. Instead, each eats only the one food product he or she has, and they’re all malnourished. The travelers, needing to eat, do something dishonest. Asking for food doesn’t work, so they make up a nonexistent delicacy, offer to share, and ask people for ingredients one-by-one. The result is that everyone gets a bowl of real soup. They all benefit, but it’s still dishonest. This isn’t a polite white lie or a “protocol lie” where both parties understand the truth can’t be told. It’s intentional deception with the explicit goal of altering economic behavior: legally, we call that “fraud”. Yet it’s clearly a good thing they did. This is a model case of convex dishonesty.

What’s convex dishonesty? It exists when one party gets commitments from others through dishonest means, in a situation where a small number of commitments will lead to failure but a large number will pay off multiply (convexity). The goal, of course, is to succeed and pay everyone back.

For a less defensible example, let’s say that I have a business strategy that will require investments of $1 million from five people. If all contribute, we’ll net $21 million. I take $1 million for the execution and pay $4 million back to each of the principals. If we don’t get all five commitments, however, everything is lost. It’s very risky to go in as the first, second, third or fourth investor, because you’re betting on the whims of all the others. The fifth investor experiences no risk. A devious way to maximize my chance of success would be to tell each of the five participants that the other four were already committed, implying that there’s no risk. If I pull it off, everyone wins. We all get a payday. However, if one of those players can’t invest, or doesn’t trust me, then we all lose.

Why is that convex? It pertains to the input-output relationship between resources and payoff. In the example above, the payoff function is zero from 0 to 4 commitments, and $21 million at 5. That’s the “hockey stick” graph that is the epitome of convexity. Typically, a convex profile means that a mediocre commitment will result in failure (hosing the investors) while a large one will deliver outsized success. Investors are effectively betting on whether they believe others will commit, and the fraud is in convincing them that the others have.
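To make that shape concrete, here is a minimal Python sketch of the payoff profile from the hypothetical five-investor deal above. The dollar figures ($1 million stakes, $4 million payouts, a $21 million total) are the illustrative numbers from the example, not a real model.

```python
# A sketch of the "hockey stick" payoff in the five-investor example above.
STAKE = 1_000_000      # each investor's contribution
PAYOUT = 4_000_000     # what each investor is promised if the deal closes
TOTAL = 21_000_000     # gross payoff if and only if all five commit

def gross_payoff(commitments: int) -> int:
    """Zero for 0-4 commitments, $21 million only at 5."""
    return TOTAL if commitments >= 5 else 0

def expected_investor_value(p_other_commits: float) -> float:
    """Expected net outcome for one committed investor, if each of the
    other four commits independently with probability p."""
    p_deal_closes = p_other_commits ** 4
    return p_deal_closes * PAYOUT - STAKE

print([gross_payoff(n) for n in range(6)])
# -> [0, 0, 0, 0, 0, 21000000]: the epitome of convexity

for p in (0.5, 0.7, 0.9, 1.0):
    print(f"p = {p:.1f}: expected value = ${expected_investor_value(p):,.0f}")
# Expected value only turns positive when p is above roughly 0.71, i.e. when an
# investor is nearly certain everyone else is in -- which is exactly the belief
# that the fraudulent "the other four are committed" claim supplies.
```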

Stone Soup has a similar profile. The value of the soup is somewhat subjective, but convexity is clearly in play. If one villager puts food in the soup, he gets screwed. He’s giving away some of his food for a “soup” that he could make at home. He’d probably be very angry. If twenty villagers participate, however, they get a meal that they couldn’t have made alone.

With convex dishonesty, you’re typically generating validations (often “social proof”) to create the impression that a project is almost in a desirable state, in order to motivate people to contribute so you can get to that point. You’re selling a “vision” to get commitment before you have any way of knowing whether you’ll gather enough to deliver. I’d imagine that most startup entrepreneurs understand this intuitively: one has employees, investors, customers and press, and all are looking for progress with the other groups to see general “traction”. It might be tempting (and it’s generally quite wrong) to exaggerate one’s success in other departments– for example, to hire people at a low salary with the promise that “Series A funding will be here in two months”, or to mislead investors with inflated customer numbers (don’t do that). It’s just very hard to orchestrate a situation where that heterogeneous collection of needs and resources grows together.

Convex dishonesty isn’t always good. It’s often bad. What’s wrong with it? Well, most scams look like convex dishonesty. For one game-theoretic example, consider the “drop-our-books” prank played in junior high schools across America. The butt of the joke is told that, at a certain time, the whole class is going to engage in some disruptive behavior (such as dropping one’s book on the floor, clapping, or yelling out). If disruption of class is considered “good” (e.g. positive utility in schadenfreude against the teacher) then it’s convex. If one student does it alone, he’s embarrassed and possibly punished, so he loses. If all the students do it, they laugh at the teacher’s expense but can’t all be singled out, so the group wins. Of course, the fraudulent aspect of this is that only one person (the butt of the joke) will actually engage in the behavior. The other students laugh at his expense.

In fact, phenomena that “look like” convex dishonesty can reach extremes of evil. Ostracism is a case of that. I’m not talking about mere individual social rejection, but when a community is persuaded to reject someone entirely. Influential people in that group create the usually-fraudulent perception that no one likes the target, which compels individuals to reject that person because “everyone else” dislikes him. It’s not convex in the typical sense (there’s no clear “payoff function” with a convex shape) but it has similarities insofar as it uses dishonesty to push the community from one Nash equilibrium to a worse (at least for the rejected person) one.

Trust sparsity and convex dishonesty

If we jump back to Stone Soup for a second, we find an impressive moral message. In this contrived (but not uncommon) circumstance, deliberate deception is heroic. Amid the trust sparsity of that village, a convex deception is the only thing that can get them to work together and produce a decent meal.

I contend that these travelers are archetypal MacLeod Sociopaths. Yes, they saved the villagers’ lives. They were certainly decent enough to share the Stone Soup that they created. (A modern executive would take an 80-percent cut of the soup as a “stone fee”, giving the villagers the scraps.) They also did it for selfish reasons: they didn’t want to starve either. One could argue that they were thieves: all they brought to the soup was a worthless stone, but they got to share in the final product. Their fundamental, catalytic function, however, was to make this trust-sparse group of people work together with the lubricant of convex dishonesty: the lie that this Stone Soup had pre-existing value, and just needed a little bit more from each. Whether these outsiders were good-hearted altruists or dishonest egoists (sociopaths?) is beside the point. They were necessary.

That is what trust sparsity is about. Within a trust-sparse corporate environment, to do anything requires a certain dishonesty (also known as “social proof arbitrage”). Trust sparsity means that everyone’s default will be to look at you with the “bozo bit” on, and ignore your input. The first thing you must do– the only thing that’s important– is flip that damn switch using whatever means possible. Until you’ve done that, nothing you achieve will matter. After you’ve flipped your “bozo bit” to the off position, you can get some real work done. But if your “bozo bit” is on, the only thing you’ll be able to do is fourth-quadrant work. Get out of that mess as soon as you can. Fourth-quadrant work will stink up your career if you’re on it for too long.

What exactly am I advocating?

My message might seem muddled at this point. I railed against The Lie, but I just said that people should flip their “bozo bit” to off using “whatever means possible”. It’s actually quite altruistic to do so, because you can’t get real work done till you’ve zeroed that “bit”. Does that mean I’m advocating dishonesty? Possibly so.

When you live in truth and become self-executive in the honest way, you’re taking on a major risk. You’re flipping your own “bozo bit”, and letting it be known that you expect others to do so as well. You’re refusing to be deprived of credibility, and in a visible and above-board way. Often, this means you’re arrogating more autonomy than your manager has. It’s dangerous. It can get you fired. Most people prefer the safer, more subversive route of convex dishonesty. They’re not trying to defraud anyone, though; they fully intend to pay the villagers back.

How do The Lie, and convex dishonesty, interact with the traditional MacLeod tiers? MacLeod Losers live with The Lie. It becomes an annoying landscape feature, rather than a moral calamity, to them. Clueless tend not to know that it is a lie. They’re the “useful idiots”. Most of the MacLeod Sociopaths have, however, risen to a level (just past the Effort Thermocline) where they’re cognizant of The Lie. It’s like a hedge maze whose structure is evident from above, but befuddling and illegible from the inside. Sociopaths, with a reaper’s-eye view, learn how to use The Lie.

Where are the people who oppose The Lie and live in truth? I contend that those are the natural Technocrats, and it’s telling that the original MacLeod pyramid has no place for such. I guess that such people are assumed to be flushed out, and that’s not a bad assumption. That is the fundamental evil of organizational opacity, wherein truth-tellers can be isolated, punished, and ejected. The Lie can push them out, and make itself stronger. Its opponents are either pushed out in a humiliating way (“making an example”), isolated and ejected invisibly, or silenced into non-participation. At the same time, the bad MacLeod Sociopaths learn how to mix their own power with The Lie, an alloying process that makes both stronger.

The Lie loves trust sparsity, because it makes it easy to play divide-and-conquer games against the powerless. Moreover, the only way to get any work done in a trust-sparse environment is to use convex dishonesty. It’s to counterfeit credibility (go ahead, it’s usually a bullshit currency that deserves it) as far as you can, and to live in a “why not?” culture instead of “why you?” by changing your history as much as is needed. That’s a practical necessity for most people (even good people in the desirable subslice of the Sociopath type) if they want to get any work done.

I don’t like that. I don’t like that the need for dishonesty is there. Even good lies– even full-on, obvious-after-the-fact convex dishonesty– are damaging to relationships. My advice: be cautious. Be smart. If a personal relationship is valuable to you outside of the organizational context, don’t pollute it with a lie (even a convex one). But most human organizations won’t let you do X until you’re a “real X” with 5 years of experience in a 3-year-old technology. How do you become a real X? You should just become one, through any means possible. Your decision, today. Better to fake it now than to never make it. “You don’t need to hire an X. I am the resident X. Of course I have production and leadership experience!” Never claim a specific competency that you don’t have, or promise work that you can’t fulfill, but if you need to inflate experiences to tweak perceptions in the right direction, go right ahead. Your enemies are cheating in the exact same way, and they’re much worse people, so why not? If you can afford to live in truth, do it. If you can’t, then bolster your career with enough convex lies to get permission to tackle real work. But then, because you still are a decent person, it’s on you to deliver what you promised.

Ultimately, a lot of decisions aren’t made based on merit, but on gut decisions derived from social status and “feel”. That is why the Draco Malfoy type whose family “was Ivy before George III” sees his career advance just a little bit faster than everyone else’s. It’s not that there’s a conscious decision to promote him based on irrelevant social status. It’s just how people work when trust sparsity has set in and people are waving feeble lanterns at midnight. If you can push yourself forward with just a little bit of convex social-proof arbitrage, then you should. Like I said, I don’t advocate this style of deception if you want a persistent personal relationship– that slight social superiority puts one just-above-zero in a trust-sparse environment, but it’s not worth it to gain that petty sort of elevation in genuine relationships– but it’s a fine way to move about at work.

Or, you can go the other way. You can disobey the Lie. Sometimes that’s the right thing to do, as well. You can get up at 4:00 in the morning when The Lie is asleep and get to work. You can live in truth. Both, as I see it, are morally valid options for the individual.

Organizational benefits of trust density

I consider it morally acceptable for a person to use convex lies to push his career forward. Why? Because most companies put people in roles that are three levels below their frontier of ability. The assignment of fourth-quadrant work is itself dishonest. How am I justified in saying that crappy work assignments are dishonest? The truth about the junk work is that it’s evaluative. It has very low importance in the function of the business, and not much is learned in doing it. Rather, it’s just there to see if the person is “good enough” for real work, a decision that often isn’t made until he’s “paid dues” and “proven himself” in a years-long wringer of boring, unimportant work where there are high expectations of dedication and obedience to managerial authority. In my opinion, this is a terrible statement about an organization. It means that it doesn’t trust its own hiring process.

Some people (MacLeod Sociopaths) bypass all that evaluative time-wasting nonsense and put themselves on real work. This can be done by public honesty (living in truth) but that tends to entail more risk of sudden income loss than most people can tolerate. So usually, they do it in dishonest ways. They fake credentials and experience, careful never to explicitly lie, but fudging on subjectives like “production experience” and “leadership role”. They find social proof arbitrages and credibility trades and hack the system as it exists. This is good for them, but it’s bad for the organization itself.

The Lie can be seen as a waste-pile of formerly convex dishonesties that were useful to the organization at one point but are now pathological. For example, let’s say that the organization was divided on the matter of who should be CEO: John or Kara. John got the job over his more competent co-founder, Kara, because he had “investor connections” that weren’t real and never came through. However, with John as the leader, they were able to work together as a group, and other funding came in later. A convex deception! Three years later, it’s discovered that John’s claim to the CEO job was utterly false. The company can either fire him or (often, the more expedient choice) assimilate his lies by changing the story.

I tend to think of organizational opacity as a core aspect of The Lie, rather than something that just enables it. For example, companies always claim that compensation is fair, but keep specifics extremely murky so that no one can really audit them. The reason they do this, of course, is so they can be unfair when it’s expedient. If they’re desperate for talent, they’ll go up by 20 percent for the people they need without raising salaries across the board. In truth, the culture of opacity and hierarchy that companies create surrounding compensation, division of labor, performance evaluation, and pretty much anything else that matters, is all there to enable expedient lies. Those unfairnesses are supposed to cancel out over time, but MacLeod Sociopaths find ways to turn such errors into a true currency that they can trade and invest for profit. As they do so, The Lie invisibly gains strength. Virtually no one intends to build up The Lie, because almost everyone is acting out of self-interest only. It happens day by day. When compensation becomes unfair and information becomes asymmetric, The Lie gets stronger. When internal headcount limitations are put in place, and closed allocation sets in, The Lie gets much stronger.

Most convex dishonesties are “good lies” of the Stone Soup variety. People are embellishing credentials to counterfeit credibility and therefore be permitted to do real work that’s of benefit to them and the organization. Those convex lies generally don’t contribute to The Lie directly. In fact, these are people fighting against The Lie, by subverting its attempts to disempower them. Unfortunately, they’re often indirectly responsible for feeding The Lie. When a convex lie fails (i.e. the payoff is never realized, and the lied-to parties get burned) people become, justifiably, angry. They bought into a party with counterfeit credibility, and lost. This validates credibility’s necessity! (What happens when people with real credibility fail to deliver? Credibility is defined more conservatively, and the environment becomes more trust-sparse and dysfunctional.) The Lie becomes stronger. Those who are aware of The Lie being a lie are never fully comfortable with it, but they prefer the static falseness of The Lie over the chaos of unknown truth values. This gets to one of The Lie’s stabilizing social purposes. It fights truth hardest of all, and with a vengeance, because that is an existential struggle; but The Lie also fights other lies, and that’s why people tolerate it.

One of the easiest ways to make enemies is to counterfeit some social-status currency (or credibility). That’ll piss off both sides of how people feel about that currency. People who buy into it become enemies, in defense of what was just diluted by the attempted counterfeit. People who oppose that currency despise the counterfeiter with equal fervor, because the fakery validates it. So when people feign credibility for a convex deception and fail, they’re a common enemy for everyone. That’s good for The Lie. The Lie loves common enemies, and if those enemies are liars, it can make itself look truthful or, at the least, “credible” (there’s that concept again). That’s because people tend to assume false dichotomies on a variety of moral issues, creating “sides” that lead to wrong conclusions.

I’ve opined on moral alignment and noted that, while good always treats other forms of good with basic respect– there can be disagreement and debate, but not malicious harm– there is no such covenant among evil. Good respects good as inherently valuable. Evil does not respect other evil; it only values strength. This gives good a certain unifying strength: a more cohesive, visibly altruistic message. While good people often argue endlessly about tactical concerns, they’re all “on the same side”. That leads to a misperception that there’s an “evil side”. There isn’t. Evil fights good and evil. It lacks that cohesiveness. What is it, then, that makes evil strong, with enough power to oppose good with almost equal force? Most people aren’t “aligned with” good or evil. They’re in the weak, indecisive middle. Evil is more willing to recruit them. Good wants to recruit people honestly, and treat them as equals. That doesn’t “scale” into the moral middle classes. Evil is much more comfortable with recruiting them as inferiors and with dishonesty. One time-honored recruiting tactic, for evil, is to choose some powerless (or nonexistent) subsector of evil and punish it brutally, thus appearing to weak souls to be an anti-evil force, thus good. The Lie works in a similar way. I don’t mean to imply that typical status-inflating convex lies are evil, but most people find them to be unethical. When The Lie smashes a caught liar on the rock, it persuades the weak-minded (often, disproportionately represented in the Clueless ranks that are an organization’s muscle) that it stands for what is (if clearly not truthful) ethical, at the least. Of course, that’s a lie on its own.

That is the fundamental problem with convex dishonesty. It’s sometimes expedient, and sometimes a person’s career needs it, but over time it strengthens The Lie (one of whose sources of power is a fear of status-inflators and subversives; being one justifies it). When you run a convex fraud, you’re borrowing credibility on fraudulent terms (stealing) even though you have the (morally good) intent of paying everyone back multiply and making your creditors more than whole. The problem is that if you succeed, you validate that credibility currency that you stole, strengthening The Lie. If you fail, you give the Lie and the useful idiots a common enemy in you, also strengthening The Lie. I won’t call convex dishonesty unacceptable as a means of corporate survival and self-advancement, because it’s often just necessary in a trust-sparse environment, but it is corrosive to organizations. One way or another, this class of dishonesty strengthens The Lie.

An organization that wants to be healthy can’t tolerate The Lie. It needs to kill it at the root. If it’s going to avoid generating one, it needs to create a trust-dense environment where the “bozo bit” is always off. There’s no alternative, because when trust sparsity is in effect, the only people who can succeed (and acquire credibility in the pseudo-meritocracy) are those willing to partake in convex dishonesty. This generates an undesirable selection pattern in which organizational success favors convex dishonesty, which evolves into all-out dishonesty. Over enough time, this moves away from the good-faith, “team-building” convex deception and toward outright “cooking the books”.

Solving It

This is why trust is so damned important, but trust is hard to manage at scale. You might trust your friends, but do you trust their friends? At some point, the warm-fuzzy social currency of trust needs to give way to structure. You actually need to go into the painstaking process of formalizing social contracts. If you’re running a company, what does the Employee Bill of Rights look like? You don’t need one at 8 people; you certainly do need one at 300. You need to set minimum trust, by which I mean giving employees enough basic credibility that they don’t need to perpetrate convex lies to grow and take risks. You also need to set maximum trust, both to crack down on the proto-managerial thugs who’d abuse the power vacuums left by formal management’s fundamental decency to extort others into supporting their career goals, and to give meaning to the minimum trust offered. (If people are “boundlessly trusted”, that just means you’ve been lazy and will rule ad-hoc, because the concept makes no sense.) There’s work to do on where to set the posts, and while I think it’s obvious where I stand (trust people with their own time; distrust those who attempt to control others) I will flesh that out in future installments.

In Part 17, I discussed financial trust and the use of extreme transparency to assure investors, employees, and management that everyone’s being compensated fairly. In Part 18, I discussed industrial trust– do you trust your employees to get the work done, and to do it well?– and how it requires not micromanagement but a self-executive focus on driving toward Progressive Time. Now I’ve discussed the forces that conspire against trust. People either need or think they need convex dishonesty to get things accomplished. Organizations compensate by creating an internal social currency called “credibility”, which evolves its own pile of lies that become The Lie. The Lie generates trust sparsity as its beneficiaries fight for its upkeep, and the organizational self-loathing and dysfunction that come out of trust sparsity generate more convex dishonesty to overcome an increasingly strong Lie. The alternative is to Live in Truth– to name The Lie and stand in opposition to it. Individually, this is dangerous and impotent: you lose credibility, become “disgruntled guy”, then “fired guy”. Collectively, it’s powerful. If The Lie cannot discredit the group as a whole, it falls to pieces. Organizations, however, shouldn’t wait for whistleblowers to call them out. Reliance on individual heroism is not a good strategy; it shows the absence of one. If you want a healthy organizational culture, you have to fight The Lie proactively. Living in Truth must be a central pillar of the culture.