“Around the Godde there forms a Shelle of prayers and Ceremonies and Buildings and Priestes and Authority, until at Last the Godde Dies. And this may notte be noticed.” — Terry Pratchett, Small Gods
Every dysfunctional human system starts with a story of extraordinary achievement. In the USSR, it was Lenin and the Soviet revolution. In tech, it’s Steve Jobs. Because Jobs knew he had excellent product sense and a flair for marketing, he famously walked the line between inflexibility and abuse, and tech has since reinterpreted those qualities as evidence of genius (in white men). Of course, this gets the causation exactly backward. No one would ever have put up with Jobs without constant proof that his business acumen was worth it. But that key context has been forgotten with the passage of time.
The rush to generalize behaviors that worked in a specific context, at a specific time, with specific people, has caused untold systemic problems within tech. For example, I’ve watched many tech managers research their positions by reading blog posts, books, advice columns, and so on — instead of simply listening to their teams and putting effort into solving their actual problems. In fact, in my experience, they mostly use the advice to explain to their reports why they won’t take feedback seriously.
On some level, everyone in tech knows that the system and its founding stories are inadequate. They see the toll this philosophy takes on those without organizational power, and they do genuinely feel bad about it. But they also believe that similarly managed tech startups are succeeding all around them, and so they refuse to conclude that the problem is with the overall system. (It’s worth noting that the belief that other companies are similarly managed or successful is almost always based on buzzword-filled conversations or generic news, not on actual knowledge.) In this way, tech leaders box themselves into the only remaining option, one that saves their feelings without changing anything:
Parallel #2: Lies
I won’t go into too much detail about the public lies that everyone already knows about, like the extent and usage of collected data, wild exaggerations of the utility of new products, and so on. The bottom line is that tech leaders feel comfortable with these falsehoods because tech’s internal culture is saturated with lies. Here Ariely mentions lying to investors, and in fact it’s common for tech startups to keep three sets of goal numbers (referred to as OKRs): one for employees, one for the board, and one for potential investors. Everyone involved knows this, and the resulting dysfunction is not so different from the USSR’s split between its internal numbers and its propaganda numbers.
Throughout The Inventor, people refer to Elizabeth Holmes, the founder of Theranos, as believing that she was doing the same thing as the startups around her, with the implication that she wasn’t. But she was. The inflated expectations, the hacked-together or faked demos, the pathological optimism — that’s tech in a nutshell. I don’t think there’s a single company in the Valley that hasn’t faked a demo at some point, or wildly exaggerated its internal capabilities to the point of outright lies. (Most machine-learning-based startups, for instance, aren’t actually based on machine learning.) What people don’t realize is that you can only get away with this to the extent that your business relies on code, which allows you to literally rewrite the reality of your company. The Theranos/Fyre/Chernobyl mistake was applying this philosophy to situations where reality wasn’t so easily manipulated.
Everyone in tech pays lip service to the idea that failure is possible. But too many offset this by, on some level, buying into the idea that their particular system is infallible. They won’t phrase it that way, instead saying things like “Well, it works at other companies,” without knowledge of those internal contexts, or that the founder/CEO has led previous companies to success, leaving implicit the assumption that that success is repeatable. But good outcomes only heighten the delusion of infallibility.
The best employees and leaders are self-aware enough to understand the lie and take it into account when making decisions. Their mental labor makes the system powerful, and over time, it becomes easier and easier to believe the fantasy, especially since those who believe it argue less and are therefore more likely to be promoted. Yet even necessary lies irritate the mind. They itch at our souls. The life they bring us begins to feel worthless, and the systems they prop up degenerate.
What does that look like?
Parallel #3: Toxic hierarchy
Anxious people want control, and in a large, degenerate system, no one is more anxious than the people at the top. If leaders also believe that their system is infallible, the only way to deal with their doubts is to dig in their heels and categorically refuse the mental effort of evaluating whether they’re using their power well. They know there’s more they should be doing, and they spend every minute of every day choosing not to do it.
Interestingly, tech used to pride itself on not having this problem. At the beginning of my career, the trendy startups of the day had done away with the whole idea of management. Of course, this resulted in the tyranny of structurelessness. Pretending that humans don’t form hierarchies is another form of self-delusion, and even worse, the implicit hierarchies that everyone busily denied were tied to outcomes like compensation. The upshot is that managers are now considered necessary, but most people in those positions have never actually experienced good management. So the old problems of hierarchy are reasserting themselves.
Whether the people in them realize it or not, toxic hierarchies often trace back to military-style management. The context that doesn’t make it across is that the military is also the only human institution where awareness of toxic hierarchy is (in the best case) culturally ingrained.
Although I’ve never served in the military, I suspect that this culture comes partially from the kind of work soldiers do: discrete, high-stakes missions with well-defined risks and rewards. Civilian leaders need to direct ongoing and often ambiguously low-stakes processes, which make the problems with the system feel simultaneously larger and less worth fixing.
Another form of toxic hierarchy is discrimination against socially powerless groups, which both tech and the USSR are famous for. At its root, this discrimination isn’t about the specifics of religion, gender, race, sexual orientation, or what have you. It is, instead, about access to power. In a toxic hierarchy, might makes right, so someone without any kind of social backing by definition must not be worth listening to. This is amplified by cultural differences between majority and “minority” groups, which require mental effort to understand and act on. An anxious, mentally inactive leader will interpret this as a sign that the issue is insignificant, and reassert their power.
The original scenes of most of these gifs depict harsh verbal abuse, including insults, name-calling, yelling, and so forth. But the core of abusive management is the absolute refusal to even consider putting forth mental effort. A manager doesn’t need insults to refuse everything except total agreement, preferably phrased to suit their managing-up requirements. In my experience, the idea that loud verbal assaults are the only mark of toxicity worsens management overall. A manager who refuses to experience emotions is often a manager who uses their power to calmly impose perfectionism on their reports.
Abusive management starts at the top, with CEOs who spend more time fundraising than making critical business decisions. (Or, sometimes, who just refuse to make any decisions at all.) The next level of leaders must then make those decisions as a committee, and defend them by managing up. This leaves them no time for their jobs, which imposes the same dynamic on their reports, and so on down the org hierarchy. Soon everyone in the company is consumed with making up for the deficiencies of the people above them. This leads to:
Parallel #4: Impatience and corner-cutting
To be sure, some amount of corner-cutting can be necessary to move forward. A good leader correctly judges where to draw that line. Abusive managers backed by the delusion of an infallible system use it as an excuse to do as they like. And if you cut too many corners, you end up with a round peg for a square organizational hole, which leads to even greater delays, aka excuses for further corner-cutting.
When this happens systematically for low-stakes processes, the outcome is unworkable operations.
A surprisingly well-kept secret of tech companies is how many of the basic operations are carried out manually. I don’t just mean the publicized examples of content moderation or other machine-learning scandals. Manual execution is used for all sorts of fundamental processes, often in lieu of extremely simple automation. It’s baffling — you’d think any organization run by engineers would understand that automation is the fundamental value add of software. But most engineering managers drastically under-prioritize operational automation, preferring instead to focus on new features or code standards. These are both important, but the focus on them is another artifact of tech’s pure code past. A company with major human operations must first get those right, meaning quickly repeatable with a very low error rate and the minimum of manual intervention. In my experience, this is the most difficult perspective shift to ask of engineers, eng managers, and PMs.
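To make “extremely simple automation” concrete, here is a minimal sketch of the shift the paragraph above describes: a manual runbook recast as a script that is repeatable, self-checking, and retries on transient failure. Every name here is a hypothetical illustration, not a real process from the essay.

```python
# A minimal sketch of operational automation: run each step of a
# process in order, verify it succeeded, and retry a bounded number
# of times. All step names are hypothetical examples.

def run_step(name, action, check, retries=2):
    """Run one operational step, verify it with `check`, retry on failure."""
    for _attempt in range(retries + 1):
        action()
        if check():
            return True
    raise RuntimeError(f"step {name!r} failed after {retries + 1} attempts")

def run_pipeline(steps):
    """Execute (name, action, check) steps in order; stop at first failure."""
    completed = []
    for name, action, check in steps:
        run_step(name, action, check)
        completed.append(name)
    return completed

# Toy stand-ins for real steps like "export the report" or
# "provision the account": each mutates shared state, and its
# check confirms the mutation took effect.
state = {}
steps = [
    ("export", lambda: state.update(exported=True),
               lambda: state.get("exported")),
    ("verify", lambda: state.update(verified=True),
               lambda: state.get("verified")),
]
```

The point of the sketch is the shape, not the specifics: once each step has an explicit success check, the process becomes quickly repeatable with a low error rate and minimal manual intervention, which is exactly the bar the paragraph above sets for human operations.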
Finally, many have already pointed out a key similarity between the USSR and modern tech: overpromising and underdelivering on products. This is a natural consequence of unworkable operations, corner-cutting, and toxic hierarchy. Everyone is so busy dealing with the system that there’s not much energy left to make good product decisions.
What happens when that failure becomes apparent?
Parallel #5: Total dysfunction
The above gif is as close to tech-startup dysfunction as makes no difference. Notice how Dyatlov and Fomin both try to shift blame onto each other through the use of precise names and titles, and how Fomin agrees that the hydrogen tank ignited, with absolutely no independent assessment of the situation. When he says it’s the only logical explanation, it’s not only because he’s unaware of the RBMK reactor design flaw. He’s also agreeing that the tank explosion is the most suitable story for an early-morning meeting with an irate committee. In other words, it must be the right explanation simply because it’s the easiest one to say out loud.
In my experience, this reasoning is also a well-tested defensive posture. There’s very little accountability for people (well, for men) who worsen situations with an initially wrong assessment, as long as everyone feels that that wrong assessment was “logical.” (In tech this is summarized by the famous saying, “No one was ever fired for recommending Microsoft.”) Crises end up being treated as litmus tests to prove that everyone is working through the same thought process. This results in pathological secrecy, especially when anxiety over competition is factored in.
And once anxious, mentally inactive leaders realize how many secrets there are, they become even more fearful.
Most startups will tell you that they’re absolutely not siloed, that they truly understand the utility of everyone being able to touch different parts of the business. But getting value from that approach requires better management than most tech leadership is capable of. So even if they’re committed to matrix management or full transparency, tech managers often end up finding new ways to silo people. For example, I once worked on a team where all the women had the same complaint about the team leadership. This took a year to figure out, because the managers told each woman that her problems were unique and that discussing them with the rest of the team would be unprofessional and pointless. Although we weren’t officially siloed, it became clear that we had been unofficially kept apart to make the managers’ lives easier, in much the same way that Theranos kept its teams apart to make its leaders’ lives easier.
The last aspect of this dysfunction is a pathological normalization of failure. After all, if everyone thinks the same way and the system is too big to change, then there’s literally no way to collectively learn from failure. As long as ideological orthodoxy is maintained, failures are written off as flukes, or excused as simply inevitable, but are never, ever a reason to change approach, unless that change is forced.
At this point, the system is no longer capable of success.