Section II

Historical Background On The Phenomenon


Using context to understand why hackers set out to build digital currency systems.

“Corporations have neither bodies to be punished, nor souls to be condemned; they therefore do as they like.”

— Edward Thurlow, Lord Chancellor of Great Britain, 1778-1792.[35]

Satoshi Nakamoto was the first participant in his own network, and left a message within the very first “block” of data produced by Bitcoin. The message within this so-called Genesis Block read:


Figure 1. The message left by Satoshi Nakamoto in Bitcoin’s Genesis Block.
(Credit: Reddit)[36]

The original headline appeared in the British newspaper The Times (see figure below). The inclusion of this note has been a source of widespread confusion.

Given what we know about Nakamoto’s motivation to create a free economic space outside the purview of institutional oversight, it would seem that this message makes light of the sympathetic relationship between politicians and central bankers. Many people use this allusion to infer that Bitcoin was purpose-built as some kind of disruptor or destroyer of central banks. Taken this way, the headline would seem to be a statement of superiority or self-righteousness.

We suggest that this is a mischaracterization. If Bitcoin does evolve into a large-scale alternative currency system, then Nakamoto’s use of The Times headline will strike historians as timely, but it is more than just a political statement.


Figure 2. The headline reproduced in the Genesis Block.
(Credit: Twitter)

In fact, putting a headline in the Genesis Block has a second, more practical purpose: it serves as a timestamp. By reproducing the text from that day’s paper, Nakamoto proved that the first “block” of data produced by the network could not have been made before that day. Nakamoto knew Bitcoin was a new kind of network that prospective participants would scarcely believe was real. At the outset, it would be important to send a signal of integrity to people who might join. Getting volunteers to value the project was the top priority, indeed a far higher priority than mocking central bankers.
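
To make the mechanism concrete, here is a minimal sketch in Python. It is not Bitcoin’s actual block serialization or mining procedure; the field names are placeholders. The point is only that a block hash which commits to a newspaper headline cannot have been produced before the headline was published.

    import hashlib

    # A minimal sketch, not Bitcoin's real block format: the block hash commits
    # to the embedded headline. Because the headline could not have been known
    # before publication day, the block cannot predate the newspaper.

    headline = "The Times 03/Jan/2009 Chancellor on brink of second bailout for banks"
    other_fields = b"version|previous_block_hash|merkle_root|timestamp|nonce"  # placeholders

    block_bytes = headline.encode("utf-8") + b"|" + other_fields
    block_hash = hashlib.sha256(hashlib.sha256(block_bytes).digest()).hexdigest()

    # Changing even one character of the headline produces a completely different
    # hash, so a block bearing this hash must postdate the newspaper text.
    print(block_hash)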

For investors outside the technology industry, understanding this volunteer-based way of working is critical to grasping why Bitcoin operates the way it does, and why it is an improvement on conventional methods of human collaboration. To get to these points, we will first explore the origins of the “war” that Satoshi is engaged in, and how the invention of Bitcoin is meant to turn the tide.

The old friction between technologists and management

For the last 50 years, corporate technology companies have been increasingly at odds with the engineers who build their critical systems. Recent headlines tell the story: at Microsoft, Amazon, and Salesforce, employees protested contracts with Customs and Border Protection and ICE.[37] [38] At Google, employees protested the company’s Project Maven AI contracts for the Department of Defense, which promised to increase the accuracy of drone strikes; Google bowed out of Project Maven, but has said it will continue to work with the US military on other projects.[39] [40] Google’s announcement that it would agree to censor search results inside China drew 1,400 workers to protest.[41] Microsoft is facing a lawsuit by two employees who say they suffered PTSD after viewing child pornography as part of “content moderation” roles.[42] YouTube employees describe their jobs as a “daily hell of ethics debate.”[43] Facebook has faced protests over the gentrification wrought by its tens of thousands of employees, as well as more recent protests over its “intolerant” political culture.[44] [45]

Other abuses of technological systems include the personal data breach at Equifax, and the abuse of account-creation privileges within Wells Fargo’s computer systems, where accounts were opened and cards issued, in some cases with forged signatures, in service of sales goals.[46] [47] The worst example of abusive corporate software might be COMPAS, the automated sentencing software employed by some court systems, which has been shown to recommend prison terms based on the convict’s race.[48]

Tensions between software developers and their employers have spilled out of Silicon Valley and into mainstream news. “This engineer’s lament is a microcosm of a larger trend sweeping across the Peninsula” of San Francisco, reported Vanity Fair in August of 2018:[49]

“In Silicon Valley’s halcyon days, employees didn’t have any qualms about the ethics of the companies they were joining since many honestly believed that they were going to advance a corporation that was going to—yes—change the world. The people who helped transform the Bay Area into the greatest wealth-generation machine in human history—and themselves into millionaires and billionaires in the process—are now turning their backs on the likes of hegemonic corporations who, in their own depictions, moved fast and broke things without an end in sight.”

The article quotes an anonymous Uber executive who fears that ethical issues will motivate engineers to leave en masse: “If we can’t hire any good engineers, we’re fucked.”

This is a liminal moment in business, where the “good engineers” suddenly have leverage over the wealthy and elite management of some of the largest corporations in the history of the world. This development did not arrive overnight; it has its origins in a tension that began decades ago.

Next, we will look at how the balance of power shifted, and how Bitcoin tips the scale further for the “good engineers.” To appreciate how the software engineers got their leverage, we must begin in the early 20th century, and learn how managers and engineers got to be at odds in the first place.

The emergence of the corporate institution (1900-1929)

The study of human behavior in a business context has a rich tradition. Perhaps the first person to take a meaningful step forward in this discipline was Frederick Winslow Taylor. “Taylorism,” his conception of management science, was all about rational planning, reducing waste, analyzing data, and standardizing best practices.[50] Business owners used these techniques to drive workers uncommonly hard. Andrew Carnegie obsessed over worker productivity, becoming so frustrated with the Homestead Strike of 1892 that he hired a private police force to have picketing workers shot.[51]

Thorstein Veblen was a Norwegian-American economist who published his seminal study of the practitioners of management science in 1904. He developed a series of insights about the nature of “institutions,” as distinct from the “technologies” they use. This distinction is a good starting point for understanding the problems that arise for people who create new technologies within institutions.[52]

An important aspect of Veblen’s concept of the “institution” is that institutions are by nature non-dynamic: they resist changes that don’t benefit the people at the top of the hierarchy. Hierarchy persists through what Veblen called “ceremonial aspects,” traditional privileges that serve to elevate the decision-makers. It is new technological tools and processes that make the institution profitable, but so-called “spurious” tools may also be produced because they have ceremonial aspects that make management look or feel good.[53]

After the Great Depression, the historian and sociologist Lewis Mumford would develop the idea that “technology” had a dual nature. Polytechnic developments involved complex frameworks that combined technologies to solve real human problems; monotechnic developments were technology for its own sake.[54] Monotechnics oppress human beings, Mumford argued, citing the automobile as one such development, which crowded pedestrians and bicyclists out of the roads and led to a massive annual death toll on American highways.

The institutions of the day, corporations and governments, Mumford called megamachines. Megamachines, he said, are composed of many human beings, each with a specialized role in a larger bureaucracy. He called these individuals “servo-units.” Mumford argued that for these people, the specialized nature of the work weakened psychological barriers against questionable commands from leadership, because each individual was responsible for only one small aspect of the machine’s overall goal. At the top of a megamachine sat a corporate scion, dictator, or commander to whom god-like qualities were ascribed. He cited the lionization of Egyptian pharaohs and Soviet dictators as examples.

Ceremonial, spurious, monotechnic developments could lead to extremely deadly megamachines, said Mumford, as in the case of the Nazi war machine. This phenomenon stemmed from the abstraction of the work into sub-tasks and specialties (such as assembly-line work or radio communications). This abstraction allowed the servo-units to work on extreme or heinous projects without ethical involvement, because each was responsible for only one small step of the larger process. Mumford called the servo-units in such a machine “Eichmanns,” after the Nazi official who coordinated the logistics of deportations to the concentration camps in World War II.

In the early 20th century, the new and trendy field of “management science” was greatly influenced by Fordism, the practices of Henry Ford. Fordist production was characterized by a rigorous and somewhat dreary focus on efficiency, specialization, and mass production, paired with reasonable hours and living wages.[55] But when the Great Depression came, owners like Ford laid off workers by the tens of thousands. Wages dropped, but the punishing nature of the work remained.

Ford Motor Company laid off 60,000 workers in August of 1931. Less than a year later, security guards opened fire on several thousand picketing workers, killing four and wounding 25. Henry Ford placed machine-gun nests around his home, and equipped guards with tear gas and surplus ammunition.[56] As the 1930s wore on, American workers continued to riot and picket against owners’ ruthless tactics.

Modern management emerges to protect workers (1930-1940)

After the Depression, a class of professionals emerged to take major business decisions away from the business owners. Industry would be run by professional managers, who would execute plans in the best interest of both the owners and the employees, and who derived their positions and power from their competence, not their percentage of ownership. Greedy shareholders could be held at bay in this new structure.[57] John Kenneth Galbraith, the Harvard economics professor, studied this phenomenon at the time:

“The power passed from one man—there were no women, or not many—into a structure, a bureaucracy, and that is the modern corporation: it is a great bureaucratic apparatus to which I gave the name the Technostructure. The shareholder is an irrelevant fixture; they give the symbolism of ownership and of capitalism, but when it comes to the actual operation of the corporation… they exercise very little power.”[58]

This “bureaucratic apparatus” of the Technostructure consisted of upper-tier managers, analysts, executives, planners, administrators, operational “back office” staff, sales and marketing, controllers, accountants, and other non-technical white-collar staff.[59]

In 1937, Ronald Coase, the future Nobel laureate, built on the ideas of the managerial scientists to explain why these massive firms were emerging, and why they accumulated so many workers. He theorized that this behavior was rational, and was aimed at reducing transaction costs. He wrote:

“The source of the gain from having a firm is that the operation of a market costs something and that, by forming an organization and allowing the allocation of resources to be determined administratively, these costs are saved.”[60]

In other words, when hiring skilled labor, it is cheaper to retain a salaried worker who returns each day than to go out each day and select a new temporary candidate from a pool of contractors in a “market.” He continued:[61]

“Firms will emerge to organize what would otherwise be market transactions whenever their costs were less than carrying out the transactions through the market.”
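
Coase’s condition for the boundary of the firm can be restated as a simple inequality (our paraphrase and notation, not Coase’s own):

    % Paraphrase of Coase (1937), notation ours: a transaction is brought inside
    % the firm when coordinating it internally costs less than using the market.
    \[
      C_{\text{firm}}(t) \;<\; C_{\text{market}}(t)
      \quad\Longrightarrow\quad
      \text{organize transaction } t \text{ within the firm.}
    \]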

The corporation was the most efficient way to mass produce and distribute consumer goods: it tied together supply chains, production facilities, and distribution networks under centralized management.[62] This increased efficiencies and productivity, lowered marginal costs, and made goods and services cheaper for consumers.

Managerial bureaucracy becomes abusive to the engineer class (1940-1970)

As of 1932, the majority of these corporations were, for all practical purposes, no longer controlled by their major shareholders; economists classified them as “management-controlled.”[63] The management fad that became known as the “separation of ownership and control” spread throughout the major public corporations.

The moral hazards of management-controlled companies became increasingly obvious as the 1930s wore on. Management-controlled companies were run by executives who, despite not owning many shares, eventually achieved “self-perpetuating positions of control” over policy, because they were able to manipulate the boards of directors through proxies and majority shareholder votes.[64] These machinations sometimes created high levels of conflict. In the early 1940s, the idea emerged that this structural divide in the corporate world was being mimicked in the social and political worlds, with a distinct elite “management class” emerging in society.[65]

Institutional economists drew a distinction between the management class and the class of “technical operators” (the people doing the work, in many cases engineers and technicians). The managerial elite consisted of the “analysts” or “specialists” who acted as the bureaucratic planners, budgetary allocators, and non-technical managers.[66]

A strange power dynamic developed between the analysts and the technical staff in the computer companies that emerged between 1957 and 1969; this dynamic was studied by industrial economists in both the UK and the US.[67] They found that the analysts jockeyed for power, creating conflict. They won favor and influence within the company by expanding their divisions, creating opportunities to hire more direct reports or to win a new promotion, a tactic known as “empire building.”[68] The overall effect on the organization was misallocation of resources and incredible pressure to grow.[69] Sales and development cycles were persistently rushed. The computer analysts’ slogan became “if it works, it’s obsolescent”; the analysts had “a vested interest in change.”[70]

This dynamic created dysfunction. Managers used a variety of social tactics to enforce their will and agenda in spite of technical realities, reflecting Veblen’s observations about the “ceremonial” aspects of institutions 75 years before.[71] Documented tactics included:

  • Organizational inertia: New and threatening ideas are blocked with “idea killers” such as "the boss won't like it," "it's not policy," "I don't have the authority," "it's never been tried," "we've always done it that way," and "why change something that works?"
  • Budget games: “Foot in the door,” where a new program is sold in modestly, concealing its real magnitude; “Hidden ball,” where a politically unattractive program is concealed within an attractive one; “Divide and conquer,” where approval of a budget request is sought from more than one supervisor; “It's free,” where it is argued that someone else will pay for the project so the organization might as well approve it; “Razzle-dazzle,” where a request is supported with voluminous data, but arranged in such a way that their significance is not clear; “Delayed Buck,” where deliverables are submitted late, with the argument that the budget guidelines require too much detailed calculation; and many others.

These tales from the 1960s anticipate the emergence of the popular cartoon Dilbert in the 1990s, which skewered absurd managerial behavior. Its author, Scott Adams, had worked as a computer programmer and manager at Pacific Bell from 1986 to 1995.[72]


Figure 3. Dilbert captured the frustration of software engineers in a corporate setting.
(Credit: Scott Adams)[73]

Group identity develops amongst professional technologists (1980-2000)

The dictatorial behavior of the management class belied the true balance of power in technical organizations.

In the 1980s, the entire weight of many industrial giants rested upon their technologists. But their role put them in a strange position, at odds with the rest of the organization. Placed at the margins of the organization, closest to the work, they were removed from the C-suite and its power plays. Because they did not work with executives directly, the technologists identified far less with the heads of the company than did the managers, who reported directly to the C-suite.[74]

The technologists’ work was enjoyable to them, but opaque to the rest of the organization. A power dynamic emerged between the technical operators and the rest of the company; their projects were difficult to supervise, and proceeded whimsically, in ways that reflected the developers’ own interests.[75]

Their power to work this way originated in their critical skills. These skills acted as a wedge within organizations, earning technical operators considerable freedom of direction. The efficacy of this wedge increased when the technical operator provided a skill in great demand, affording them job mobility and reducing their dependence on the organization. Company ideology was typically not a strong force amongst technologists, in comparison to “professional ideology,” the belief in the profession and its norms.[76] The elite technologists were becoming outsiders within their own companies.

Instead of loyalty to company or CEO, technologists developed, as a professional goal, loyalty to the end-user or client. A company’s technologists were focused on the needs of the existing customer, while the analysts and managers (whose work did not deal directly with the end-user) supported more abstract goals like efficiency and growth.[77]

The hacker movement emerges

The hacker movement originated amongst software-makers at MIT in the 1960s.[78] Perhaps because it was seen as an antidote to the managerial dysfunction inside the older corporate tech companies, the hacker movement’s focus on practical, useful, and excellent software spread rapidly across the country in the 1980s and 1990s.[79] MIT software activist Richard Stallman described hackers as playful but diligent problem-solvers who prided themselves on their individual ingenuity:[80] [81]

“What they had in common was mainly love of excellence and programming. They wanted to make their programs that they used be as good as they could. They also wanted to make them do neat things. They wanted to be able to do something in a more exciting way than anyone believed possible and show ‘Look how wonderful this is. I bet you didn't believe this could be done.’ Hackers don’t want to work, they want to play.”

At a conference in 1984, a hacker who had gone to work at Apple to build the Macintosh described hacker status as follows: “Hackers can do almost anything and be a hacker. It’s not necessarily high tech. I think it has to do with craftsmanship and caring about what you’re doing.” [82]

The hacker movement is not unlike the Luddite movement of the early 19th century, in which cotton and wool artisans in central England rose up to destroy the Jacquard looms that threatened to automate their work.[83] Unlike the Luddites, who proposed no better alternative to the loom, hackers came up with another approach to making software, one which has since produced products superior to their commercial alternatives. By using the Internet to collaborate, groups of volunteer developers have come to produce software that rivals the products of nation-states and corporations.[84]

New Jersey style emerges

The “New Jersey style” of hacking originated with Unix engineers at AT&T in suburban New Jersey. AT&T was bound by a 1956 antitrust settlement which precluded it from entering the computer business; thus it was free to circulate the computer operating system it had built, called Unix, to other private companies and research institutions throughout the 1970s. The source code was included, and these institutions regularly modified it to run on their particular minicomputers. Hacking Unix became a cultural phenomenon within R&D departments around the US.

Unix was rewritten for personal computers by several groups of developers. Linus Torvalds created his own version, “Linux,” and distributed it for free, just as AT&T had done with Unix. (As we will show, Linux has become enormously successful.) The approach taken by Torvalds and other Unix hackers uses playfulness as an energizing force to build useful (if difficult) free software projects.[85] The Finnish computer scientist and philosopher Pekka Himanen wrote at the time: “To do the Unix philosophy right, you have to be loyal to excellence. You have to believe that software is a craft worth all the intelligence and passion you can muster.”[86]

R&D developers realize “Worse is Better”

Out of New Jersey style, software engineers developed a set of ad-hoc design principles that went against the perfectionism of institutionalized software. The old way said to build “the right thing,” completely and consistently, but this approach wasted time and often led to an over-reliance on theory.

Written during the early 1980s by Richard Gabriel and published by Netscape Navigator engineer Jamie Zawinski in 1991, the “worse-is-better” philosophy boiled down the best of New Jersey style and hacker wisdom. It was seen as a practical improvement on the MIT-Stanford hacker approach. Much like the MIT ethic, worse-is-better values excellence in software. But unlike MIT-Stanford, the worse-is-better approach redefines “excellence” in a way that prioritizes positive real-world user feedback and adoption over theoretical ideals.

Worse-is-better holds that, so long as the design of the initial program is a clear expression of a solution to a specific problem, it will take less time and effort to implement a “good” version first and adapt it to new situations than to build a “perfect” version straight away. Releasing software to users early and improving it often is sometimes called “iterative” development.

Iterative development allows software to spread rapidly and benefit from real-world reactions from users. Programs released early and improved often become successful long before “better” versions written in the MIT approach have a chance to be deployed. With two seminal papers in 1981 and 1982, the concept of “first-mover advantage” emerged in the software industry around the same time that Gabriel was formalizing his ideas about why, in networked software, “worse is better.” [87] [88]

The logic of worse-is-better prioritizes viral growth over fit and finish. Once a “good” program has spread widely, there will be many users with an interest in improving its functionality and making it excellent.[89] An abbreviated version of the principles of “worse is better” appears below. The principles admonish developers to avoid doing what is conceptually pleasing (“the right thing”) in favor of doing whatever results in practical, functional programs (emphasis added):

  • Simplicity: This is the most important consideration in a design.
  • Correctness: The design must be a correct solution to the problem. It is slightly better to be simple than correct.
  • Consistency: Consistency can be sacrificed for simplicity in some cases, but it is better to drop those parts of the design that deal with less common circumstances than to introduce either implementational complexity or inconsistency.
  • Completeness: The design must cover as many important situations as is practical. Completeness can be sacrificed in favor of any other quality. In fact, completeness must be sacrificed whenever implementation simplicity is jeopardized.

These conceptual breakthroughs must have been exciting to the technologists of the early 1980s. But the excitement would soon be disrupted by rapid changes in business.

The shareholders use hostile takeovers to clamp down on everyone

The hacker-centric environment inside universities and large research corporations collapsed, and researchers at places like the MIT AI Lab were poached away by venture capitalists to continue their work, but in a proprietary setting.[91] The hostile-takeover trend had begun a decade before in the UK, where clever investors noticed that many family-run businesses were no longer majority-owned by their founding families. Financiers like Jim Slater and James Goldsmith quietly bought up shares in these companies, eventually wresting enough control to break up and sell off units of the companies. This became known as “asset stripping,” and we will return to the topic in Section VII of this essay.[92]

In the 1980s, American bankers hit upon a way to finance takeovers at massive scale by floating so-called junk bonds, then busting up the target company and reaping enormous rewards from the sale of the parts.[93] In this way, managerial capitalism eventually lost its hold over business, and became a servant of the capital markets.

“Activist investors” came to represent shareholder interests, and took action to fire and hire C-suite executives who would maximize share price.[94] As the 1990s dawned, many hackers saw their companies struggle to contend with shareholder demands, the threat of hostile takeover, and competition from new Silicon Valley startups.

As tech companies moved faster, they developed ways for management to enforce policy and resource allocation. Microsoft and others adopted a rigorous “stack ranking” system whereby employees were assigned numerical scores at regular intervals through a “performance review” process, in order to determine promotions, bonuses, and team assignments. A certain percentage of the bottom-ranking employees were fired. This system is still used by some tech companies today, though Microsoft abandoned it in 2013.[95] Google adopted stack ranking more recently to establish eligibility for promotions, but does not fire poorly-scoring employees.[96] Stack ranking systems are widely hated for the uncomfortable power dynamics they create.[97] [98]
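
As an illustration only, here is a hypothetical Python sketch of the forced-ranking mechanic described above; the employee names, scores, and the 10 percent cutoff are assumptions for the example, not any company’s actual policy:

    # Hypothetical sketch of "stack ranking": score employees at review time,
    # rank them, and cut the bottom fraction. All numbers here are illustrative.
    def stack_rank(scores, cut_fraction=0.10):
        """Return (retained, cut) employee lists, given review scores."""
        ranked = sorted(scores, key=scores.get, reverse=True)  # highest score first
        n_cut = int(len(ranked) * cut_fraction)                # size of bottom group
        return ranked[:len(ranked) - n_cut], ranked[len(ranked) - n_cut:]

    reviews = {"alice": 4.2, "bob": 3.1, "carol": 3.8, "dave": 2.4, "erin": 3.5,
               "frank": 2.9, "grace": 4.0, "heidi": 3.3, "ivan": 2.6, "judy": 3.9}
    retained, cut = stack_rank(reviews)
    print("retained:", retained)  # everyone above the cutoff
    print("cut:", cut)            # bottom 10% by score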

Today, investors demand from their companies precise predictions about each quarter’s profitability, and less attention is paid to capital investment. Tesla is one notable technology company that has articulated the way quarterly guidance and short-termism diminish a high-tech company’s long-term prospects.[99] According to the Business Roundtable, a corporate alliance chaired by JPMorgan Chase CEO Jamie Dimon, quarterly guidance has become “detrimental [to] long term strategic investments.”[100]

Summary

In this section, we have looked at the ways that 1940s-era management made life unpleasant for high-tech workers, and how these patterns persisted into the 1990s, disenfranchising technical workers. We’ve shown that a strong “guild” identity developed which transcends loyalty to the employer. We’ve associated this identity with the growth of hacker culture and its principles.

Next, we will explore how antipathy towards the management class grew into a wider suspicion of all institutional oversight, and how the hackers’ struggle to get out from under such oversight acquired a moral dimension. We will examine why hackers looked to cyberspace and cryptography for sanctuary, with a determination to build new tools outside the purview of the management class. We will consider the surprising success of free software tools produced by hackers, and the ways that corporate employers have alternately fought, and tried to emulate, hacker methodology. Finally, we will encounter Bitcoin as the realization of many hacker ambitions in a single network.