
Emergent Consensus: What's Possible for Corporations with Bitcoin

In Bitcoin, the expenditure of energy by miners to create block hashes is an act which crystallizes the financial value of each transaction. As more blocks get added to the Bitcoin blockchain, the cost of reverting a past transaction increases, and with it the probability that the transactions in a block are final. Proof-of-Work is cumulative: the more computing power on the network, the more expensive it becomes to attack, and the more secure the ledger.

In Bitcoin's original whitepaper, Section 4, "Proof-of-Work," reads as follows:

“To implement a distributed timestamp server on a peer-to-peer basis, we will need to use a proof-of-work system… Once the CPU effort has been expended to make it satisfy the proof-of-work, the block cannot be changed without redoing the work. As later blocks are chained after it, the work to change the block would include redoing all the blocks after it.”

Conceptually, Proof-of-Work burns energy in block issuance, which allows network participants to view immutability objectively. Proof-of-Work reduces the entropy within the system by consuming energy to create machine consensus around an ordered set of transactions. The cost of electricity is borne collectively by miners to find "order" in "chaos" without a central coordinating agent. This is the process through which physical resources (i.e., energy) are transformed into digital resources in the form of blocks of transactions, and the coinbase rewards which are the outcome of block production. Because these digital assets (i.e., blocks and transactions) are encoded on physical computer memory, it can be said that the Proof-of-Work process sublimates electricity into a physical bearer instrument, much as gold mining and minting produce gold coins.
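To make the mechanics concrete, the sketch below (a toy model, not Bitcoin's actual 80-byte block header format or difficulty-adjustment rules) shows the core loop: a miner repeatedly hashes a candidate block with an incrementing nonce until the double-SHA-256 digest falls below a target, and anyone can then verify the expended work with a single hash.

```python
# Toy illustration of Proof-of-Work: repeatedly hash a candidate block
# until the double-SHA-256 digest falls below a difficulty target.
# Simplified sketch only; real Bitcoin headers and targets differ.
import hashlib

def mine(prev_hash: str, tx_root: str, difficulty_bits: int = 20) -> tuple[int, str]:
    target = 2 ** (256 - difficulty_bits)          # smaller target = more expected work
    nonce = 0
    while True:
        candidate = f"{prev_hash}{tx_root}{nonce}".encode()
        digest = hashlib.sha256(hashlib.sha256(candidate).digest()).hexdigest()
        if int(digest, 16) < target:               # costly to produce, cheap to verify
            return nonce, digest
        nonce += 1

nonce, block_hash = mine("00" * 32, "deadbeef")
print(nonce, block_hash)
```

Raising `difficulty_bits` by one roughly doubles the expected number of hashes, which is why accumulated blocks represent accumulated energy expenditure.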

What is the purpose of elucidating this process in such detail? Why does this make Bitcoin the ideal foundation for a new form of organization?

Blocks order transactions

We know that Bitcoin hashes together groups of transactions to create a single, verifiable block. We also know that the blockchain creates a transaction history that cannot be changed without expending enormous amounts of energy. Accomplishing these two feats required some ingenuity on Satoshi Nakamoto's part, so before we talk about what Bitcoin enables, let's talk about its most unique innovation.
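As a concrete illustration of that "hashing together," here is a minimal sketch of a Bitcoin-style Merkle root: transaction hashes are paired, each pair is double-SHA-256 hashed, and the process repeats until a single root remains. Byte-order and serialization details of real Bitcoin blocks are omitted.

```python
# Simplified Bitcoin-style Merkle root: pair up transaction hashes,
# double-SHA-256 each pair, and repeat until one hash remains.
import hashlib

def sha256d(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(tx_hashes: list[bytes]) -> bytes:
    level = tx_hashes[:]
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])            # duplicate the last hash on odd counts
        level = [sha256d(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

txs = [sha256d(f"tx{i}".encode()) for i in range(5)]
print(merkle_root(txs).hex())
```

Changing any single transaction changes the root, so one 32-byte value commits to the entire group of transactions in a block.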

Bitcoin users exist all over the world, and the messages carrying their transactions cannot travel faster than the speed of light, so latency causes nodes to receive them at different times, or out of order.

In any financial system, errors in transaction-logging create disagreements between parties because balances appear incorrect or transactions go missing. If disagreements are constant, the system is not usable. Whether in a paper ledger or a digital database, cheaters or saboteurs who want to erroneously increase their own balance (or simply wreak havoc) need only change the order of transactions (i.e., their timestamps) or delete them outright to cheat other participants.

The practice of "writing" ledger data into a hard-to-alter physical record is at least 30,000 years old, as exemplified by the clay tablets the ancient Sumerians used before the development of paper, and the more recent wooden "tally sticks" (seen below), which remained in official use in the United Kingdom into the 19th century.

Figure 10. Medieval tally sticks, notched and carved to record a debt of 32 head of sheep owed to a local dean in Hampshire, England. (Credit: Wikimedia)

Of course, keeping track of changes is no sweat for a spreadsheet on a single computer. When applications span multiple computers, networks are required to carry messages between them. Multi-computer applications deal with slow connections by using asynchronous algorithms, which are tolerant of dropped, latent, or out-of-order messages and are not driven by a time-based schedule. In an asynchronous system, computers process events in parallel, but without moving forward in lock-step. Instead, messages (often user actions) trigger a change on each machine as it hears about them.

Nakamoto consensus is highly reliable

Bitcoin too is an asynchronous event-driven system. But unlike conventional distributed systems, participants are not permissioned, meaning they have not been authenticated and authorized prior to participating. Yet somehow they all transition the state of their ledger together without a leader or any sort of coordinating mechanism beyond their own self interest. That is, they reach consensus in an emergent way. How can self-interest be used to coordinate a group of disparate, unvetted, and possibly hostile individuals?

One of the many strokes of brilliance in Bitcoin is the use of economic incentives to keep miners producing valid blocks on schedule. Miners earn rewards denominated in the unit of account of the ledger they maintain; that is, in bitcoin. Nakamoto's conjecture was that any desire to corrupt the ledger, which would threaten the coin of the realm, would be outweighed by the desire of those with a vested interest to protect it.

This way, miners in a distributed system like Bitcoin can come to agreement about the order of transactions, even if some of the nodes are slow or maliciously producing invalid blocks. This happens without the restrictive requirements of permissioned consensus.
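A rough sketch of that agreement mechanism, under simplifying assumptions (blocks represented as plain records with a validity flag and a difficulty target): each node independently discards chains containing invalid blocks and adopts the valid chain embodying the most cumulative work, so the same ordering emerges everywhere without a coordinator.

```python
# Sketch of Nakamoto-style fork choice under simplifying assumptions:
# each block is a dict with a difficulty "target" and a "valid" flag.
# Among fully valid candidate chains, a node adopts the one with the
# most cumulative expected work.

def chain_work(chain: list[dict]) -> int:
    # Each block represents roughly 2**256 // (target + 1) expected hash attempts.
    return sum((2 ** 256) // (block["target"] + 1) for block in chain)

def select_chain(candidates: list[list[dict]]) -> list[dict]:
    valid = [c for c in candidates if all(b["valid"] for b in c)]  # invalid blocks are rejected outright
    return max(valid, key=chain_work)

# A longer chain of easy blocks loses to a shorter chain of harder (lower-target) blocks.
easy = [{"target": 2 ** 250, "valid": True}] * 3
hard = [{"target": 2 ** 240, "valid": True}] * 2
assert select_chain([easy, hard]) is hard
```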

The Bitcoin system has shown its resilience in both operational uptime and integrity of the ledger. Importantly, it accomplishes this feat without needing to vet the individual nodes on the network; machines can join or drop off at will, and the properties of the system remain the same.

In this way, we've shown how resilient financial infrastructure can be supported technologically by a disaggregated collection of commodity hardware operators. Similarly, the software development of the Bitcoin network is leaderless. As in most free and open source software projects, work is organized by "open allocation," that is, on a volunteer basis.

In the next section, we'll compare the emergent consensus of the machine network to the emergent consensus of the leaderless group of Bitcoin developers, and discuss how these parallels can hold lessons for corporations and startups looking to move faster.

Emergent consensus-based democracy

Most of the time, an open allocation group like the Bitcoin Core contributor group can operate without a benevolent dictator, using dialogue to reach consensus. An issue is discussed until the parties reach an agreement everyone is willing to accept. Once no dissent remains, the topic of discussion becomes how best to implement the agreed-upon solution.

This form of governance is lightweight, blending the technical discussion itself with the decision-making process. Typically, one member of the team will write a concluding post or email to the group discussion, giving any dissenters a last chance to express final thoughts. Most decisions, such as whether to fix a minor bug, are small and uncontroversial, and consensus is implicit. The use of version-control software means that committed code can easily be rolled back, which gives social consensus a fairly relaxed, low-stakes feel. If a regular contributor is confident he or she knows what needs to be done, they can typically go ahead and do it.

Sometimes, however, consensus is not easily reached, and a vote is required. This means a clear ballot must be presented, laying out a menu of choices for all the project contributors. As in the consensus process, discussion of the ballot options is often enmeshed with the technical discussion. So-called honest brokers emerge who occasionally post summary updates for contributors following the discussion from a distance.

The brokers are sometimes participants in the debate—they need not be above the issue—so long as they accurately represent the views of each constituent group. If they do, they can muster the credibility to call a vote. Typically those who already have "commit access," meaning those who have been given permission to write (or "commit") code to the project repository, are empowered to vote.

By the time a vote is called, there will be little debate about the legitimacy of the options on the ballot; however, obstructionists may try to filibuster. Such people are politely tolerated if their concern seems sincere, but persistently difficult people are typically asked to leave the project. Allowing or banning contributors is also a matter of voting, though this vote is typically conducted privately among existing contributors rather than on a general project mailing list. There are many voting systems, but they are mostly outside the scope of this essay.

Forking the code

A defining feature of free, open source software is its permissive licensing. Anyone is allowed to copy the codebase and take it in a new direction. This is a critical enabler of open allocation, volunteer-based governance. It means a contributor can spend time and energy on a shared codebase, knowing that if the group's priorities diverge from his or her own, they can fork the code and continue in their preferred direction.

In practice, forking has high costs for complex codebases. Few developers are well-rounded enough (or have enough free time) to address every kind of bug and feature a project might contain.

Forkability puts limits on the powers of Benevolent Dictators. Should they take the project in a direction most contributors disagree with, it would be trivial for the majority to copy the codebase and continue on without the BD at all. This creates a strong motivation for the BD to adhere to the consensus of the group and "lead from behind."

Open allocation governance in practice

A useful guide to open allocation governance in a real, successful project can be found in the Stanford Business School case study entitled “Mozilla: Scaling Through a Community of Volunteers.” (One of the authors of the study, Professor Robert Sutton, is a regular critic of the abuses of hierarchical management, not only for its deleterious effects on workers, but also for its effects on managers themselves.)

According to Sutton and his co-authors, about 1,000 volunteers contributed code to Mozilla outside of a salaried job. Another 20,000 contributed to bug-reporting, a key facet of quality control. Work was contributed on a part-time basis, whenever volunteers found time; only 250 contributors were full-time employees of Mozilla. The case study describes how this "chaordic system" works:

“Company management had little leverage over volunteers—they could not be fired, and their efforts could be redirected only if the volunteers wanted to do something different. The overall effort had to have some elements of organization—the basic design direction needed to be established, new modules needed to be consistent with the overall product vision, and decisions had to be made about which code to include in each new release. While community input might be helpful, at the end of the day specific decisions needed to be made. An open source environment could not succeed if it led to anarchy. [Mozilla CEO John Lilly] referred to the environment as a “chaordic system,” combining aspects of both chaos and order. He reflected on issues of leadership, and scaling, in an organization like Mozilla: ‘I think ‘leading a movement’ is a bit of an oxymoron. I think you try to move a movement. You try to get it going in a direction, and you try to make sure it doesn’t go too far off track.’”

The Bitcoin “business model” binds hackers together despite conflict

In many ways, the Bitcoin project is similar to forerunners like Mozilla. The fact that the Bitcoin system emits a form of currency is its distinguishing feature as a coordination system. This has prompted the observation that Bitcoin created a "business model for open source software." This analogy is useful in a broad sense, but the devil is in the details.

Financing—which in most technology startups would pay salaries—is not needed in a system where people want to work for free. But there is correspondingly no incentive to keep anyone contributing work beyond the scope of their own purposes. Free and open source software is easy to fork and modify, and disagreements often prompt contributors to copy the code and go off to create their own version. Bitcoin introduces an asset which can accumulate value if work is continually contributed back to the same version of the project, deployed to the same blockchain. So while the Bitcoin software itself is not a for-profit business—it is freely distributed under the MIT license—the growing value of the bitcoin asset creates an incentive for people to resolve fights and continue to work on the version that's currently running.

This is what is meant by a so-called business model: holding or mining the asset gives technologists an incentive to contribute continual work (and computing power) to the network, increasing its utility and value, and in return the network receives "free labor." As Bitcoin-based financial services approach feature parity with modern banks, and use of the coin expands, its perceived value grows.

Other real-time gross settlement systems, such as the Fedwire system operated by the Federal Reserve, which transacts in Federal Reserve Notes, can be used as a basis for comparison (in terms of overhead costs, security, and flexibility) to the Bitcoin system, which uses bitcoin as the store of value, unit of account, and medium of exchange. Without the prospect of protocol improvements relative to those banking equivalents, there is little reason to expect the price of bitcoin to increase; in turn, a stagnant price reduces the financial incentive for self-interested individuals to keep contributing code and advancing the system.

However, the system must also protect against bad actors, who might try to sabotage the code or carry the project off the rails for some selfish end. Next, we will discuss the challenges of keeping a peer-to-peer network together, and how Bitcoin's design addresses both problems.

How developers organize in the Bitcoin network

We have described how open allocation software development works in detail, but we have not yet delved into the roles in the Bitcoin network. Here we describe how technologists join the network.

There are three groups of technical stakeholders, each with different skill sets and different incentives.

Group A: Miners

The primary role of mining is to ensure that all participants have a consistent view of the Bitcoin ledger. Because there is no central database, the log of all transactions relies on the computational power miners contribute to the network to remain immutable and secure.

Miners operate special computer hardware devoted to a cryptocurrency network, and in turn receive a “reward” in the form of bitcoins. This is how Bitcoin and similar networks emit currency. The process of mining is explained in detail in the following pages, but it suffices to say that the activities of miners require IT skills including system administration and a strong understanding of networking. A background in electrical engineering is helpful if operating a large-scale mine, where the power infrastructure may be sophisticated.

Operating this hardware incurs expenses, first for the hardware itself and then for the electricity it consumes. Thus, miners must be confident that their cryptocurrency rewards will be valuable in the future before they are willing to risk capital to mine them. This confidence is typically rooted in the abilities and ideas of the core developers who build the software protocols the miners follow. As time goes on, however, miners recoup their expenses and make a profit, and may lose interest in a given network.

Group B: Core Developers

Developers join cryptocurrency projects looking for personal satisfaction and skill development in a self-directed setting. If they've bought the coin, a developer may also be profit-motivated, contributing development to make the value of the coin increase. Many developers simply want to contribute to an interesting, useful, and important project alongside great collaborators. To occupy this role, technologists need strong core programming skills. A college CS background is helpful, but plenty of cryptocurrency project contributors are self-taught hackers.

In any case, core developers incur very few monetary costs. Because they are simply donating time, they need only worry about the opportunity cost of their contributions. In short, developers who simply contribute code may be less committed than miners at the outset, but as time goes on they may become increasingly enfranchised in the group dynamic and the technology itself. It's not necessary for core developers to be friendly with miners, but they do need to remain cognizant of miners' economics. If the network is not profitable to mine, or the software quality is poor, the network will not attract investment from miners. Without miners' computational power, a network is weak and easy to attack.

Group C: Full Node Operators

Running a “full node” means keeping a full copy of the blockchain locally on a computer, and running an instance of the Bitcoin daemon. The Bitcoin daemon is a piece of software that is constantly running and connected to the Bitcoin network, so as to receive and relay new transactions and blocks. It’s possible to use the daemon without downloading the whole chain.

For the full node operator, the benefit of running the daemon and dedicating hard drive space to the Bitcoin blockchain is "minimally trusted" transactions; that is, he or she can send and receive bitcoin without needing to trust anyone else's copy of the ledger, which might contain errors or purposeful falsifications.

This might not seem practical for non-technical users, but in actuality the Bitcoin software does the work of rejecting incorrect data. Technical users or developers building Bitcoin-related services can inspect or alter their own copy of the Bitcoin blockchain or software locally to understand how it works.
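For example, a technical user can query their own node directly over Bitcoin Core's JSON-RPC interface rather than trusting a third-party block explorer. The sketch below assumes a locally running bitcoind with RPC enabled; the credentials and port shown are placeholders for whatever is set in the operator's configuration.

```python
# Query a locally running Bitcoin Core full node over its JSON-RPC interface.
# Assumes bitcoind is running with RPC enabled; credentials and port below
# are placeholders for whatever is configured in bitcoin.conf.
import requests

RPC_URL = "http://127.0.0.1:8332"          # default mainnet RPC port
RPC_AUTH = ("rpcuser", "rpcpassword")      # hypothetical credentials

def rpc(method: str, params: list | None = None):
    payload = {"jsonrpc": "1.0", "id": "query", "method": method, "params": params or []}
    response = requests.post(RPC_URL, json=payload, auth=RPC_AUTH, timeout=10)
    response.raise_for_status()
    return response.json()["result"]

print(rpc("getblockcount"))                 # height of the locally validated chain
print(rpc("getblockchaininfo")["chain"])    # e.g. "main"
```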

Other stakeholders benefit from the presence of full nodes in three ways. Full nodes:

  • Validate digital signatures on transactions sent to the network. Thus, they are gatekeepers against fake transactions getting into the blockchain.
  • Validate blocks produced by miners, enforcing rules on miners who (if malicious) may be motivated to collude and change the rules.
  • Relay blocks and transactions to other nodes.

Also worth mentioning are two primary groups of second-degree stakeholders:

  • Third Party Developers: build a cottage industry around the project, or use it as infrastructure for an application or service (e.g., wallet developers, exchange operators, pool operators). These people frequently run full nodes to support services running on thin clients.
  • Wallet Users: end-users who send and receive cryptocurrency transactions. All stakeholders are typically wallet users if they hold the coin. Many wallets are light clients which trust a copy of the ledger stored by the Third Party Developer of the wallet.

The “Delicate balance of terror” when miners rule

In a permissionless cryptocurrency system like Bitcoin, large miners are also potential attackers. Their cooperation with the network is predicated on profitability; should an attack become profitable, it’s likely that a large scale miner will attempt it. Those who follow the recent history of Bitcoin are aware that the topic of miner monopolies is controversial.

Some participants believe ASICs (application-specific integrated circuits built solely for mining) are deleterious to the health of the network in various ways. In the case of hashrate concentration, the community fears miners' collective ability to wage what is known as a 51 percent attack, wherein a miner with the majority of the hashrate can use this computing power to rewrite transactions or double-spend funds. Such attacks are common in smaller networks, where the cost of achieving 51 percent of the hashrate is low.
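Section 11 of the Bitcoin whitepaper quantifies this risk with a gambler's-ruin argument; the sketch below reproduces that calculation, showing how an attacker's probability of rewriting history climbs toward certainty as their share of the hashrate, q, approaches one half.

```python
# Probability that an attacker with a fraction q of the hashrate eventually
# overtakes the honest chain from z blocks behind, following the
# gambler's-ruin analysis in Section 11 of the Bitcoin whitepaper.
from math import exp, factorial

def attacker_success(q: float, z: int) -> float:
    p = 1.0 - q
    if q >= p:
        return 1.0                      # a majority attacker always catches up eventually
    lam = z * (q / p)
    catch_up = 0.0
    for k in range(z + 1):
        poisson = (lam ** k) * exp(-lam) / factorial(k)
        catch_up += poisson * (1.0 - (q / p) ** (z - k))
    return 1.0 - catch_up

for q in (0.1, 0.3, 0.45):
    print(q, round(attacker_success(q, z=6), 6))   # z = 6 confirmations
```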

Any mining pool (or cartel of mining pools) with over 51 percent of the hashrate owns the "nuclear weapon" of the network, effectively holding the community hostage with raw hashrate. This scenario is reminiscent of Cold War-era nuclear strategist Albert Wohlstetter's notion of a delicate balance of terror:

“The balance is not automatic. First, since thermonuclear weapons give an enormous advantage to the aggressor, it takes great ingenuity and realism at any given level of nuclear technology to devise a stable equilibrium. And second, this technology itself is changing with fantastic speed. Deterrence will require an urgent and continuing effort.”

While large miners can theoretically initiate attacks that bend the consensus history to their liking, they also risk tipping off the market to the attack, causing a sudden collapse of the token price. Such a collapse would render the miner's hardware investment worthless, along with any previously-earned coins still held. Where manufacturing is highly concentrated, clandestine 51 percent attacks are easier to achieve.

Figure 17: Miner concentration by pool. (Credit: blockchain.com).

In the past few years, Bitmain has dominated the market in the form of both hashrate concentration and manufacturing concentration. At the time of writing, analysts at Sanford C. Bernstein & Co. estimate that Bitmain controls 85 percent of the market for cryptocurrency-mining chips.

“Tyranny of Structurelessness” when core developers rule

While hostile miners pose a constant threat to permissionless cryptocurrency systems, the dominance of the core software developers can be just as detrimental to the integrity of the system. In a network controlled by a few elite technologists, spurious changes to the code may not be easily detectable by miners and full node operators running the code.

Communities have taken various approaches to counter miners' overwhelming influence. The team at Siacoin decided to manufacture its own ASIC miner upon learning of Bitmain's Sia miner. Communities such as Zcash take a cautiously welcoming attitude toward ASICs. New projects such as Grin designed the hashing algorithm to be RAM (Random Access Memory) intensive so that ASICs are more expensive to manufacture. Some projects, such as Monero, have taken a much harsher stance, changing the hashing algorithm just to render one manufacturer's ASIC machines inoperable. The fundamental divide here is less about "decentralization" and more about which faction controls the means of producing the coinbase rewards valued by the marketplace; it is a fight over control of the "golden goose."

Due to the highly dynamic nature of decentralized networks, acting too swiftly against power concentration around miners can lead to the opposite extreme: power concentration around developer figureheads. Both types of concentration are equally dangerous. The latter extreme leads to a tyranny of structurelessness, wherein the community worships the primary committers in a cult of personality, under the false premise that there is no formal power hierarchy. This term comes from social theorist Jo Freeman, who wrote in 1972:

“As long as the structure of the group is informal, the rules of how decisions are made are known only to a few and awareness of power is limited to those who know the rules. Those who do not know the rules and are not chosen for initiation must remain in confusion, or suffer from paranoid delusions that something is happening of which they are not quite aware.”

A lack of formal structure becomes an invisible barrier for newcomer contributors. In a cryptocurrency context, this means that the open allocation governance system discussed in the last section may go awry, despite the incentive to add more development talent to the team (thus increasing project velocity and the value of the network).

Dominance of either miners or developers may result in changes to the development roadmap which undermine the system. An example is the erroneous narrative perpetuated by "large block" miners. The Bitcoin network eventually split in two on August 1, 2017, as some miners pushed for larger blocks, which would have increased the costs for full node operators, who play a crucial role in enforcing rules on a Proof-of-Work blockchain. Higher costs would mean fewer full node operators on the network, which in turn brings miners one step closer to upsetting the balance of power in their own favor.

Another example of imbalance is the Ethereum Foundation. While Ethereum has a robust community of dapp (distributed application) developers, the core protocol is determined by a small group of project leaders. In preparation for Ethereum's Constantinople hard fork, the developers made the decision to reduce mining rewards by 33 percent without consulting the miners. Over time, alienating miners leads to a loss of support from a major group of stakeholders and creates new incentives for miners to attack the network for profit or revenge.

Market consensus is achieved when humans and machines agree

So far we have discussed human consensus and machine consensus in the Bitcoin protocol. Achievement of these two forms of consensus leads to a third type, which we will call market consensus:

Consensus in the marketplace results from human and machine consensus. (Credit: Narayanan et al., Bitcoin and Cryptocurrency Technologies, p. 169)

The three legs are deeply intertwined, and they require each other for the whole system to work well. Many cryptocurrency projects, including Bitcoin, have suffered from a "delicate balance of terror," a "tyranny of structurelessness," or both at various times in their history; this is one source of the rapidly-changing perceptions of Bitcoin, and of the resulting price volatility. Can these oscillations between terror and tyranny be attenuated?

Attenuating the oscillation between terror and tyranny

Some projects have chosen to reduce the likelihood of a “delicate balance of terror” by resisting the participation of ASIC miners. A common approach is to modify the Proof-of-Work algorithm to require more RAM to compute the block hash; this effectively makes ASIC miners more expensive (and therefore riskier) to manufacture. However, this is a temporary measure, assuming the network grows and survives; as the underlying cryptocurrency becomes more valuable, manufacturers are incentivized to roll out these products, as evidenced in Zcash, Ethereum, and potentially the Grin/Mimblewimble project.
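As a toy illustration of why memory-hardness raises the barrier for specialized hardware, the sketch below uses scrypt (illustrative only, not the actual Proof-of-Work function of any project mentioned here): the cost parameter n multiplies the RAM each hash evaluation requires, and chips that must attach large memories lose much of the usual ASIC cost advantage.

```python
# Toy illustration of a memory-hard hash: scrypt's cost parameter n controls
# how much RAM each evaluation needs (~128 * r * n bytes), which is what makes
# single-chip ASIC implementations more expensive. Illustrative only; this is
# not the actual Proof-of-Work function of the projects discussed above.
import hashlib, os, time

def memory_hard_hash(header: bytes, n: int) -> bytes:
    return hashlib.scrypt(header, salt=b"pow", n=n, r=8, p=1,
                          maxmem=2**26, dklen=32)

header = os.urandom(80)
for n in (2**10, 2**12, 2**14):            # each step quadruples the memory required
    start = time.perf_counter()
    digest = memory_hard_hash(header, n)
    print(n, round(time.perf_counter() - start, 3), digest.hex()[:16])
```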

What does this have to do with corporate organization?

The system described above operates without central coordinators, making it possible to build businesses which themselves do not rely on central coordinators, and which can thus eschew traditional settlement systems to do global business, 24/7, without every employee being under complete managerial oversight.

Before continuing, it may be useful to review some of the secular socio-economic trends that are threatening institutions in the US and abroad. The summary is: technology is enabling the world to move faster, which is prompting companies to experiment with a corresponding reduction in oversight for employees at the margin (and new accompanying organizational designs generally).

  • Buurtzorg (a Dutch home care provider) uses self-managed teams; the result is a 20-30 percent reduction in per-patient costs, saving the Dutch social security system hundreds of millions of euros every year.
  • FAVI (a French copper alloy parts manufacturer) organized itself into a dozen or more "minifactories" that operate independently, reducing costs by an average of 3 percent per year. Teams at FAVI have end-to-end control and responsibility, on-time delivery is a social expectation, and profits are shared across the workforce.

Platform Cooperatives

  • Businesses that bring the structure of traditional cooperatives, including worker ownership and governance, and formalize the interaction of stakeholders via a software platform.
  • Cooperatives in Japan: conceived as a response to a slow-growth economic environment; 35,600 stores, $145 billion in commerce per annum, over 65 million members.
  • Cooperatives globally: 1 billion members, $2.5 trillion in annual commerce.

Responsive or Open Allocation Companies

In an effort to become more like GitHub, which organizes as a network rather than a hierarchy, some corporations have reduced hierarchy and hired more freelancers:

  • Microsoft and Yammer developed the "Responsive Organization" movement to break down hierarchies into SLAM teams (self-selecting, lean, autonomous, and multi-disciplinary).
  • WL Gore has operated on open allocation since its inception in 1957, with no managers.
  • Spotify is so famous for its no-boss organization design that it created a two-hour video series about how Spotify Engineering operates.

But becoming an open allocation network-company would be a big jump for the largest companies, or for any workers trained in the 20th century. Is there a "missing link" organization design?

Introducing Dual-OS, Bitcoin's Match for Business Organization Design

Harvard Business School professor John P. Kotter originated the concept of the "Dual-OS company." It combines a traditional hierarchical organization with a "network" of supertemps, some full-time and some part-time, some paid and some volunteer.

Hierarchies are good for planning, creating budgets, defining roles, HR functions, and measuring results. “What they do not do well,” he wrote in the Harvard Business Review in 2012, “is identify the most important hazards and opportunities early enough, formulate creative strategic initiatives nimbly enough, and implement them fast enough.” The network half of the dual-OS is like an immune system for the traditional corporate hierarchy, constantly surveying the business, its markets, and its competitors to bring new information and practices in.

  • Kotter says about 10 percent of the network should be composed of full-time employees.
  • The rest are carefully selected part-timers and volunteers, who are given highly structured processes.

This suggests how organization design may develop: towards an open allocation "service mesh" economy without distinct employers or entities.

Challenges of enacting an actual Dual-OS global corporation today: there is no financial infrastructure or ERP system that could handle it!

  • Payroll (global, unscheduled, workers dropping on/off, variable compensation…)
  • Staff jump on / off ad hoc, HR nightmare
  • Spending / budgeting permissions; how to give contingent workers access to petty cash?
  • How to write incentive-payment logic for many contract workers all over the world

Cryptocurrency emerges as:

  • Programmable remittance tech
  • Always online, global, secure
  • Bridges big economies (US-CN)
  • Cheap to send and free-floating

What happens when large organizations, fearful of losing relevance, adopt Dual OS and new digital infrastructure such as cryptocurrency? They can reduce full-time headcount, reduce costs, adapt faster, and avoid the innovator's dilemma. In the long term, broader structural changes might be afoot.

“In the absence of transaction costs," Coase wrote in 1987, "it does not matter what the law is, since people can always negotiate without cost to acquire, sub-divide, and combine rights whenever this would increase the value of production. In such a world, the institutions which make up the economic system have neither substance nor purpose.”

The Service Mesh: Vision for a Zero-Transaction-Cost Economy

As transaction costs fall thanks to cryptocurrency infrastructure, organizations will move faster and increasingly operate by "emergent consensus": a digital bazaar of self-selected, skilled individuals working in loosely organized groups, with highly automated processes to prevent total chaos. Here's how it could look:

  • Emergent groups will engage in decision making coordinated by software. Today, code-hosting platforms like GitHub enable collaborative work and limited discussion. More sophisticated decision-making tools like Glassfrog and Parabol, drawn from the highly-regimented Holacracy organization design, add nuance and layers of permissions to automated decision-making infrastructure. Built into on-chain governance systems like Decred's Politeia, these decision-making systems can result in direct pecuniary action once a decision is complete.
  • Economists observe that industrial economic activity is non-zero-sum, and that commercial endeavors are limited by transaction costs. Systems like Bitcoin allow communities to create a workspace with extremely low transaction costs, lowering the threshold for productive collaboration and peer-to-peer commerce.
  • Self-selecting, skill-based, pseudonymous groups are extremely tight-knit, do not share qualities with traditional definitions of "voting blocs," and will not fit into existing political coalitions.
  • Representational decision making (i.e., democracy) is unnecessary in an environment with extremely low transaction costs. Law, corporate entities, and personal property also become moot. (R. Coase)
  • Emergent consensus (i.e., ad hoc decision making and resource allocation) is only possible in environments with low transaction costs.

Eventually, new business models will develop for the Internet, where all the “products” are data. “The law of supply and demand can’t work for a product such as information that can be replicated at almost no cost,” wrote Esther Dyson. Neither can the organizational design built for 20th century commerce.
