Revisiting the arguments against Proof-of-Work

Leo Zhang

Editor's note

By introducing financial incentives into open-allocation software governance, Satoshi Nakamoto managed to solve the impossibility of distributed consensus in the context of a currency system. The key component of the design is Proof-of-Work mining. Recently, some old criticisms of Proof-of-Work have resurfaced in the community, mostly regarding energy consumption and hashpower concentration. Both topics are frequently covered in this newsletter. In the articles below, we discuss why Proof-of-Work is still the most practical consensus mechanism.

The Yin and Yang of Bitcoin
(LaurentMT)

"Once again, the metric suggests that the system has become more efficient over time, but the most interesting insights gathered from this metric are certainly related to the analysis of its multiple oscillations. To my surprise, it remarkably synthesizes multiple past observations made about the interrelation between the market price and hodling & mining behaviors."

PoW is efficient
(Dan Held)

"The average cost per transaction isn’t an adequate metric for measuring the efficiency of Bitcoin’s PoW; it should be defined in terms of the security of an economic history. The energy spend secures the stock of bitcoin, and that percentage is going down over time as inflation decreases. A Bitcoin “accumulates” the energy associated with all the blocks mined since its creation. LaurentMT, a researcher, has found empirically that Bitcoin’s PoW is indeed becoming more efficient over time: increasing cost is counterbalanced by the even greater increasing total value secured by the system."
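
The excerpt contrasts two metrics: cost per transaction versus energy spend as a share of the value it secures. A toy calculation, with every figure hypothetical, shows why the two can diverge:

```python
# Toy comparison of the two metrics; every number here is an assumption
# chosen only to illustrate the shape of the argument.
annual_energy_cost = 4_000_000_000    # $/year spent on mining (assumed)
transactions_per_year = 100_000_000   # on-chain transactions/year (assumed)
total_stock_value = 200_000_000_000   # market value of the bitcoin stock (assumed)

cost_per_tx = annual_energy_cost / transactions_per_year       # $40 per transaction
security_spend_ratio = annual_energy_cost / total_stock_value  # 2% of the stock per year
```

If the value of the stock grows faster than the energy bill, the security-spend ratio falls even while cost per transaction rises, which is the sense in which the system "becomes more efficient."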

Randomness in blockchains
(TokenDaily, by Aparna Krishnan)

"Naturally, any alternative consensus mechanism trying to replace PoW would also need a way to limit the messages passing through the network. The way most Proof of Stake (PoS) protocols approach this problem is by selecting a subset of validators (nodes that maintain/govern the blockchain) to form a subcommittee based on how many coins they have staked. This subset of validators can then reasonably communicate back and forth with each other within the network constraints to achieve agreement in a timely manner."
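
The stake-weighted subcommittee selection the excerpt describes can be sketched roughly as follows. This is a minimal illustration with made-up validator names and stakes; real PoS protocols use verifiable randomness (e.g. VRFs), not an ordinary seeded PRNG:

```python
import random

# Hypothetical validators and coins staked; illustrative numbers only.
stakes = {"alice": 500, "bob": 300, "carol": 150, "dave": 50}

def select_subcommittee(stakes, size, seed=42):
    """Sample a subcommittee of validators, weighting each draw by stake.

    A validator with more coins staked is proportionally more likely to
    be picked on each draw; duplicates are skipped so the committee
    contains unique members.
    """
    rng = random.Random(seed)
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    picked = []
    while len(picked) < size:
        v = rng.choices(validators, weights=weights, k=1)[0]
        if v not in picked:
            picked.append(v)
    return picked

committee = select_subcommittee(stakes, size=2)
```

The point of the small committee is exactly the message-limiting argument above: a handful of validators can exchange votes within network constraints, where all-to-all communication among every node could not.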

This week's review

Tuesday: How to get away with m̶u̶r̶d̶e̶r̶ an ICO
Wednesday: The bell tolls for Mr. ICO Advisor
Thursday: Vote, or your blockchain dies!

Further reading this week

Battle of the privacycoins: Why Monero is hard to beat (and hard to scale)
(Bitcoin Magazine, by Aaron van Wirdum)

"Making matters worse, the Monero blockchain cannot be pruned in its entirety. Where Bitcoin’s full node users can opt to get rid of old transaction data, much of Monero’s transaction history remains relevant and must, therefore, be stored forever. This is currently 20 gigabytes and growing. (The total Monero blockchain is currently 60 gigabytes.)"

Two EOS gambling platforms suffer a $260,000 hack
(Toshi Times, by Dennis Sahlstrom)

"The hacker exploited a flaw in the code and could, therefore, bypass a transfer function, which meant that their funds were not deposited to the smart contract. When they lost, they did not have to pay, but when they won, they won real money which they could cash out. The developers say that the platform is back online and that the rest of the funds are safe."

Critiques of blockchain sharding

Grin is hard forking every six months to keep ASICs off the network
(CoinDesk, by Alyssa Hertig)

"Siacoin lead developer David Vorick told Grin developers he's worried that ASIC manufacturers will be able to easily tweak their hardware with painless software updates in an effort to keep functioning on a network with a changing algorithm."

Lehman bankruptcy 10 year anniversary
(Pierre Rochard)

"Lehman used the reverse logic of the standard by asserting that their right to repurchase does not exist if they did not borrow enough to replace the securities (i.e. if they borrow $100 cash by repoing $105 of securities then the $100 cash is insufficient to replace the securities if the lender defaults), thus the third condition for a sale was met. Had the overcollateralization been 120% then the accounting treatment could have been more ambiguous under this standard."
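
The arithmetic in the excerpt can be made concrete. Under the accounting standard as commonly described, a repo collateralized within roughly 102% left the transferor with effective control, so it stayed on the books as a secured borrowing; Lehman's 105% overcollateralization let it argue the cash received was insufficient to repurchase the securities, so the repo was booked as a sale. The 102% threshold and the classification logic below are a simplification for illustration:

```python
def repo_classification(cash, securities, threshold=1.02):
    """Toy version of the Repo 105 logic described in the excerpt.

    If the securities transferred are worth sufficiently more than the
    cash received (collateralization above ~102%), the cash is deemed
    insufficient to replace the securities, and the transfer is booked
    as a sale rather than a secured borrowing. Simplified illustration,
    not an accounting reference.
    """
    collateralization = securities / cash
    return "sale" if collateralization > threshold else "secured borrowing"

repo_classification(100, 105)  # Lehman's Repo 105: booked as a sale
repo_classification(100, 101)  # ordinary repo: secured borrowing
```

Booking the repo as a sale is what let the borrowed cash temporarily shrink the balance sheet at quarter-end, which is why the 105% figure mattered.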