Blockchain technology isn’t as widely used as it could be, largely because blockchain users don’t trust each other, as research shows. Business leaders and ordinary people are also slow to adopt blockchain-based systems because they fear that future government regulations could force them to make expensive or difficult changes.
Mistrust and regulatory uncertainty are strange problems for blockchain technology to have, though. The first widely adopted blockchain, bitcoin, was expressly created to allow financial transactions “without relying on trust” or on governments overseeing the currency. Users who don’t trust a bank or other intermediary to accurately track transactions can instead rely on unchangeable mathematical algorithms. Further, the system is decentralized, with data stored on thousands of internet-connected computers around the world, which prevents regulators from shutting down the network as a whole.
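To see why recorded data is so hard to rewrite, consider a minimal sketch of the hash-chaining idea (in Python, with illustrative names; real blockchains add consensus rules, digital signatures and much more): each new record commits to a hash of the record before it, so altering any earlier entry breaks every link that follows.

```python
# Minimal sketch of the idea behind "unchangeable" records: each block
# commits to the previous block's hash, so editing old data breaks the chain.
# Illustrative only -- real blockchains add consensus, signatures and more.
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    """Add a new block that points to the hash of the previous one."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def is_valid(chain: list) -> bool:
    """Recompute every link; tampering with earlier data shows up here."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
append_block(chain, "Alice pays Bob 5")
append_block(chain, "Bob pays Carol 2")
print(is_valid(chain))                    # True
chain[0]["data"] = "Alice pays Bob 500"   # try to rewrite history
print(is_valid(chain))                    # False -- later hashes no longer match
```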
As I discuss in my recent book, “The Blockchain and the New Architecture of Trust,” the contradiction between blockchain’s allegedly trust-less technology and its trust-needing users arises from a misunderstanding about human nature. Economists often view trust as a cost, because it takes effort to establish. But people actually want to use systems they can trust. They intuitively understand that cultures and companies with strong trust avoid the hidden costs that stem from everyone constantly trying to both cheat the system and avoid being cheated by others.
Blockchain, as it turns out, doesn’t herald the end of the need for trust. Most people will want laws and regulations to help make blockchain-based systems trustworthy.
Problems arise without trust
Bitcoin’s creator wrote in 2009 that “The root problem with conventional currency is all the trust that’s required to make it work.” With government-issued money, the public must trust central bankers and commercial banks to preserve economic stability and protect users’ privacy. The blockchain framework that bitcoin introduced was supposed to be a “trustless” alternative. Sometimes, though, it shouldn’t be trusted.
In 2016, for instance, someone exploited a flaw in the DAO, a decentralized application using the Ethereum blockchain, to withdraw about US$60 million worth of cryptocurrency. Fortunately, members of the Ethereum community trusted each other enough to adopt a radical solution: They created a new copy of the entire blockchain to reverse the theft. The process was slow and awkward, though, and almost failed.
A new type of investment, called initial coin offerings, further illustrates why blockchain-based activity still requires trust. Since 2017, blockchain-based startups have raised more than $20 billion by selling cryptocurrency tokens to supporters around the world. However, a substantial percentage of those companies were out-and-out frauds. In other cases, investors simply had no idea what they were investing in. The blockchain itself doesn’t provide the kind of disclosure that regulators require for traditional securities.
The flood of initial coin offerings slowed to a trickle in the second half of 2018 as the predictable abuses of a “wild west” environment became clear. As regulators stepped in, the market shifted toward selling digital tokens under the same rules as stocks or other securities, despite the limits those rules impose.
The myth of decentralization
The other reason that regulators have a role to play is security. Blockchain networks themselves are typically very secure, and they eliminate the vulnerability of a single company controlling transactions. However, blockchains identify the owner of an account based on its cryptographic private key, a random-seeming string of numbers and letters. Steal the key, and you’ve got the money. Ten percent of initial coin offering proceeds have already been stolen.
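To make concrete why holding the key is effectively ownership, here is a hedged sketch using the third-party Python `ecdsa` package (the variable names and the simplified “address” are illustrative, not any real network’s format): the account is just a key pair, the address is derived from the public key, and anyone who can sign with the private key can authorize a transfer.

```python
# Sketch of why "steal the key, and you've got the money": on-chain identity
# is just a key pair. Uses the third-party `ecdsa` package for illustration;
# real wallets and chains add address encoding, nonces, fee logic, etc.
import hashlib
from ecdsa import SigningKey, SECP256k1

# The "account" is nothing more than a key pair; the address is derived
# from the public key, and the private key is the only credential.
private_key = SigningKey.generate(curve=SECP256k1)
public_key = private_key.get_verifying_key()
address = hashlib.sha256(public_key.to_string()).hexdigest()[:40]

# Spending is just signing a message with the private key. The network
# checks the signature against the public key; it cannot tell a thief who
# copied the key apart from the legitimate owner.
transfer = f"send 1 coin from {address} to someone-else".encode()
signature = private_key.sign(transfer)
print(public_key.verify(signature, transfer))  # True -- whoever signs, "owns"
```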
Most users acquire their cryptocurrency through an exchange such as Coinbase, which trades it for dollars or other traditional currencies. They also let the exchanges hold their private keys, because that makes transactions easier and more efficient. However, it also creates a point of vulnerability: If the exchange’s records are breached, the private keys aren’t secret anymore.
Some users hold their own keys, and new exchanges are being developed that don’t require users to give them up. These will never be as convenient, though, because the burden of managing keys and keeping them safe falls on users. Regulation will be needed to protect consumers.
Government authorities will also have a role in restricting money laundering, terrorist financing and other criminal uses of cryptocurrencies. The more decentralized a system is, the harder it will be to identify a responsible party to police illicit conduct. Some users may not care, or may see that as a necessary cost of freedom. But networks optimized for criminals won’t ever achieve mainstream success among law-abiding citizens. Ordinary users will be scared off, regulated banks and financial services firms will be prohibited from interacting with them, and law enforcement will find ways to disrupt their activities.
Regulators around the world are working to balance the flexibility to transact in new ways through cryptocurrencies with appropriate safeguards. They aren’t all taking the same route, but that’s good. When the state of New York adopted rigid registration requirements called the BitLicense that few companies could meet, other jurisdictions saw the implementation problems and took different paths. Wyoming, for example, adopted a series of bills that clarify the legal status of cryptocurrencies while imposing reasonable protections. New York is now reevaluating the BitLicense, to avoid losing business activity.
If people trust blockchain systems, they’ll use them. That’s the only way the technology will see mass-market adoption. The jurisdictions with the best regulation – not the ones with the least – will attract activity. Like any technological system, blockchains combine software code and human activity. It’s not enough to trust the computers – which, after all, are built and programmed by people. For the technology to be used widely and wisely, there must be mechanisms to hold the humans accountable, too.
By Kevin Werbach, Associate Professor of Legal Studies and Business Ethics at the Wharton School, University of Pennsylvania
This article is republished from The Conversation under a Creative Commons license. Read the original article.