The notion of Artificial Intelligence taking over humankind is far from new. But what if blockchain is also not as benevolent as it seems? Adam Kolber, Professor of Law at Brooklyn Law School, thinks we should consider the downsides of the technology, arguing that it bears significant “artificial responsibility.”
In his paper “How Blockchains Increase Artificial Responsibility,” he argues that artificial intelligence has its advantages, but it also poses threats. For example, what happens if machines fail to recognize that their actions are detrimental to humans? “A suitably intelligent AI that seeks to maximize the number of paper clips might, as Nick Bostrom has suggested, enslave humanity if doing so will best achieve its cold, calculated objective,” writes Kolber.
Thus, he proposes the term “artificial responsibility,” which he defines as “the ability of machines to control important matters with limited opportunities for humans to veto decisions or revoke control.” In light of this, he turns to blockchain, stating that while the technology could be considered “unintelligent,” as it cannot recognize our voices, for example, or pass a Turing Test, it nevertheless bears a great deal of artificial responsibility.
For instance, blockchain can settle BTC transactions far faster than bank operations, which can take a significant amount of time. “Unintelligent as it may be, bitcoin still has substantial artificial responsibility because the network accomplishes the important task of transacting billions of dollars in value through a network spread across the globe with no person, bank, or government in charge of it,” Kolber says.
On top of that, blockchain allows the creation of ‘smart contracts,’ which can be combined and turned into a ‘decentralized autonomous organization’ (‘DAO’). The first such project, launched in 2016, has become a cautionary tale.
“A bug in its smart contract code was exploited to drain more than $50 million in value. And here we can see our willingness to endow blockchains with artificial responsibility: despite the loss of funds, there was no easy mechanism and certainly no central authority that could recover the money,” he emphasizes.
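The 2016 DAO exploit is widely understood to have been a reentrancy flaw: the contract sent funds out before updating its internal balance, so a malicious caller could re-enter the withdrawal function and drain it repeatedly. The sketch below is a hypothetical Python analogy of that logic flaw, not the actual Solidity code; the class and method names are invented for illustration.

```python
# Toy simulation of a reentrancy-style bug of the kind that drained The DAO.
# Hypothetical names; real smart contracts run on Ethereum in Solidity.

class VulnerableVault:
    """Pays out *before* updating the balance -- the classic mistake."""
    def __init__(self, deposits):
        self.balances = dict(deposits)

    def withdraw(self, caller, amount):
        if self.balances.get(caller.name, 0) >= amount:
            caller.receive(self, amount)          # external call first...
            self.balances[caller.name] -= amount  # ...state update second

class Attacker:
    """Re-enters withdraw() from the payment callback to drain the vault."""
    def __init__(self, name, depth):
        self.name, self.depth, self.stolen = name, depth, 0

    def receive(self, vault, amount):
        self.stolen += amount
        if self.depth > 0:          # balance not yet reduced: re-enter
            self.depth -= 1
            vault.withdraw(self, amount)

vault = VulnerableVault({"attacker": 10})
thief = Attacker("attacker", depth=4)
vault.withdraw(thief, 10)
print(thief.stolen)  # 50: five payouts of 10 against a 10-unit deposit
```

The standard fix, often called the checks-effects-interactions pattern, is simply to update the balance before making the external call, so any re-entrant call sees the reduced balance and fails the check.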
Thus, the damages in that case were not fully mitigated. And Kolber warns that in the future these types of contracts could become even more dangerous: “guests at a DAO hotel might be locked out of their rooms; DAO self-driving cars might drive off bridges.”
He concludes that blockchain is a promising technology, yet it should be treated cautiously. “We should be thoughtful about how we endow machines with artificial responsibility, even when (and perhaps especially when) these machines are not very intelligent,” Kolber points out.
This is a notion worth keeping in mind as blockchain technology becomes ubiquitous: every week more governments announce blockchain adoption, while China is testing a blockchain trading platform.