by Landon Manning, Aug 24, 2018

Blockchain Experts Discuss Scalability Issues at Distributed 2018

Scalability Problems Panel

Managing the trade-off between decentralization and throughput is a core challenge for smart contract systems. The trick is to maintain a network without a central processing hub while simultaneously processing a large volume of transactions without significant overhead costs. The scalability problem that the majority of decentralized projects face is that it is exceedingly difficult to accomplish both at once.

Befitting a problem of this significance to the blockchain space, the recent Distributed 2018 conference in San Francisco included a panel of various industry experts, moderated by BTC Inc’s Tyler Evans, to discuss attempts at solving this puzzle.

Jordan Earls, the lead developer at Qtum, discussed how existing technology compatible with the smart contract ecosystem could be applied to the scalability issue. Chris McCoy, the creator of Storecoin, compared the inefficiency of smart contracts to the early days of oil extraction, when only the most easily isolated kerosene was actually used as fuel. McCoy’s approach to the problem is “to take that waste — orphan blocks, chain forks — and actually use that to create faster throughput,” avoiding sacrifices to the integrity of decentralization as much as possible. Zaki Manian, the executive director of the Trusted IoT Alliance, drew on his experience to acknowledge the significant bottlenecks posed by the scalability problem, but affirmed his belief that technological advancements could alleviate it within approximately five years.

As expected, the panel was a high-concept one, with each specialist going into great detail on the state of the industry. The full video provides significant insight into the scalability problem, and other panels on assorted distributed ledger topics can be found on Distributed’s YouTube channel.