Blockchain Must Overcome Similar Challenges as Hadoop

For Blockchain to be considered a global, enterprise-level database (ledger), it must scale at the transaction level in real time, ensure security through tokens (incremental keys) that guarantee authenticity, and remain transparent.

Hadoop tried to support real-time transactions to mimic traditional databases, yet MapReduce limited its ability.  It wasn't until MapReduce was pushed up a level, to become just another tool in the toolbox, that we began to see improvements in query speed.  I'm not sure whether inserting new records into Hive databases ever matched standard OLTP databases, although I have not been keeping up to date on this.

So for Blockchain to scale enterprise-wide, it will need to overcome the challenges that Hadoop faced.  Hadoop was typically contained within an org or in the Cloud, whereas Blockchain is scattered across the globe, so distances are potentially greater.  And I imagine once a record is placed on top of the stack, the other nodes must be notified and agree on the contract to know it's legit.
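The append-and-verify idea above can be sketched as a hash-linked chain. This is a minimal illustration in Python (the `Chain` class, block fields, and transactions are all made up for the example, not any real blockchain's format); it shows why any node, once notified of a new record on top of the stack, can independently re-verify the whole chain:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents deterministically.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class Chain:
    def __init__(self):
        # A genesis block anchors the chain.
        self.blocks = [{"index": 0, "data": "genesis", "prev_hash": "0" * 64}]

    def append(self, data):
        # Each new block records the hash of the block beneath it.
        prev = self.blocks[-1]
        block = {"index": prev["index"] + 1,
                 "data": data,
                 "prev_hash": block_hash(prev)}
        self.blocks.append(block)
        return block

    def is_valid(self):
        # Any node can re-verify independently: every block must
        # reference the hash of its predecessor, so tampering with
        # one record breaks every link above it.
        return all(cur["prev_hash"] == block_hash(prev)
                   for prev, cur in zip(self.blocks, self.blocks[1:]))

chain = Chain()
chain.append({"tx": "Alice pays Bob 5"})
chain.append({"tx": "Bob pays Carol 2"})
print(chain.is_valid())   # True
chain.blocks[1]["data"] = {"tx": "tampered"}
print(chain.is_valid())   # False
```

The verification step is cheap, but every node still has to receive the new block before it can run it, which is exactly where the distance and bandwidth concerns come in.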

Also, the bandwidth must be able to handle thousands of transactions per second to mimic OLTP databases, which manage concurrent insertions via locks and such.
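The lock-based insertion behavior of OLTP databases mentioned above can be sketched with a simple in-memory table (the names here are hypothetical, for illustration only): a single lock serializes concurrent writers, the way a database protects a row or page during a write, so no insert is lost:

```python
import threading

# Hypothetical in-memory "table"; a lock serializes concurrent
# insertions, analogous to an OLTP database's write locks.
table = []
table_lock = threading.Lock()

def insert(record):
    # Only one thread may append at a time.
    with table_lock:
        table.append(record)

# Simulate many concurrent clients inserting at once.
threads = [threading.Thread(target=insert, args=(i,)) for i in range(1000)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(table))  # 1000 -- no inserts lost despite concurrency
```

The trade-off is the same one Blockchain faces at larger scale: serializing writes guarantees correctness, but it caps throughput, and coordinating that serialization across globally distributed nodes is far harder than within one database server.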

So Blockchain must handle increased volumes, across great distances, negotiate valid contracts, and update across the chain, potentially in real time.  And since these contracts could be used for stock trades, currency exchanges, and voting polls, it will need to be 100% accurate, secure, and transparent.

A tough order to fill.  Let's watch as time progresses and see how things pan out.