The latest rage is all about Big Data, and what do many people think of first? Hadoop. Hadoop is basically a distributed file system spread over multiple servers, either on-premises or in the Cloud, with MapReduce jobs (typically, though not always, written in Java) to query your data sets. In short, you can query a bunch of files stored in the file system.
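To make the MapReduce idea concrete, here is a minimal, in-memory sketch of a word-count job in Python. This is not real Hadoop; the function names and the toy documents are made up for illustration. The map phase emits (key, value) pairs per document, the framework groups pairs by key, and the reduce phase aggregates each group:

```python
from collections import defaultdict

def map_phase(doc_id, text):
    """Map step: emit (word, 1) for every word in one document."""
    for word in text.lower().split():
        yield word, 1

def reduce_phase(word, counts):
    """Reduce step: sum the counts emitted for one word."""
    return word, sum(counts)

def run_job(documents):
    """Run map, shuffle/group, and reduce over a dict of {doc_id: text}."""
    grouped = defaultdict(list)  # the shuffle/group step
    for doc_id, text in documents.items():
        for word, count in map_phase(doc_id, text):
            grouped[word].append(count)
    return dict(reduce_phase(w, c) for w, c in grouped.items())

docs = {"a.txt": "big data big ideas", "b.txt": "big network"}
print(run_job(docs))  # {'big': 3, 'data': 1, 'ideas': 1, 'network': 1}
```

In real Hadoop the documents live in HDFS blocks across many machines and the map tasks run where the data sits; this sketch just shows the programming model.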
What if we took that concept to another level? What if the entire network at work were the file system, and you could query every document on it, live, in real time? Wouldn't that be powerful?
And if you wanted, you could expose certain folders to the outside world for consumption over the internet. Of course, you would have to apply a valid namespace for identification, and perhaps some security, but then you could offer this up as a service: a Big Data Network Distributed File System.
An example would be exposing Census data for the federal government, or just about any data set you could think of, without spending a gazillion dollars on traditional databases. You would simply add the files you want to expose to your network or Cloud, apply your namespace, and open it up to the outside world, similar to a Web Service; in this case, a Public Big Data Service.
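What "apply your namespace and open it up" might look like can be sketched in a few lines. This is a toy model, not a real product: the class, the `gov.census` namespace, and the file paths are all hypothetical. The point is simply that internal files get published under public, namespaced identifiers, and anything not explicitly exposed stays invisible to outside callers:

```python
class PublicNamespace:
    """Toy sketch: publish internal files under a public namespace."""

    def __init__(self, root):
        self.root = root      # e.g. "gov.census" (hypothetical)
        self._exposed = {}    # public path -> internal file location

    def expose(self, public_path, internal_path):
        """Publish one internal file under the public namespace."""
        self._exposed[f"{self.root}/{public_path}"] = internal_path

    def resolve(self, full_path):
        """Return the internal location, or None if it was never exposed."""
        return self._exposed.get(full_path)

ns = PublicNamespace("gov.census")
ns.expose("2020/population.csv", r"\\fileserver\census\pop2020.csv")
print(ns.resolve("gov.census/2020/population.csv"))   # the internal path
print(ns.resolve("gov.census/private/salaries.csv"))  # None: not exposed
```

A real Public Big Data Service would of course add authentication, access control, and a query engine on top, but the namespace-as-front-door idea is the core of it.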
An entire new industry, built on current technology and methodology. Could this concept have legs?