The mainframe took up entire floors. Bugs were actual moths. Code was input on reams of punch cards. Memory was scarce, so code had to be tight. These were the days before assembly language.
Think of the minute details involved at such a low level. Perhaps bundle some logic, create a reusable function. Perhaps allow parameters. With 2KB of memory, you need to control your code to the nth degree.
Eventually, higher-level languages appeared, and they have continued to progress, to the point where some tasks involve little to no coding at all.
You can almost see the same pattern emerge in the machine learning space. At first, colleges had access to large data sets and could attempt recognition systems on university time and budget, perhaps in an effort to earn a PhD. Big government agencies had access to powerful machines for such use. And large corporations like IBM had the financial backing, resources, and expertise to take on complex data projects.
Eventually, machine learning got a rebirth. Software became accessible, hardware became affordable, memory became plentiful, distributed servers came to fruition, and Cloud resources made it possible to spin up some heavy deep learning processing power. Software, hardware, memory, and large data sets all became available at once, fueling a surge in machine learning.
You can cluster, classify, detect anomalies, and build recommendation engines from your living room couch. Many websites and applications have already bundled some level of machine learning models into everyday products.
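To give a flavor of how little code "clustering from the couch" can take: below is a minimal, purely illustrative k-means sketch in plain Python, no special hardware required. (Real projects would reach for a library like scikit-learn; the function and data here are made up for the example.)

```python
def kmeans(points, k, iters=20):
    """Tiny k-means on 2-D points: assign each point to its nearest
    centroid, then recompute each centroid as its cluster's mean.
    Initializes with the first k points for simplicity."""
    centroids = points[:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # index of the nearest centroid (squared Euclidean distance)
            i = min(range(k),
                    key=lambda j: (p[0] - centroids[j][0]) ** 2
                                + (p[1] - centroids[j][1]) ** 2)
            clusters[i].append(p)
        # move each centroid to the mean of its cluster (keep it if empty)
        centroids = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Two obvious groups of points, one near (0, 0) and one near (5, 5)
data = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
        (5.0, 5.1), (5.2, 5.0), (5.1, 5.2)]
centroids, clusters = kmeans(data, k=2)
```

Twenty-odd lines, and the algorithm separates the two groups, landing centroids near (0.1, 0.1) and (5.1, 5.1). That is the shift: what once needed a university mainframe now runs instantly on a laptop.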
Large corporations are working to teach machines how to learn on their own, through advanced unsupervised learning, such that the machine can teach itself to play at expert level in video games, chess, Go, and more.
Artificial Intelligence has financial backing across public and private sectors, with drones that can fly unassisted, cars that can maneuver on highways, and robots that can perform tasks in factories. IBM Watson was fed tons of medical information, and it can help medical practitioners diagnose and treat patients, since the machine can search very quickly for patterns and past case studies across multiple domains.
The pattern of mainframes, with low-level code evolving into sub-routines and functions, into higher-level languages, into little to no coding, is repeating with Machine Learning, Deep Learning, and Artificial Intelligence. We have already put the resources, software, and technology into the hands of laymen. Next, we'll see higher-level tools that streamline data cleaning, data ingestion, and data modeling, reduce model training time, and enable cross-collaboration of domains in real time. It's going to be easier, cheaper, and faster to get Machine Learning and AI into production with little to no coding, with more automation and reusable code.
That's how technology works. We use what we have today; tomorrow it becomes cheaper and easier to use. From mainframes to AI. We already see Cloud-based Machine Learning in AzureML: drag-and-drop components, easy to bundle and deploy to production, with easy access via exposed web-service endpoints, which can be called from a variety of applications and languages in real time.
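Calling such an exposed endpoint is itself just a few lines. Here is a hedged sketch of the general shape many cloud scoring services (AzureML web services among them) expect: a JSON POST with an API key. The URL, key, and feature values are placeholders, not a real service.

```python
import json
import urllib.request

# Placeholders -- substitute the endpoint URL and key your own
# deployed service exposes (these are hypothetical values).
ENDPOINT = "https://example.com/score"
API_KEY = "your-api-key-here"

def build_scoring_request(features):
    """Package one feature row as a JSON POST request with a
    Bearer-token authorization header."""
    body = json.dumps({"data": [features]}).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + API_KEY,
        },
    )

req = build_scoring_request([5.1, 3.5, 1.4, 0.2])
# response = urllib.request.urlopen(req)  # uncomment against a live service
```

Any language that can make an HTTP request can consume the model the same way, which is exactly what makes these deployed models callable "from a variety of applications in real time."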
My early computer days were spent sitting at the family computer, an IBM PC in the early 1980s with PC DOS and BBSs, then the VAX in college, then a 286 with AOL, then OS/2 and Windows 3.1, then the Internet, then C++ / Visual Basic 4 / Oracle / Crystal Reports, eventually arriving at today, where technology has advanced beyond comprehension and fragmented into thousands of tiny slices of vendors, products, and offerings in an interconnected world that includes Machine Learning and Artificial Intelligence.
Pretty soon AI will become a commodity, integrated into small chips that reside in Internet of Things (IoT) devices across just about every industry, public and private, and coding will be at a higher level with less thinking required. Perhaps AI devices will create new AI models on the fly, in real time, as needed, then archive them for later use and reuse across clusters of AI devices. Perhaps AI will monitor security across the globe and audit BlockChain transactions in real time for fraud detection. One thing is certain: technology will continue its march forward to simplify the complex and infiltrate the core of society to a high degree. And for now, while humans still hold the keys to progress, let's try not to make enemies of technology, for one day in the future, we may be answering to it.
Thanks for reading~!