The great Industrial Revolution. It was sold as the best thing since sliced bread. Don't wait six months for your rocking chair to be built; with mass production, you can have it in under a week. How, you ask? Assembly lines. Oh, that sounds great, I'll place an order for a new rocking chair. Sign me up! And so we sent quality to the graveyard in exchange for faster goods at lower costs. Fast forward a hundred years, where are we? We have overproduction of useless goods that don't last and aren't that inexpensive.
First, basic economics says an increased supply of goods should reduce the price. That is not what we see in the marketplace. There are several stores with the exact same business model, the exact same products, and the exact same prices. Prices do not drop based on increased supply.
Second, the quality of products has decreased to such an extent that sometimes I could walk out of the store and deposit the item directly into the rubbish can right outside. Why wait to get home and use it once before it breaks? Quality, shmality. I'm not buying it. The materials aren't meant to last more than a few uses.
Third, have you ever walked down the aisles of a mega warehouse store? Ask yourself, how many of these products existed in my grandparents' day? I'd say less than a quarter. We've invented new products that haven't existed for very long, specialized to a specific niche, offered in 28 different scents or varieties. In our grandparents' day, they didn't have teeth whitener or mildew remover. They used basic products that actually worked, without inflated prices.
Fourth, the packaging has gotten so creative. When I say creative, I mean deceiving. Product sizes are slowly shrinking while prices are slowly increasing. Pay more for less, on products you don't need, which won't last. Why not put it on credit? You can pay it off over the next 7 years. We've been duped.
The Industrial Revolution was supposed to get us the same products faster and at lower cost. What we actually got was junk products we don't need, which don't last, at inflated prices. Luckily, nobody has much money left to spend on goods. Enter Robots and Artificial Intelligence. They'll do nicely as a low-cost substitute for the tapped-out humans. The new acronym: B2R, or Business to Robot.
I wrote my first Crystal Report probably in 1996, just as version 5 was released. And Crystal Info too; that original code lives on in today's Business Objects. The thing about it: reports were based on data, and we used SQL, or Structured Query Language, to get the data out of the database into a report. Sure, they had a wizard, but after 5 minutes you had to write custom SQL. And for a very long time SQL didn't change much. They've added new things like JSON and XML support, and some other goodies over time, but it hasn't changed drastically. Now we have Self Service Business Intelligence. Put the power of the data in the hands of the people. It really is a revolution. It is powerful. And easy to use. You can build a dashboard in about 5 minutes. Just connect to the data source, add a few fields, colors, labels, publish, out the door. Bam! So why do we need report writers, or ETL (Extract, Transform & Load) developers, or Data Warehouse developers? Well, Self Service BI does not include a button you push to clean the data. The data is still dirty. That extra space at the end of row 3854, or five different ways to store a name: a Firstname field plus a Lastname field, or Firstname + " " + Lastname, or Lastname, Firstname Middle Initial, or just Lastname. Or duplicate data. Or missing data. Or manually extracting data feeds from other reports. Or data only being correct for the day the report ran. Or business rules that changed in May, so how do we report both the old metrics and the new metrics in year-end reporting? And on and on. Self Service will get you brownie points for fast, visually appealing dashboards; just keep in mind, there's no easy solution for data quality issues. If we set that to the side, Self Service BI could very well displace traditional report writers. Because the end users know the data and the business rules. And they don't need to be trained in coding SQL.
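To make the dirty-data problem concrete, here's a toy sketch in Python of the cleanup work that Self Service BI leaves out: trimming stray spaces, normalizing the several-ways-to-store-a-name problem, and dropping duplicates. The field names and sample rows are made up for illustration; this is not any vendor's tool.

```python
def normalize_name(first, last):
    """Collapse the name variants into one canonical 'First Last' form."""
    first = (first or "").strip()
    last = (last or "").strip()
    if not first and "," in last:          # handle the "Lastname, Firstname" variant
        last, first = [p.strip() for p in last.split(",", 1)]
    return f"{first} {last}".strip()

def clean(rows):
    """rows: list of (first, last) tuples, possibly dirty or duplicated."""
    seen, out = set(), []
    for first, last in rows:
        name = normalize_name(first, last)
        if name and name not in seen:      # drop blanks and duplicate rows
            seen.add(name)
            out.append(name)
    return out

# Trailing space, a duplicate, and the comma form all resolve to one record.
dirty = [("Jane ", "Doe"), ("Jane", "Doe"), ("", "Doe, Jane"), ("John", " Smith ")]
print(clean(dirty))   # ['Jane Doe', 'John Smith']
```

Multiply those few lines by every field in every feed and you have the ETL job that the 5-minute dashboard quietly depends on.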
Report writers, for the most part, just know the data. A slight generalization, but true; they may not be able to decipher the meaning behind the numbers. If I were strictly a report writer, with no other value added, I'd be slightly worried. Self Service BI is taking over.
We all start out with a clean slate, a tabula rasa. At some point we recognize people and learn their names. Mama. Papa. And so the journey begins.
Soon we are taught the alphabet: the letters, their pronunciation, how to draw them in cursive, and how they combine to make words. Those words have meaning and can be combined to form sentences. Which are combined to make paragraphs. Which are combined to make stories. That opens up a whole new world.
Soon we are given assignments designed to help us learn, and then we are tested and receive a letter grade as validation. High marks bring positive reinforcement; low marks bring negative feedback. After much training, we become experts, and we transition from learning mode to production mode, taking a job to earn a living, where our training is put to good use.
So too with computer algorithms. We create a system by which a machine is capable of receiving input and learning the patterns through positive reinforcement and tuning. After reaching the desired level, where it can predict with a high degree of statistical accuracy, the system has graduated from its learning stage. We then send real data through as input to the trained model, and the algorithms can produce some great results.
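That learn-then-produce cycle can be sketched with a toy perceptron in Python. Everything here is illustrative: the "lessons" are the logical AND pattern, and the learning rate and epoch count are arbitrary choices, not a recipe from any particular framework.

```python
def train(samples, epochs=20, lr=0.1):
    """Learning stage: nudge the weights toward correct answers.

    A wrong prediction produces an error signal (the negative feedback),
    and each weight is adjusted a little in the direction that would
    have made the answer right (the positive reinforcement).
    """
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Schooling: labeled examples of the logical AND pattern.
lessons = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(lessons)

# Production: fresh input through the trained model.
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in lessons])   # [0, 0, 0, 1]
```

The two phases mirror the school analogy: the loop in `train` is the years of assignments and feedback, and `predict` is the trained graduate on the job.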
DeepMind, an Artificial Intelligence division at Google, was able to produce an environment in which the machine taught itself Atari games without any prior knowledge or expected results. Soon it mastered the games at expert level, better than almost any human. Later, it taught itself the game Go, after being fed millions of positions scraped from an online site. The machine went on to beat one of the best players of the past decade, surprising many.
So as we can see, machines are capable of learning and performing great feats. Algorithms learn from input data over time, then duplicate the behavior with a high degree of accuracy; not just facts, but strategy. Although some may say this is the brute-force way of training, it seems to work.
What's next? Well, we need more data. More input. More algorithms. More training. More developers. There are now tools available to train on and monitor electronic activity via websites, for instance, gaming sites. You point your AI algorithm at the site, and it learns in the background.
As more of the world goes online, with smart phones, social media, payment transactions, etc., the amount of data and events is growing exponentially. Not only that, those events are tied to people, places and things. The AI is learning the patterns from multiple sources continuously.
One way to train a model is through a technique called Supervised Learning, in which the label for each piece of data fed into the model is known in advance. If the data is composed of images, perhaps the label says dog or cat or bird or mountain.
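A minimal Supervised Learning sketch: every training example carries a label known in advance. Here a 1-nearest-neighbor rule classifies invented (weight, height) measurements as "cat" or "dog"; the numbers are made up for illustration, not real data.

```python
# Labeled training examples: ((weight_kg, height_cm), label), labels known up front.
labeled = [((4.0, 25.0), "cat"), ((5.0, 28.0), "cat"),
           ((30.0, 60.0), "dog"), ((25.0, 55.0), "dog")]

def classify(point):
    """Return the label of the closest labeled example (1-nearest neighbor)."""
    dist = lambda a, b: (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(labeled, key=lambda ex: dist(ex[0], point))[1]

print(classify((4.5, 26.0)))   # lands near the cat examples -> "cat"
print(classify((28.0, 58.0)))  # lands near the dog examples -> "dog"
```

The point is not the algorithm but the setup: the right answers are supplied up front, and the model's only job is to generalize from them.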
There is another technique used to train models, known as Unsupervised Learning. The input data does not have any labels or identifying information, so the machine has to learn without assistance. There are techniques to find the patterns, but it's more complicated than Supervised Learning.
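To contrast with the labeled case, here is a tiny unsupervised sketch: the same kind of points, but no labels at all. A hand-rolled k-means run discovers the two groups on its own. The data and the starting centers are invented for illustration.

```python
points = [(1.0, 1.0), (1.5, 2.0), (1.2, 0.8),
          (8.0, 8.0), (8.5, 7.5), (7.8, 8.2)]

def kmeans(points, centers, steps=10):
    """Alternate two moves: assign each point to its nearest center,
    then move each center to the mean of its assigned points."""
    clusters = []
    for _ in range(steps):
        clusters = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)),
                    key=lambda i: (p[0] - centers[i][0]) ** 2
                                + (p[1] - centers[i][1]) ** 2)
            clusters[i].append(p)
        centers = [(sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
                   if c else centers[i]
                   for i, c in enumerate(clusters)]
    return clusters

groups = kmeans(points, centers=[(0.0, 0.0), (10.0, 10.0)])
print([len(g) for g in groups])   # two clusters of three points each
```

Nobody told the machine which points belong together; the grouping fell out of the data itself. That is the harder, label-free setting the paragraph above describes.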
As the data grows and becomes more available, the AI machines have more input to crunch. Like completing a jigsaw puzzle, each single piece doesn't add much value, but as you put more pieces together, a picture starts to take shape. AI will eventually put the pieces together across multiple domains, assembling the bits and pieces into one giant puzzle of its own.
Watching patterns over time is what AI systems do. Patterns can be interpreted. AI systems can detect patterns in the data, similar to fraud detection systems looking for anomalies.
They used to say any device hooked up to the internet can be compromised. How about AI systems? Anything electronic, from Internet of Things devices to cameras, social media, smart phones, payment transactions, banking accounts, location data, etc., could potentially be ingested, processed and interpreted to detect patterns over time.
Perhaps we have moved past the basic learning of the letters of the alphabet, to forming words and sentences and paragraphs. And we are about to graduate from our 12 years of learning into the real world, where our knowledge is put to the test. Artificial Intelligence is graduating from school and entering the real world. Hang on to your hats. AI is ready for takeoff.