Executive Viewpoint 2017 Prediction: Denodo – Predictive Analytics: 2017’s Hottest Data Trend
We often joke about the inaccuracy of weather forecasts, but in reality they have become increasingly accurate over the past decade, allowing us to take preventive measures when necessary, such as evacuating ahead of oncoming hurricanes. Forecasting weather, however, is just one application of predictive analytics. The automotive industry is using it to develop autonomous cars; the retail industry is using it to roll out artificial intelligence (AI)-enhanced offerings like Amazon Go; the healthcare industry is using it to predict disease outbreaks earlier; and the insurance industry is using it to support the advent of cyber insurance protection.
It’s clear that predictive analytics will likely be the hottest data trend of 2017: companies across several industries are leveraging it heavily to create innovative products, differentiate themselves from the competition, and increase revenue by harnessing the vast amounts of new data flowing from business operations.
The Journey Toward Predictive Analytics
Initially, each of these industries was driven toward predictive analytics by its own set of challenges. With the number of cars on the road increasing every day, the automotive industry was facing cars breaking down with greater frequency. The retail industry could no longer effectively predict buying behavior, as coupons and in-store sales were no longer adequate to attract the millennial generation into department store giants like Macy’s; the inability to predict this shift has contributed to mass layoffs at these same stores. The healthcare industry was facing a more serious challenge, as delayed diagnoses were threatening people’s lives. The insurance industry was facing the ever-challenging issue of cyber security, where the failure to address potential vulnerabilities, and in some cases to predict where they might occur, has led to hacking and identity theft. As a result, many companies across these industries have decided to put an end to these setbacks by creating platforms that leverage large volumes of data to enable predictive analytics.
The Power of Big Data Fabric
To create these platforms, companies need to harvest vast amounts of operational data from various sources, whether point-of-sale (POS) or enterprise resource planning (ERP) systems, cloud-based services, or others. Because of this, organizations are looking toward Big Data fabric, a system that enables global access to all data assets and leverages the storage and processing power of multiple heterogeneous source systems. Within this solution are a number of advanced technology components that collectively enable predictive analytics, including data ingestion, data preparation and, most importantly, data virtualization.
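The ingestion and preparation stages can be pictured with a minimal sketch: records arriving from two hypothetical feeds (a POS system and an ERP system, each with its own field names) are normalized into one common schema before any analysis. The feed formats and field names here are illustrative, not any vendor's actual schema.

```python
# Sketch of data ingestion + preparation: normalize heterogeneous
# source records into the fabric's common schema. All field names
# and sample data are hypothetical.

def prepare(record, mapping):
    """Rename source-specific fields to the common schema."""
    return {common: record[src] for common, src in mapping.items() if src in record}

pos_feed = [{"sku": "123", "amt": 19.99}]              # point-of-sale format
erp_feed = [{"item_code": "123", "unit_price": 18.5}]  # ERP format

# Each source gets its own mapping into the shared schema.
common = (
    [prepare(r, {"product_id": "sku", "price": "amt"}) for r in pos_feed]
    + [prepare(r, {"product_id": "item_code", "price": "unit_price"}) for r in erp_feed]
)
print(common)  # both records now share the same field names
```

In a real fabric the mappings would be declarative metadata rather than inline dictionaries, but the principle is the same: consumers see one schema regardless of where a record originated.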
Let’s look at one way in which Big Data fabric can integrate and make sense of the data as it comes in from disparate sources. In order for a physician to gain a holistic view of a patient’s condition and take further preventive action, the physician would need integrated information from the electronic medical record (EMR) system, the emergency room system detailing symptoms, and the vitals from various medical devices, all in real time. However, when structured, semi-structured, and unstructured data comes from hundreds of on-premises and cloud sources, data integration can become quite challenging, and this is precisely where Big Data fabric comes in.
Using data virtualization, Big Data fabric “stitches” data together from the underlying sources while presenting it to business users in a consistent format. What is key here is that Big Data fabric provides a common access point for consumers and the ability to abstract data from disparate data stores, solving the underlying issue of integrating data from disparate sources. With data virtualization in place, however, Big Data fabric goes one critical step further to enable predictive analytics.
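The stitching idea can be sketched in a few lines: a virtual view federates a lookup across several sources at query time, without copying data into a central store. The patient record, source systems, and field names below are all hypothetical illustrations of the concept, not Denodo's API.

```python
# Sketch of data virtualization: one virtual view stitches together a
# patient record from several hypothetical source systems on demand.

class VirtualPatientView:
    """Presents one consistent record per patient, assembled at query time."""

    def __init__(self, sources):
        # Each source is a callable: patient_id -> dict of fields (or {}).
        self.sources = sources

    def get(self, patient_id):
        record = {"patient_id": patient_id}
        for fetch in self.sources:
            record.update(fetch(patient_id))  # merge each source's contribution
        return record

# Hypothetical source systems, each with its own storage and format.
emr = {"p1": {"name": "A. Jones", "allergies": ["penicillin"]}}
er = {"p1": {"symptoms": ["chest pain"]}}
devices = {"p1": {"heart_rate": 112}}

view = VirtualPatientView([
    lambda pid: emr.get(pid, {}),
    lambda pid: er.get(pid, {}),
    lambda pid: devices.get(pid, {}),
])
print(view.get("p1"))  # one unified record; no data was replicated
```

The consumer asks the view, not the sources; swapping a source system out only changes one adapter callable, which is the abstraction the article describes.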
The Big Data fabric enabled by data virtualization integrates data, prepares it for predictive analytics, and makes it available to the consumer in real time by virtualizing access to the data without replication. This enables the physician in the above example to make timely decisions and take preventive action to ensure the health and safety of the patient.
Let’s now consider the case of a heavy equipment manufacturer. This company’s equipment is used in nearly every mining and drilling operation, and each piece is armed with sensors that constantly transmit data. The manufacturer deployed a Big Data fabric to gather this streaming Internet of Things (IoT) data and combine it with its existing parts and location information. With this Big Data fabric in place, the manufacturer was able to predict when a machine might require servicing or a part replacement. As a result, the company saw improved customer satisfaction and increased revenue while reducing machine downtime, saving millions of dollars.
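A minimal sketch of the predictive-maintenance idea: flag a machine for servicing when a rolling average of its sensor readings drifts above a threshold. The window size, threshold, and vibration readings below are illustrative assumptions; a production system would use trained models rather than a fixed cutoff.

```python
# Sketch of predictive maintenance on streaming sensor data: service is
# flagged when the rolling mean of recent readings exceeds a threshold.
from collections import deque

def needs_service(readings, window=3, threshold=0.8):
    """Return True if the mean of any `window` consecutive readings exceeds threshold."""
    recent = deque(maxlen=window)  # keeps only the last `window` readings
    for r in readings:
        recent.append(r)
        if len(recent) == window and sum(recent) / window > threshold:
            return True
    return False

print(needs_service([0.2, 0.3, 0.4, 0.9, 1.0, 1.1]))  # vibration drifting upward
print(needs_service([0.2, 0.3, 0.2, 0.3]))            # stable readings
```

Even this crude rule captures the business value described above: acting on the drift before the machine fails converts unplanned downtime into scheduled servicing.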
A world with predictive analytics in place promises much more than accurate weather: decreased disease, improved automation efficiency, and heightened cybersecurity are just a few of the potential benefits. However, these benefits can only be realized by deploying a Big Data fabric powered by data virtualization. With Big Data fabric, predictive analytics will indeed be the hottest data trend of 2017.