Predictive Analytics Myths Debunked

We recently presented at the Predictive Analytics World Workforce conference, where one of the industry's hot topics – predictive analytics – was sliced and diced in a hundred different ways. As PhDs, we geek out when talking about big data, modeling and analyses. That also means we can't help but point out fallacies in some of the statements we heard at the conference. Our takeaway? Companies are making strides toward understanding and embracing predictive analytics, but many still need to take one, two or maybe three more steps to achieve true predictive analytics. We don't fault companies for not being 100 percent there yet – we spent years in graduate school to get comfortable with data science. So, to help move us all in the right direction, we've debunked a few predictive analytics myths below.

Myth: As long as you have a “model,” it’s predictive. It doesn’t need to be reliable, validated or even tested to see if it’s working.

Gasp! A model on a PowerPoint slide does not make predictive analytics. A model must meet two key criteria. (1) Testable: the model must be able to be tested. Plain and simple, yet vital. To know whether your model works, that is, whether it accurately and reliably predicts the outcome(s) of interest (e.g., turnover, sales, profit), you have to test it. Without testing your model, all you have is a theory; a brief sketch of a holdout test follows this paragraph. (2) Directional: it should be clear how the variables are interrelated and connected. Many of the "models" we saw presented consisted of little more than a cluster of variables with no information on how those variables led to one another or to the outcomes they were supposed to predict. Remember that modeling is not a catch-all exercise. The goal is not to list every variable you can think of as a predictor of your outcomes. Instead, modeling is about examining the interrelations among your input variables to find the strongest combination of predictors of your outcomes.
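As a rough illustration of what "testable" means in practice, here is a minimal sketch of holdout validation for a turnover model. It uses simulated data and hypothetical column names (tenure_years, engagement_score, commute_minutes), not any real dataset or SMD tooling; the point is simply that a model is only judged on data it has never seen.

```python
# Minimal sketch: test a turnover model on held-out data instead of
# treating an untested "model" on a slide as predictive.
# All column names and simulated values are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Simulated employee data standing in for a real HR dataset.
rng = np.random.default_rng(0)
n = 1_000
df = pd.DataFrame({
    "tenure_years": rng.exponential(4, n),
    "engagement_score": rng.normal(3.5, 0.7, n),
    "commute_minutes": rng.normal(30, 12, n),
})
# Simulated "true" turnover process used only to generate example labels.
logit = -1.2 - 0.15 * df["tenure_years"] + 0.03 * df["commute_minutes"]
df["turnover"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Hold out data the model never sees during fitting; that is the actual test.
X_train, X_test, y_train, y_test = train_test_split(
    df[["tenure_years", "engagement_score", "commute_minutes"]],
    df["turnover"], test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1_000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out ROC AUC: {auc:.2f}")  # well above 0.5 means the model actually predicts
```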

Myth: Predicting engagement is a good business practice.

We'd like to give you a little background on why employee engagement became such a fad. Back in the 1990s, Gallup published a book called "First, Break All the Rules" that introduced the concept of employee engagement. Unfortunately, it was based on flawed research and purported to show that 12 magical things were all any company - of any size, in any industry - needed to be successful. The research was flawed because it examined companies that were already successful and then looked for common themes. Discovering that companies that make a lot of money and are well run also have happy employees is not surprising. The problem is that HR leaders (and Gallup) jumped to the conclusion that making people happy or engaged is what caused those companies to be successful. Wrong! Actual cause-and-effect analytics that we conducted showed that employees were engaged because those companies were successful. Bottom line: employee engagement is not a business outcome, and it is not a driver of business outcomes; it is a by-product of a well-run organization. As such, predicting it is not a good business practice, so there is no need to expend time doing so. Think of the movie Field of Dreams and its "If you build it, they will come" mantra: if you work on the key drivers of business success, engagement will naturally follow.

Myth: Predictive analytics can’t link to ROI.

With sound predictive analytics approaches, such as structural equation modeling, you can certainly ascertain return on investment (ROI). For example, SMD's patented technology, SMD Link, does the work to quantify the impact of employees on business outcomes, calculate an expected ROI for investments in employees, and define the relationship between HR processes and business outcomes. The ROI is calculated from the statistical models built in SMD Link. That means you can focus on the drivers that have the largest ROI for your business and watch your associates in action! A simplified sketch of that driver-to-ROI arithmetic follows below. Our stance: if you can't track whether your prediction is valid, why make it?
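As a rough illustration only (not SMD Link's patented algorithms), here is a minimal sketch of how a driver-to-outcome estimate can be turned into an expected ROI. It uses simulated data, a single regression in place of a full structural equation model, and assumed figures for the intervention's lift and cost; every column name and number is hypothetical.

```python
# Minimal sketch: link an HR driver to a business outcome, then estimate ROI.
# Column names (driver_score, unit_sales) and cost/lift figures are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated unit-level data standing in for real survey + financial data.
rng = np.random.default_rng(42)
n_units = 200
driver_score = rng.normal(3.5, 0.5, n_units)              # e.g., manager-quality score (1-5)
unit_sales = 50_000 + 12_000 * driver_score + rng.normal(0, 8_000, n_units)
df = pd.DataFrame({"driver_score": driver_score, "unit_sales": unit_sales})

# Step 1: estimate the driver -> outcome relationship (a simplified path, not full SEM).
X = sm.add_constant(df["driver_score"])
fit = sm.OLS(df["unit_sales"], X).fit()
effect_per_point = fit.params["driver_score"]              # expected sales lift per 1-point gain

# Step 2: translate the estimate into an expected ROI for a planned intervention.
expected_lift = 0.3                                         # assumed improvement in the driver score
program_cost_per_unit = 2_500                               # assumed cost of the intervention
expected_gain = effect_per_point * expected_lift
roi = (expected_gain - program_cost_per_unit) / program_cost_per_unit
print(f"Expected gain per unit: ${expected_gain:,.0f}; ROI: {roi:.1%}")
```

Because the ROI comes directly from a tested statistical estimate, you can also check it after the fact: compare the predicted gain against what actually happened, which is exactly the "track whether your prediction is valid" point above.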

We could talk about predictive analytics all day! If you’re interested in the hour-long version, join us at our “Big Data and Predictive Analytics the Right Way” webinar on April 16 at 1:00 p.m. ET. Register here.