The 21st century has brought with it a plethora of technological advances and commodities unavailable at any earlier point in history. Among these, the rarest of all: data and the power to harness it.
There is no other commodity on the planet as powerful as data. Yes, you can extract oil from a well and burn it in a combustion engine to produce the energy necessary to propel a car…but data is far more powerful than that. Data is the raw material for the most rudimentary time machine we have ever produced. When the quality and consistency of the data are pristine, one can almost glimpse the future with a far higher degree of certainty than the best methods currently employed allow.
As with oil, this commodity must be refined before it has value. Unlike oil, however, the refining does not demand heavy capital expenditure or plants to distill and clean the final product. Instead, it needs the minds of data scientists, economists and statisticians to clean, transform and analyze it. This transformation is what we call data analytics: the refining that converts the commodity (data) into the final product (insight).
Continuing with the oil-gasoline analogy, the oil extracted is quite valuable, but remember, the gasoline is what powers your car every day from your house to the office. The same is true of data. Data without a mind to analyze it properly and effectively is like an oil well without the ability to extract or refine what lies beneath. This is sadly the case with most businesses, which have been collecting and managing this precious commodity for years without the ability to harness its power.
In order to give data value, it must be transformed and put through different models or estimators to obtain some form of insight. These insights are then condensed and conveyed (preferably visualized) for a specific audience. This is what we call the refining process; this is data analytics.
If data is the oil and data analytics is the gasoline, then predictive analytics is the premium fuel you see at the Formula 1 pumps. Predictive analytics takes the process of cleaning, transforming and analyzing the data a step further to make an educated guess about the future (or some future action or outcome).
In essence, the more data available, and the better its quality, the better the educated guess about the future. It is worth noting that, like any guess about the future, it has its flaws. However, unlike most subjective guesses, predictive analytics gives you the ability to be empirical about this guess and to examine parameters as well as relationships among underlying factors, including those undetectable to the naked eye.
In order to obtain any insight from your data warehouse, it is necessary to begin with a question. Only then can you begin to extract the useful materials necessary for the analysis. In this context, the materials necessary to obtain insights from your data are variables. By beginning with a question, you avoid the mistake of handing your data scientists ten years' worth of historical data and expecting them to return with a solution to all the company's problems.
At Arcum Partners we wanted to help payment processors and acquirers reduce merchant attrition. So we began with questions such as: What drives merchant attrition? Are there internal factors, such as an account manager or customer service issues, driving attrition? Or external drivers, such as an economic slowdown in a particular area? Once these questions have been established, we can formulate hypotheses and begin testing them by leveraging the data available. Once a hypothesis is formulated, we can create a predictive model that identifies the merchants at risk of leaving. Our clients can then proactively engage those at-risk merchants with better retention campaigns, thus reducing merchant attrition.
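The workflow above — pick candidate drivers, fit a model, flag at-risk merchants — can be sketched as a toy logistic regression on synthetic data. Everything here is an illustrative assumption: the feature names (volume change, support tickets, tenure), the coefficients used to generate the fake labels, and the 0.5 risk threshold are invented for the sketch and do not describe Arcum's actual model or any client's data.

```python
# Minimal sketch of an attrition (churn) risk model, assuming three
# hypothetical merchant features. All data below is synthetic.
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Hypothetical features per merchant
vol_change = rng.normal(0, 1, n)                 # change in processing volume (standardized)
tickets = rng.poisson(2, n).astype(float)        # support tickets last quarter
tenure = rng.integers(1, 60, n).astype(float)    # months on platform

# Synthetic "truth": falling volume and many tickets raise churn odds
true_logit = -1.0 - 1.5 * vol_change + 0.4 * tickets - 0.02 * tenure
y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

# Standardize features so plain gradient descent behaves well
def standardize(v):
    return (v - v.mean()) / v.std()

X = np.column_stack([
    np.ones(n),                 # intercept
    standardize(vol_change),
    standardize(tickets),
    standardize(tenure),
])

# Fit logistic regression by gradient descent (no external ML library)
w = np.zeros(X.shape[1])
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))       # predicted P(churn)
    w -= 0.5 * X.T @ (p - y) / n       # gradient step on log-loss

risk = 1 / (1 + np.exp(-X @ w))        # estimated churn probability per merchant
at_risk = np.flatnonzero(risk > 0.5)   # merchants to target for retention outreach
print(f"{len(at_risk)} of {n} merchants flagged as at-risk")
```

In practice the flagged list would feed a retention campaign, and the campaign's results would in turn test the original hypothesis about what drives attrition.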