“Data! Data! Data! I can’t make bricks without clay!” — Sherlock Holmes
The importance of good data cannot be overstated. Good data guides decision-making in any endeavor, ensuring that the information needed to select the best option is on hand. For organizations, good data lays the foundation for successful negotiations built on thorough information.
However, the line of argument is contingent on one word: “good.” Good data is information that can be processed, analyzed, and made relevant in a timely manner. Therein lies a modern problem: Practically every single action we take today — and certainly every transaction we make — generates volumes of data, from how much we paid for a pair of socks to what song we were listening to when we bought them.
Processed effectively and placed in the right hands, this data can be used to predict behaviors and trends. However, we are now generating far more data than can be effectively gathered and analyzed without considerable delay. This problem is what underlies, and defines, Big Data. By definition, Big Data is a data set so large that conventional tools cannot process it in a timely manner. It is an ever-growing, always-out-of-reach target.
So how do you tackle Big Data and make it work? Some might say to ignore it altogether.
The Washington Post recently reported on a trend called nowcasting. Government departments, faced with lower budgets, are putting out fewer statistical reports with greater delays. For example, most consumer reports and job reports are on a two-month lag, as agencies struggle to gather and process information from myriad departments nationwide, centralize it, and then produce a coherent, useful report.
In response to this less-timely output, a number of universities and companies are now researching faster methods to generate comparable information from alternate, more manageable data sources — most commonly, social media channels. As a Google executive points out, these analytics are no better at predicting the future, but they are drastically better at determining the present. Hence the "nowcasting" name.
Examples mentioned in the Washington Post include university-based research that was able to predict government unemployment figures and, in some cases, produce more accurate results on a weekly cycle by combing Twitter for terms like “downsized” and “axed.” Another research project, this time performed by, and using readily available information from, Google, accurately tracked unemployment rates among young men by focusing on fluctuating daytime search frequency for terms related to entertainment and pornography.
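The keyword-combing approach described above can be sketched in a few lines. This is an illustrative toy, not the researchers' actual methodology: the sample posts are invented, and a real nowcast would pull posts from a platform's public API over a rolling weekly window and calibrate the signal against official figures.

```python
# Toy sketch of keyword-based nowcasting: measure what fraction of
# posts in a window contain job-loss signal terms. All data here is
# hypothetical and for illustration only.
posts = [
    "Just got axed after six years. Time to update the resume.",
    "Great quarter for the team, promotions all around!",
    "Company downsized our whole department today.",
    "Anyone hiring? Was downsized last Friday.",
]

# Terms of the kind the research combed Twitter for.
SIGNAL_TERMS = {"downsized", "axed", "laid off"}

def signal_rate(posts, terms):
    """Fraction of posts containing at least one signal term."""
    if not posts:
        return 0.0
    hits = sum(
        1 for p in posts
        if any(t in p.lower() for t in terms)
    )
    return hits / len(posts)

print(f"{signal_rate(posts, SIGNAL_TERMS):.2f}")  # 3 of 4 posts match: 0.75
```

Tracked week over week, movements in such a rate — rather than its absolute level — are what would be compared against the lagging official statistics.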
While there is not much you can deduce from monitoring searches for porn in your organization — other than, perhaps, that your employees have a little too much free time — the overall principle of nowcasting can be applied to more pressing day-to-day business. That principle — analyzing readily available alternate data to produce results that more often come from larger, limited-access data sets — is especially pertinent to procurement and sourcing departments.
One way Source One uses alternate data is within the telecom group's Wireless and Wireline Telecom Optimization services, which rightsize an organization's telecom services to minimize waste while eliminating overage fees. The optimization uses historical usage audits to determine which users need which plans, along with plan and credit negotiations with the carriers.
Telecom optimization can also include predictive, anticipatory changes to plans and packages to avoid overages — for example, increasing mobile minute and data allotments for a sales group that will be spending more time outside the office while onboarding a new client. A rise in minute and data usage is recorded by the carrier and reported on the monthly invoice, and changes could be made at that point. But by using readily accessible alternative data — gathered either from the group itself or through a quick analysis of scheduled travel plans and client records — allotments could be increased based on a prediction rather than a reaction, eliminating the threat of overage fees in the process.
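The prediction-over-reaction idea can be illustrated with a minimal sketch. The per-travel-day usage figure and the rounding heuristic below are assumptions made for illustration, not Source One's actual optimization model:

```python
import math

# Assumed figures for illustration only.
BASE_DATA_GB = 10          # group's current monthly data allotment
GB_PER_TRAVEL_DAY = 0.6    # assumed extra usage per out-of-office day

def predicted_allotment(base_gb, scheduled_travel_days):
    """Recommend next month's data allotment from scheduled travel,
    rounding up to the next whole GB so overages are avoided."""
    projected = base_gb + scheduled_travel_days * GB_PER_TRAVEL_DAY
    return math.ceil(projected)

# A sales group onboarding a new client has 12 travel days scheduled,
# so the plan is bumped before the invoice ever shows an overage.
print(predicted_allotment(BASE_DATA_GB, 12))  # 10 + 7.2 GB -> 18 GB
```

The reactive alternative — waiting for the carrier's invoice to show the spike — would incur one month of overage fees before any adjustment is made; the predictive path adjusts the allotment in advance using the schedule data instead.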
Unfortunately, all of this comes tempered with a caveat. While nowcasting is certainly interesting, it is also decidedly novel, and much remains to be done before the results it produces can be considered reliable. Put another way, do not disregard traditional data sources and analytics on pork futures just because #crab_legs is trending on Twitter this week. But, with that caveat in mind, there is still potential. How would, and how does, your organization use alternative data sources?
Nicholas Hamner, Esq., is the business development manager at Source One Management Services LLC. Source One is a provider of procurement services, helping clients with strategic sourcing and spend management solutions. The company is based in Willow Grove, Pa.