The Aptitude Blog

Big business, big analysis, just not big data

December 18, 2014
Posted by Sarah Werner

We live in an age of intelligence. Accountants drive strategy, marketers are scientists and logical coders helm global organisations. This intelligent business is the inevitable result of drastic improvements in the way we manage information, both in how we store it and how we interpret it. We all know the term “big data” has been around for far too long, and the concept of using huge data volumes to inform business decisions for even longer, but where have we actually, truly seen it utilised?

It’s only now, with the maturing of technologies such as Apache Hadoop, that we can take data science beyond sand-box experiments and one-off analyses, set the methods aside and focus on the business outcomes. So where have companies actually shown the world how to incorporate big data into day-to-day operational business, and started the process of the term finally becoming inconsequential?

At Aptitude Software, we’ve been close to the banking sector for a long time, and we’ve seen how reporting on even a small number of products can become enormously complex once the analysis begins. When you factor in the number of cost centers, products, channels and customers available to potentially sell through, calculating granular profitability becomes a highly complex task, both in terms of the data being crunched and the number of individual calculations being performed (often reaching hundreds of billions). By combining data integration, process management, business user control and high-volume data processing, banks can now run the analysis they need rapidly and repeatedly. They can optimize sales promotions and product lines without a thought for the complex calculations involved.
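To see how quickly the calculation count explodes, consider a toy back-of-the-envelope sketch. The dimension sizes below are illustrative assumptions, not figures from any real bank, but even modest values multiply out to hundreds of billions of combinations:

```python
# Hypothetical dimension sizes for a mid-sized retail bank.
# These numbers are illustrative only, not drawn from any real institution.
cost_centers = 200
products = 50
channels = 10
customers = 2_000_000

# One granular profitability calculation per combination of dimensions.
combinations = cost_centers * products * channels * customers
print(f"{combinations:,}")  # 200,000,000,000
```

Two hundred billion calculations from four unremarkable dimensions is why this workload outgrows spreadsheet-era tooling.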

Some newer companies can now afford to bake this kind of functionality into their business from the start, making all data available quickly at all times. A major global media-streaming service recently invested in a ground-up Hadoop-based architecture, allowing it to avoid the restrictive access, slow processing speeds and cost that traditional approaches carry with them. Crucially, this gives it the ability to scale linearly as it moves into new territories and expands its business models. It also means the freedom to record and keep every single transaction that occurs, so vital royalty payments can be calculated accurately in an industry where cash is in short supply.

Finally, some companies are implementing always-on solutions that constantly monitor global trends so they can react more quickly than the competition. An online ticketing company is using Hadoop to monitor continuous streams of structured and unstructured data to drive a dynamic pricing engine. This ensures its pricing is truly elastic, optimizing both sales volumes and profitability.
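The kernel of a dynamic pricing engine can be surprisingly small. The sketch below is a simplified illustration of the idea, not the ticketing company’s actual system: scale a base price by an observed demand signal and clamp it to a commercial band. The function name, parameters and numbers are all assumptions for the example.

```python
def dynamic_price(base_price: float, demand_ratio: float,
                  floor: float, ceiling: float) -> float:
    """Scale a base price by observed demand, clamped to a band.

    demand_ratio is current demand divided by forecast demand:
    above 1.0 means tickets are selling faster than expected,
    below 1.0 means slower.
    """
    price = base_price * demand_ratio
    return max(floor, min(ceiling, price))

# Tickets selling at 150% of forecast push the price up to the ceiling.
print(dynamic_price(50.0, 1.5, floor=30.0, ceiling=70.0))  # 70.0

# Slow sales pull the price down, but never below the floor.
print(dynamic_price(50.0, 0.5, floor=30.0, ceiling=70.0))  # 30.0
```

The real engineering challenge is not this arithmetic but feeding it trustworthy, low-latency demand signals from mixed structured and unstructured streams, which is where a platform like Hadoop earns its keep.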

So what do these all have in common? They all take the idea of big data and make it a routine, background part of the organization, and they all depend on huge volumes of processing and calculation in their daily operations.

So let’s finally put the term “big data” to rest and focus on the business outcomes that using the full suite of data technologies can finally deliver.
