Gartner Symposium: The not-quite-life and strange death of big data.

Rachel Russell, Head of Client Service, at Gartner Symposium

The opening keynote speeches at Gartner Symposium Barcelona were a fitting reflection of the challenges IT faces over the next few years. A common thread was the need for IT to work more closely with other facets of the business, in order to harness the innovation of others and empower them with the best of technology.

It was two seemingly unrelated points that illustrated this most vividly. UK-based analyst Lee Weldon made a strong case for the CIO to position themselves as the trusted ally of the CEO. And Darryl Plummer, an analyst working out of Atlanta, made a forceful argument for why big data is dead. That might have come as a shock to some of those in the room – but those without a big data product to sell have always known that it was never really alive in the first place, more a sort of Frankenstein’s monster, lacking the soul of a definite business purpose.

Data is useless without effective analysis, and for analysis to be effective, it needs to address a clear business challenge. Big data promised to create insights out of almost nothing at all. But an insight is useless unless it offers an answer to a pressing question, and business leaders cannot rely on serendipity to solve the overall challenges facing the business. However, if CIOs begin by considering how they can tackle those key business challenges, then that is how they can become a trusted ally of the CEO.

Of course, that can’t be done without data. And sometimes the datasets involved will be as large as those that characterise big data. However, they certainly shouldn’t include data that is not relevant, and they definitely won’t be analysed through a process of ‘exploration’, ‘discovery’ or anything else that relies on chance or curiosity. The key to success is knowing which datasets should be used, and what model we should use to analyse them. Fortunately, the two questions can be answered by the same process.

For example, if the CEO (in, say, retail) wishes to increase gross sales, the CIO can build a model that analyses all the factors driving gross sales, and pinpoint where efficiencies and improvements could be made. This begins by multiplying two very simple figures – total transactions and average transaction value – but the factors influencing those are described by some highly complex datasets, such as footfall, conversion rates, and the performance of individual products. To identify the actions that will increase performance, those datasets will have to be arranged so as to allow a granular level of visibility across all relevant dimensions of the business. But once all the relevant datasets are connected, the driving factors have been mapped, and the calculations governing their interactions have been defined, we have an analytical model that connects only the datasets relevant to the business challenge in question.
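The decomposition described above – gross sales as total transactions multiplied by average transaction value, with transactions in turn driven by footfall and conversion rate – can be sketched in a few lines. All figures here are hypothetical, chosen purely for illustration:

```python
# A minimal sketch of a driver model for gross sales.
# Gross sales = total transactions x average transaction value,
# where transactions are themselves driven by footfall and conversion rate.

def gross_sales(footfall: int, conversion_rate: float,
                avg_transaction_value: float) -> float:
    """Decompose gross sales into its driving factors."""
    total_transactions = footfall * conversion_rate
    return total_transactions * avg_transaction_value

# Baseline scenario (illustrative numbers only).
baseline = gross_sales(footfall=10_000, conversion_rate=0.20,
                       avg_transaction_value=35.0)

# A targeted action: lift conversion rate by one percentage point.
improved = gross_sales(footfall=10_000, conversion_rate=0.21,
                       avg_transaction_value=35.0)

print(baseline)  # 70000.0
print(improved)  # 73500.0
```

Even this toy model makes the point: because each driver is explicit, the CIO can show exactly which lever (footfall, conversion, or basket value) an initiative is expected to move, rather than hoping an insight emerges from an undifferentiated pile of data.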

That is no less impressive than gathering together a mighty dataset and then throwing a huge amount of computing power at it. But it is a great deal more effective, and much more likely to be trusted by CEOs as a means of improving business performance.
