Gartner Symposium: the (not so) dark art of the algorithm


Simon Bittlestone, Managing Director, at Gartner Symposium

Algorithms have been a key theme of this year’s Gartner Symposium in Barcelona. In fact, algorithms have overtaken data as the real source of excitement among the CIOs and technology leaders who shape how enterprises adopt and use technology.

‘Algorithm’, however, can be an intimidating label, and many purveyors of analytics and artificial intelligence technology are in no hurry to change that. Like ‘big data’, it is also such a loosely defined term that the mystique can be spread across all manner of applications.

As long as algorithms remain a ‘secret sauce’, locked in the black box of suppliers’ software, then artificial intelligence, machine learning, and, most worryingly, predictive analytics will continue to be something that is imposed by suppliers on businesses, without any real consideration of whether or not they address a business need. That gives the technologies involved an undeservedly bad name and, in certain contexts, ‘algorithm’ has already attracted negative connotations – think of ‘flash crashes’ attributable to high-frequency trading, or the ‘creepier’ end of predictive marketing.

However, even some of the most closely guarded algorithms are not necessarily as sophisticated as their creators make out. High-frequency trading, for example, is usually based on trend following, probability functions, and a handful of exceptions – the teams who write those algorithms are highly talented mathematicians, but they still use relatively simple, well-proven maths, because (most of the time) it is the most effective option available to them.
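To illustrate just how simple that “simple, well-proven maths” can be, here is a minimal sketch of a trend-following rule of the kind described above. The window sizes, threshold, and the “exception” are hypothetical choices made for illustration, not drawn from any real trading system.

```python
# Illustrative only: a toy trend-following signal built from simple maths.
# Window sizes, threshold, and the "exception" rule are hypothetical.

def moving_average(prices, window):
    """Simple moving average over the last `window` prices."""
    return sum(prices[-window:]) / window

def trend_signal(prices, short_window=5, long_window=20, max_spread=0.05):
    """Return 'buy', 'sell', or 'hold' from two moving averages.

    The exception: if the short-term average has drifted more than
    `max_spread` away from the long-term one, stand aside rather than
    chase an overextended move.
    """
    if len(prices) < long_window:
        return "hold"  # not enough history yet

    short_ma = moving_average(prices, short_window)
    long_ma = moving_average(prices, long_window)
    spread = (short_ma - long_ma) / long_ma

    if abs(spread) > max_spread:
        return "hold"   # exception: move looks overextended
    if spread > 0:
        return "buy"    # short-term trend above long-term trend
    if spread < 0:
        return "sell"   # short-term trend below long-term trend
    return "hold"
```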

The same principle applies to predictive analytics designed to support strategic decision making and performance improvement within a business. At first glance, modelling a complex system (the business) interacting with a dynamic network (the economy) looks like it ought to be a job for super-computers, and only the most complex data science. However, unlike, say, fluid dynamics, or meteorology, the relationships between economic inputs and outputs (e.g. between sales and profit) are relatively easy to define. There are a lot of them, and their interactions must be precisely mapped, but the fundamental mathematics is not complicated.
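To make that concrete, the relationship between sales and profit really can be written down in a couple of lines of arithmetic; the figures and cost structure below are purely illustrative.

```python
# Purely illustrative figures: the relationship between economic inputs
# (volume, price, costs) and outputs (profit) is simple arithmetic.
unit_volume = 10_000
unit_price = 25.0
unit_cost = 15.0
fixed_costs = 60_000.0

revenue = unit_volume * unit_price                              # sales
profit = revenue - unit_volume * unit_cost - fixed_costs        # what is left

print(f"Revenue: {revenue:,.0f}, Profit: {profit:,.0f}")
# Revenue: 250,000, Profit: 40,000
```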

This process starts with a graphic representation of the strategically important metrics in the business, and a visual map of the factors influencing each one. Once those relationships have all been defined mathematically, the resulting stack of calculations is a long and complex algorithm that describes the business model of the organisation – but, crucially, the constituent parts of that algorithm are all relatively simple calculations. This means that the algorithm is not a piece of mathematical black magic, but something that can be understood and manipulated by both technology and business leaders.
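As a sketch of what that stack of calculations might look like in practice, the example below defines each strategically important metric as a simple function of the factors that influence it, then evaluates the whole map in dependency order. All metric names, formulas, and input values are assumptions made for illustration.

```python
# Hedged sketch: a business model as a map of simple, named calculations.
# Each metric is a plain function of other metrics or inputs; evaluating
# the map in dependency order yields the full "algorithm". All names,
# formulas, and figures here are illustrative assumptions.

inputs = {
    "unit_volume": 10_000,
    "unit_price": 25.0,
    "unit_cost": 15.0,
    "fixed_costs": 60_000.0,
    "tax_rate": 0.2,
}

# Each entry: metric -> (dependencies, simple calculation)
model = {
    "revenue": (["unit_volume", "unit_price"],
                lambda v: v["unit_volume"] * v["unit_price"]),
    "variable_cost": (["unit_volume", "unit_cost"],
                      lambda v: v["unit_volume"] * v["unit_cost"]),
    "operating_profit": (["revenue", "variable_cost", "fixed_costs"],
                         lambda v: v["revenue"] - v["variable_cost"] - v["fixed_costs"]),
    "net_profit": (["operating_profit", "tax_rate"],
                   lambda v: v["operating_profit"] * (1 - v["tax_rate"])),
}

def evaluate(model, inputs):
    """Resolve each metric once all of its dependencies are available."""
    values = dict(inputs)
    remaining = dict(model)
    while remaining:
        progressed = False
        for name, (deps, calc) in list(remaining.items()):
            if all(d in values for d in deps):
                values[name] = calc(values)
                del remaining[name]
                progressed = True
        if not progressed:
            raise ValueError(f"Unresolvable metrics: {sorted(remaining)}")
    return values

results = evaluate(model, inputs)
print(results["net_profit"])  # 32000.0 with these illustrative inputs
```

The point of the sketch is that each constituent calculation stays readable on its own line, so both technology and business leaders can inspect, question, and change it.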

That is crucial, because even the best artificial intelligence algorithms cannot take account of all the changes that affect a business or its wider market. They rely solely on the data sources they have access to, and if new, relevant sources of data become available, the team using the algorithm must understand how it will interpret them. If they cannot, then there is no way to ensure that the algorithm will remain an accurate, reliable, predictive tool.
