Spreadsheets persist in business like no other technology. Why? Because there’s no alternative. But before you format another cell, read this.
There is literally nothing like a spreadsheet for its agility and low-cost handling of sometimes complex numerical data. Spreadsheets possess a familiarity and flexibility that have made them de rigueur for many financial professionals. And yet, like fire, spreadsheets are a good servant but a bad master; they harbour the potential for costly mistakes – in both a financial and reputational sense – unless controls are put in place.
Errors of judgement
Looking for evidence of the fallibility of spreadsheets reveals an extensive research base. The European Spreadsheet Risks Interest Group (EuSpRIG) offers a wealth of evidence, advice, comment, and academic papers on the subject and should be a first port of call for anyone interested in the life and times of spreadsheets.
“If the uncontrolled use of spreadsheets continues to occur in highly leveraged markets and companies, it is only a matter of time before another ‘Black Swan’ event occurs (a major unexpected event), causing catastrophic loss,” says Grenville Croll, a specialist in spreadsheets, spreadsheet applications and spreadsheet risk research, and a former Chairman of EuSpRIG.
There is an abundance of academic study placing human error as the major downside of spreadsheet use. EuSpRIG’s research suggests that the majority (more than 90%) of spreadsheets contain errors. Research conducted in 2006 by Professor Raymond Panko at the University of Hawaii indicated that spreadsheets are rarely tested and that, as a result, these errors remain in circulation.
There are a number of elements that undermine the efficacy of spreadsheets. David Banks and Ann Monday in a 2002 paper highlighted the potential of spreadsheets to facilitate fraud if there are no auditable trails. They also found users exhibited overconfidence in the accuracy of their spreadsheets, which led them not to look for errors and, consequently, not to find them. Errors of interpretation were also cited, where business decisions could be based entirely on spreadsheet data even where there was ‘evidence to the contrary’.
Most treasury professionals are acutely aware of the advantages and pitfalls of spreadsheet usage. In the two most recent Corporate View articles in Treasury Today, the interviewees (representing very large international corporates) were almost apologetic about using them, but stated, quite rightly, that there was nothing to take their place. Malcolm Cooper, Global Tax and Treasury Director at UK and US energy distributor, National Grid, said: “I do whatever I can to reduce the use of spreadsheets, but I’m afraid it is impossible because they are so useful. So often you need that ad hoc piece of analysis that a TMS or anything else will not do.” Similarly, Manish Kapoor, Head of Airtel Centre of Excellence Cash and Bank at Indian mobile telecoms giant, Bharti Airtel, admitted that “the treasury function cannot completely get rid of spreadsheets”. He too explained that they enable his team to perform some data analysis functions which he felt could not be performed on any other immediately available technology.
The spreadsheet’s capacity as a core treasury tool was investigated in June 2011 in a Treasury Today ‘Question Answered’ article. Phil John, EMEA Treasury Director of Mars Incorporated, referred to spreadsheets as “our flexible friends” and noted that if used properly they were a “tremendous tool”. However, he warned that if used inappropriately “they quickly become an Achilles heel”.
“The weaknesses,” said John, “are many: individual designs are not always transferable; formulae errors are not transparent and can easily be introduced when amendments are made; large designs become clumsy and error-prone; manual data entry frequently leads to error; work is often duplicated; storage is frequently not on shared servers; and there is no audit trail.”
It is perhaps unfair to criticise the technology itself: it does what it does perfectly well. But because it responds only to data entered, it would be more appropriate to state that it is only ever as reliable as the people who use it – and when they prove to be confused, inefficient and prone to making errors, so too are the spreadsheets.
One respondent to an article looking at how to minimise spreadsheet error using monitoring technology fired back that it was “patronising to say we all make these big typos everywhere”. Professional accountants, he argued, know what results to expect from their spreadsheets. Excel, he continued, “is the only bit of kit we have that provides the flexibility we want”.
Notwithstanding the professionalism of most treasurers, the lack of care and attention over spreadsheet control in some organisations has been officially commented on (if not actually legislated against) by regulatory bodies such as the Basel Committee on Banking Supervision and the UK’s FSA.
The Basel Committee issued a consultative document in June 2012 – ‘Principles for effective risk data aggregation and risk reporting’ – in which it warns banks to be ‘fully aware of any limitations’ that prevent full risk data aggregation. It talks of the need to aggregate data ‘on a largely automated basis so as to minimise the probability of errors’.
Automation is recognised as a force for good by the FSA in its September 2012 report on the country’s insurance industry, ‘Solvency II: internal model approval process data review findings’. It noted that many firms had automated spreadsheets, using macros to reduce manual intervention. Others had taken further steps and had deployed spreadsheet management tools to determine the dependency of interlinked spreadsheets and were able to apply controls such as audit trail and version management.
The FSA accepted that automation of spreadsheets reduces the risk of manual error, but warned that it can also ‘introduce different problems such as reduced oversight, inadequate transparency about the extent of linking and proliferation of nested linked spreadsheets’. Linked spreadsheets, it added, ‘typically pass only single numerical values, without an indication of the date of their last update, creating the risk of passing stale data around the system’.
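The stale-data risk the FSA describes can be made concrete with a short sketch: instead of passing a bare number between linked workbooks, pass the value together with its last-update timestamp and flag anything older than an agreed tolerance. This is an illustrative Python sketch of the principle, not a feature of any spreadsheet product; the names `LinkedValue` and `is_stale` are invented for the example.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative sketch: a linked value carried with its last-update
# timestamp, so a consuming sheet or system can detect stale inputs.
# A bare numerical value, as the FSA notes, gives no such warning.

@dataclass
class LinkedValue:
    value: float
    updated: datetime  # when the source spreadsheet last refreshed it

def is_stale(lv: LinkedValue, max_age: timedelta, now: datetime) -> bool:
    """True if the linked value is older than the freshness tolerance."""
    return now - lv.updated > max_age

now = datetime(2013, 1, 10, 9, 0)
fx_rate = LinkedValue(1.61, updated=datetime(2013, 1, 3, 17, 30))

# A week-old FX rate fails a one-day freshness check.
print(is_stale(fx_rate, timedelta(days=1), now))  # True
```

Nothing here is sophisticated; the point is simply that the check is only possible at all once the date travels with the number.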
These are fair points that concur largely with the academic views presented above: although targeted at the insurance industry, the FSA’s report could easily stand for most verticals in that it describes the use of “often uncontrolled” spreadsheets as “pervasive” and that “nearly all the data reviews identified issues with them”. The FSA in particular reserved criticism for firms that did not have an inventory of critical spreadsheets, “classified by use, by the impact on the internal model, and by complexity”.
Mind the gap
Whatever systems treasurers use – whether they are built in-house or bought in – the capabilities of those systems will have been defined by specifications that were written “at best” two or three years ago, claims Ralph Baxter, CEO of spreadsheet and data management software vendor, ClusterSeven. “And most technology-refreshes in big organisations are considerably slower than that.”
The gap between what corporate systems (a TMS or ERP, for example) can do and what the business needs is entirely understandable because few IT budgets can keep pace with all the requirements for change, especially those forced upon organisations at great speed by the regulators. Baxter believes that every regulatory report required by Dodd-Frank, for example, will be prepared in a spreadsheet “because there is no other way of compiling the information from all the different sources in the time that’s required”.
In May 2012, Oracle and Accenture issued a report which claimed that 67% of companies still use spreadsheets – and 8% use spreadsheets and nothing else – to assist with tasks in financial close, reporting and filing. The report’s author found it interesting that while more companies with three separate solutions for close, reporting and filing used spreadsheets (73%), a “significant 59%” of companies with a single solution also use them.
Paul Bramwell, Senior Vice President, Treasury Solutions at SunGard, looks upon the reliance on spreadsheets and manual processes alone as “making it difficult for corporates to respond quickly to fluctuations in the market with any degree of confidence”. Without automation, he says the time it takes to pull fragmented data together, as well as the potential inaccuracy of data, “can have a negative impact on investment and borrowing decisions”. Whether or not Bramwell’s contention is correct that spreadsheets are thus “quickly losing their appeal” remains to be seen.
Baxter offers some evidence to the contrary. Firstly, the fact that spreadsheets are still very much in use (with plenty of anecdotal evidence to support this) suggests that the business world has not found a suitable alternative and neither is one waiting in the wings to take over. Secondly, that their usage is not constrained by size, value or project complexity suggests an almost limitless appeal. In respect of the latter, he claims that one of the biggest treasury-related projects of the past few years – the $700 billion Troubled Asset Relief Program (TARP) in the US – had to be managed in spreadsheets. “Despite the supposed foresight of all the vendors in the market place, I doubt any of them had built a TARP programme that could be bought off the shelf.”
Regardless of who’s right, the continual dysfunctional conversation (and the publication of many spreadsheet horror stories) has sown the idea that spreadsheets are bad and must be removed, with the result that many finance professionals don’t want to admit the extent to which they use them. But as with any problem swept under the carpet, this tends to create a bigger one: if no one admits what they are doing, the task of managing spreadsheet usage becomes that much more difficult.
According to a 2000 study by the University of Hawaii’s Raymond Panko, only one approach to error reduction has been demonstrated to be effective. This, he said, is code inspection, in which a group of spreadsheet developers checks a spreadsheet cell-by-cell to discover errors (so ensuring that what is supposed to happen has happened, and that nothing unexpected has occurred). Quoted by EuSpRIG, Panko said that even this process, exhausting and expensive as it would be, could only catch about 80% of all errors.
But Baxter believes that as long as the business process integrity for each key spreadsheet can be defined, the checking process can be automated thus making it quicker, cheaper and more consistent. Such a checking tool cannot reach spreadsheets that have been taken outside of the corporate system (on a user’s laptop, for example), but this marks a move away from business process integrity towards compliance and security issues, both of which should be handled by their respective policy units.
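To see what automated cell-level checking means in practice, consider a minimal sketch: hold the sheet’s stored values and its formulas separately, re-derive each formula and compare the result with what the cell actually contains. This catches the classic error of a calculated result being overwritten by hand. The sketch is illustrative only – the toy evaluator below understands nothing but `=SUM(...)` over listed cells, whereas real auditing tools parse the full spreadsheet grammar – and the function name `check_sheet` is invented here.

```python
# Illustrative sketch of automated cell-level inspection (not any
# vendor's product): a sheet is a dict of cell -> stored value, plus a
# map of cell -> formula. Recomputing each formula and comparing it
# with the stored value flags hand-overwritten or stale results.

def check_sheet(values, formulas):
    """Return (cell, stored, expected) for cells that no longer match."""
    discrepancies = []
    for cell, formula in formulas.items():
        # Toy parser: only "=SUM(ref1,ref2,...)" is understood.
        inputs = formula.removeprefix("=SUM(").removesuffix(")").split(",")
        expected = sum(values[ref.strip()] for ref in inputs)
        if values[cell] != expected:
            discrepancies.append((cell, values[cell], expected))
    return discrepancies

values = {"A1": 100.0, "A2": 250.0, "A3": 999.0}   # A3 overwritten by hand
formulas = {"A3": "=SUM(A1,A2)"}

print(check_sheet(values, formulas))  # [('A3', 999.0, 350.0)]
```

The appeal of automating this, as Baxter argues, is consistency: a script applies the same check to every cell every time, where a human inspector tires.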
And it is perhaps in policy development that the real answer to all this lies. There certainly needs to be a change of direction in the approach to spreadsheets, steering thoughts away from shame and secrecy towards an acknowledgement that, according to need, there is a balance to be reached between spreadsheet usage and the deployment of corporate systems. And if, as the academics and the regulators state, the problem is one of control, the obvious need is to create a policy-led framework for that control, and then to implement it enthusiastically.
PwC’s Internal Auditor publication has noted several common effective spreadsheet controls:
Setting documentation standards.
Establishing data entry procedures.
Using good security measures.
Backing up data frequently.
In addition to this high-level set of pointers, the T2P Information Governance Research Community has published a few more pertinent guidelines:
Expect errors and test regularly.
Manage spreadsheet change, preferably using versioning software, at least where spreadsheets are part of a critical business process.
If possible, use software to audit spreadsheets, right down to cell level.
Automate critical business processes.
Monitor centralised application adoption.
Encourage a balanced mix of enterprise application and spreadsheet usage.
Enforce policies and procedures.
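The versioning and audit-trail guidelines above come down to one capability: knowing, cell by cell, exactly what changed between two versions of a sheet. A minimal sketch of that idea in Python follows; it is purely illustrative (the name `diff_versions` and the snapshot-as-dict representation are assumptions of this example, not how any particular versioning product works).

```python
# Illustrative sketch of a cell-level audit trail: diffing two
# snapshots of a sheet records exactly which cells changed, which is
# the granularity the guidelines above ask of versioning software.

def diff_versions(old, new):
    """Return (cell, old_value, new_value) for every changed cell."""
    changes = []
    for cell in sorted(set(old) | set(new)):
        before, after = old.get(cell), new.get(cell)
        if before != after:
            changes.append((cell, before, after))
    return changes

v1 = {"B2": 5000.0, "B3": 7500.0}
v2 = {"B2": 5000.0, "B3": 7900.0, "B4": 120.0}  # B3 edited, B4 added

print(diff_versions(v1, v2))  # [('B3', 7500.0, 7900.0), ('B4', None, 120.0)]
```

Logged with a user and a timestamp, a list of changes like this is the audit trail that, as Phil John observed earlier, raw spreadsheets simply do not have.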
Companies need a clear understanding of where their existing and new business-critical spreadsheets are being used and what people are doing in them. In an ideal world, the gathering of this knowledge would take the form of a centralised business/IT team capable of assessing existing spreadsheets, monitoring and controlling the creation of new ones across the organisation. This implies a level of group-wide connectivity which few companies possess and, in the real world (where siloed technology and fractured data sets are par for the course), knowledge of what is happening locally is usually only a local capability.
An alternative is to adopt a more distributed approach to understanding spreadsheet usage, where centrally the need is only to know that everyone is following policy, but then giving all the tools, capabilities and support to enable teams to manage spreadsheets locally. In this way, spreadsheet control becomes a policy issue, with the assumption that, as long as the policy is well-crafted, most users will comply.
Ultimately, the flexibility of the spreadsheet is not in question and neither is its capacity to fill a technology gap between what is needed for immediate and ad hoc developments and what is offered by more substantial but less wieldy corporate systems. But just as the Japanese samurai used to carry two swords – the powerful long-blade katana for most combat use, and the agile short-blade wakizashi for close-quarter (or indoor) fighting – modern businesses need the combination of power and agility to cover all eventualities and it gets this through intelligent combination of the tools available, including spreadsheets.