The humble Excel spreadsheet has carved out a place in history as one of the most-used types of business software. More than a billion copies of Microsoft Excel have been sold since its launch in 1987, underpinning accounting and data analysis tasks for thousands of organizations. It’s no surprise, as Excel spreadsheets are a perfect fit for small data tasks. But the story changes when they are used for more complex, multi-dimensional planning and analysis on larger datasets – tasks for which they simply aren’t suited.
There are numerous examples of significant losses – financial and otherwise – caused by manual errors in spreadsheet-based formulas, calculations, or data manipulation, which could have been avoided using a more robust (and appropriate) solution. So far, these stories haven’t put companies off using spreadsheets for more complex activities, but here’s why now really is the time to stop relying on Excel to run your organization.
One of the most recent examples of Excel spreadsheet blunders comes from the under-reporting of COVID-19 case numbers in the UK. Public Health England’s use of the old XLS file format, instead of XLSX, meant that when case data was collated from multiple COVID testing agencies, it exceeded the 65,536-row capacity of the XLS format. This resulted in almost 16,000 cases being cut off from the results and going unreported over the course of a week, skewing numerous other analyses and painting a false picture of the situation.
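To see how easily this kind of failure happens, here is a minimal, illustrative Python sketch. It is not Public Health England’s actual pipeline – the `collate` function and the batch sizes are hypothetical – but the row limits are the real caps of the two file formats, and the logic shows how records beyond the cap can simply vanish.

```python
# Row capacities of the two Excel worksheet formats.
XLS_MAX_ROWS = 65_536       # legacy .xls limit (2**16 rows)
XLSX_MAX_ROWS = 1_048_576   # modern .xlsx limit (2**20 rows)

def collate(case_batches, max_rows):
    """Merge batches of case records into one sheet, truncating at the
    format's row cap - anything beyond the cap is silently dropped."""
    rows = [record for batch in case_batches for record in batch]
    return rows[:max_rows]

# Hypothetical example: 80,000 incoming records from two testing agencies.
batches = [range(40_000), range(40_000, 80_000)]
kept = collate(batches, XLS_MAX_ROWS)
lost = 80_000 - len(kept)

print(f"kept {len(kept):,} rows, silently lost {lost:,}")
# Saved as XLSX instead, nothing would have been cut off:
assert len(collate(batches, XLSX_MAX_ROWS)) == 80_000
```

The danger is that the truncation produces no error at all: the saved file looks complete, and the missing records only surface when downstream totals fail to reconcile.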
This is just one in a long line of horror stories. Look back through the news archives and you’ll find many a story about businesses that have lost millions, or even billions, through lost income, fines, bad deals, and reputational damage as the result of spreadsheet-based errors. The root cause is that spreadsheet software is simply not designed to manage complex, collaborative planning and analysis tasks, for several reasons…
Without a doubt, the manual nature of Excel spreadsheets is the biggest source of both errors and delays. Across multiple touchpoints in a spreadsheet-based planning or analysis process, data must be manually entered, imported, combined, or manipulated. During these activities, even the most diligent of workers can make mistakes, which are then embedded in all future iterations of the file. A rogue formula, a misplaced decimal point, or a shifted row can easily go unnoticed, having a significant impact on a multitude of other processes and on the final decisions made using the erroneous data. Decision-making is also slowed, as users must often wait for data to be collected and compiled before it can be used to provide any kind of insight.
Linked closely to the point above, spreadsheets are not easy to keep control of. Yes, validation rules or restrictions on editing certain fields can be applied, but there’s nothing to stop people from deleting or overwriting information in cells they have access to. Edits are not tracked, meaning there is no audit trail of who changed what – whether accidentally or, in the worst case, maliciously. Sheets can be password protected, but these passwords can easily be leaked or shared, enabling the data contained in the spreadsheet to be accessed by everyone who receives it and presenting potentially serious security risks. When it comes to large-scale, business-critical plans or analyses, this is not a viable approach.
The newer Excel spreadsheet format (XLSX) can handle just over one million rows of data. But with today’s data warehouses containing several terabytes or even petabytes of data, it’s obvious that spreadsheets are not equipped to deal with such vast data volumes. A large supermarket chain, for example, may generate billions of individual transaction records each day, which are impossible to combine and analyze effectively in a spreadsheet. With this much data to hand, organizations need a more robust solution that can scale up to provide strategic insights from the mountain of data generated by the business.
The data entered into an Excel budget spreadsheet, for example, is the only data available for analysis or planning. If underlying or supporting information is required, it must be added to the file to be usable. This makes detailed analysis of cross-departmental data extremely difficult to achieve without a significant amount of manual, time-consuming work to create a master file. The risk of errors in transposing or importing data here is high, leading to misleading analytical results or plans. Users or managers with a query about a spreadsheet-generated chart or table will often have to go digging elsewhere to answer it, slowing down the decision-making process further. The data also remains static until it is manually updated, meaning it soon becomes out of date – along with the related analyses.
On the subject of speed, Excel spreadsheets lack the rapidity required to deliver timely business insights. As the volume of data in a spreadsheet increases, it becomes slow to open, and when volumes approach maximum capacity, there are lags in every analytical operation or application of a formula or macro. Short delays are frustrating for individual users, and across multiple users they add up to a significant amount of wasted time and a delay in providing the real-time insights needed for quick decisions.
We’ve discussed why spreadsheets are unsuitable for modern business analytics and planning requirements, but what should your organization be using instead? The answer is an Integrated Business Planning and analytics approach which automatically combines data from across the organization with the capability to plan, analyze, and forecast in real time.
With the uncertainty presented by today’s marketplace, now really is the time for companies to stop making the same mistakes and upgrade their spreadsheet-based approach.
An exclusive research study by BARC explores why the solid integration of corporate planning and its unification with analytics is the basis of modern, data-driven corporate management.