One of the constants during my twenty years in the EPM and BI space has been the demand for scalability and performance whenever a prospective buyer weighs up the solutions available in the marketplace. If anything, the importance attached to these capabilities has only grown over the last five years, as terms such as Cloud, Big Data, and the Internet of Things circulate through businesses, conferences, and social media.
As a leading vendor in business intelligence and performance management, Board is no stranger to questions about scalability and performance. In this blog I’ll share how, during a discussion with a large multinational organisation, we proved that EPM scalability and high performance really can go hand-in-hand.
So first, how do we define the scalability of an EPM solution? A project specification will very likely call for a large user base, perhaps up to thousands of users (many of them concurrent), without any degradation of the user experience. Scalability is also about complexity: the number, diversity, and organisational reach of the models and applications built in the solution. After all, you probably want a future-proof tool that can grow across the business, whether from Day 1 or more gradually as the business expands.
Gone are the days when an EPM tool would be rolled out to, say, 10–15 users in the Finance function, so scalability must also mean the ability to load multiple, varied sources of data from across the business. This must happen quickly, possibly in real time, without users being ‘kicked out’ while it takes place.
These implementations need to perform too. Reports must render in seconds and data entry must similarly be a rapid task – even when thousands of people are doing these things at the same time. Calculations must be brisk, especially when the end user needs to run them from the user interface, perhaps after performing some data entry or analysis.
The Board decision-making platform has always excelled in both scalability and performance. We have a wide customer base of organisations with large user numbers, high concurrency rates, complex requirements, and high-performing processes – all handled by their Board Cloud instances. What is not always readily available, though, are the hard facts and performance statistics that back up such claims, so during a recent Board proof of concept I took the time to track these numbers down so we can consider them in this article. In my experience every vendor claims it can scale, but how impressive are the numbers in the cold light of day? Let’s take a look.
We were talking to a large organisation with a truly global presence: thousands of locations, 1,500 accounts, and hundreds of regions, markets, departments, and cost centres. The existing landscape consisted of multiple disconnected, standalone applications and (of course) spreadsheets; several hundred hours a month were spent just moving data between these applications. What the organisation wanted was a globally connected, future-proof planning and reporting solution.
The business generates between 100 and 250 billion records daily, and for the Proof of Concept (POC) we were to load a 300-million-record data set. This was no problem for Board, and once the load completed we could get the ball rolling with hypothetical reports of 9,000 rows and 30 columns, each rendering in a second. For a global business, an essential requirement to demonstrate was the follow-the-sun approach: as one location closes and the next opens, new users carry out data entry while the ‘closed’ location’s data is loaded into the Board database at the same time.
Aside from demonstrating Board’s value in the relevant business processes, the key performance test in the POC was an automatic simulation of 2,000 concurrent users, with 300 of them undertaking data entry (and other actions) at the same time. The concurrent user count peaked at 2,016 (sustained for 30 minutes of the three-hour test), and at the height of data entry we simulated 406 users pressing the ‘enter’ key simultaneously – well ahead of anything I have seen from a planning tool in twenty years.
At the same time, a data load of six million records ran in under three minutes. The data entry ‘bots’ created 27 million cells of data during the tests, and 96% of screen queries ran in under a second. The slowest data entry took three seconds to commit to the Board database. Plotting average query time against the number of users produced a near-constant gradient, indicating that the Board engine performed consistently as user numbers rose. Overall, the Board server received 18,000 to 20,000 requests per minute on average.
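Board’s actual test harness isn’t public, but the shape of a concurrent-user load test like the one described above can be sketched in a few lines. Everything here is a hypothetical illustration – the bot behaviour, timings, and latency thresholds are assumptions, not Board’s implementation:

```python
import concurrent.futures
import random
import time

def run_bot(bot_id: int, actions: int = 50) -> list:
    """Simulate one 'bot' user issuing a series of requests,
    returning the observed latency of each action in seconds."""
    latencies = []
    for _ in range(actions):
        start = time.perf_counter()
        # Stand-in for a real call to the planning server (screen query
        # or data-entry commit); here we just sleep for a random service time.
        time.sleep(random.uniform(0.01, 0.05))
        latencies.append(time.perf_counter() - start)
    return latencies

def load_test(num_bots: int = 100) -> dict:
    """Run num_bots concurrently and summarise the latency distribution."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=num_bots) as pool:
        results = pool.map(run_bot, range(num_bots))
    all_latencies = sorted(l for bot in results for l in bot)
    n = len(all_latencies)
    return {
        "requests": n,
        "p96_seconds": all_latencies[int(n * 0.96)],  # cf. '96% under a second'
        "max_seconds": all_latencies[-1],
    }

if __name__ == "__main__":
    print(load_test(num_bots=20))
```

A real harness would replace the sleep with authenticated HTTP calls and scale the thread pool (or use an async client) to reach thousands of bots, but the reporting idea is the same: collect per-action latencies and read off the percentiles, as in the 96%-under-a-second figure above.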
The combination of scale and performance in this example is clear, showcasing Board’s ability to make light work of data complexity.
So how do you communicate and reinforce the message behind these results, as evidenced in the Board performance test log files? You load them into Board too, of course. You can then take a chart showing peak activity in the system and drill down to the individual ‘bot’ users that made each change. And not just that: the remaining data in the log files extends the drill-down, showing the old data value, the new data value, the time of entry, and the database intersection in which the data was entered.
Can your EPM solution do all this?
If you’d like to see Board’s performance for yourself, request a personalised demonstration and speak to us about a Proof of Technology (PoT). We’ll put your own data to the test in a Board environment and show you the platform’s speed, flexibility, and scalability in the context of your organisation. We see this as an essential part of the decision to invest in Board and would love to discuss it with you further.