You can view the evolution of software through the lens of “integration and differentiation.” The first computers, the code-breaking machines of the Second World War, were single-function, built to execute just one task. Increasing the computer’s usefulness and versatility meant enabling multiple applications to run on a single machine. That in turn required a separation between the operating system and the applications. This was the first differentiation of capability – the first of many that followed in its wake.
Programs were written in machine code, then in assembler, then in third-generation languages, gradually introducing a separation between the language of the programmer and the series of instructions the computer could work with. As software progressed, applications were built that separated the application logic from the data storage logic. This functional differentiation morphed into application and database, with the immortal SQL occupying the space in between. Soon after that there was further differentiation, between interface logic (for the GUI), application logic and data storage logic.
As we have briefly described it, such differentiations seem rational, orderly and inevitable. Rational they surely are, but in reality the process is disorderly, and smoothly functioning, long-lived applications are anything but inevitable. In a perfect world, as innovation occurred and diversification followed, software activities would adhere to generally agreed integration standards. And, of course, there are efforts to make that happen, but they usually fall short and sometimes they fail completely.
Powerful forces of disruption stalk the world of IT. They have two main sources. Most disruptive are fundamental changes to the way computers work. The evolution from punched card and magnetic tape to terminals and spinning disks was dramatic. The introduction of the personal computer was dramatic. The advent of the internet was dramatic. You can think of these as technology earthquakes. They have a habit of bending well-intentioned standards out of shape or sweeping them aside.
And there are lesser tremors that nevertheless upset the applecart. There is always a chain of innovation, like a row of dominoes that fall one against another. Change at the hardware level forces innovation in compilers and operating systems. Hardware, networking or OS innovations have a knock-on effect on database technology. This in turn affects software development tools and eventually applications. And incidentally, innovative change can be introduced at any domino at any time. All the while these components gradually speed up, and out of this combination wholly new applications are born.
Stacks of technology coalesce around specific hardware and operating system platforms. Historically, there is an IBM mainframe stack, a Unix stack, a Windows stack and a Linux stack. In recent years we’ve seen the emergence of a scale-out Hadoop stack. In the mobile market there’s currently an Apple stack and an Android stack. The dominant vendor in the stack does its best to popularize the stack, but as adoption accelerates the technology environment around it grows under its own momentum.
In these IT ecosystems industry standards are complemented and occasionally usurped by “de facto” commercial standards. Whether it is file formats or interface standards matters little. At some point the ecosystem tends to lock users and developers in. There are micro-ecosystems too that surround more niche areas of activity: high-performance computing, semantic applications, games, video processing and so on.
The upshot is that, with the passage of time, most corporate computing environments find themselves with a dastardly mix of these ecosystems. The integration of applications and data between such systems has become burdensome and expensive. And to make matters worse, the collection of technology rarely stays still for very long. Businesses are continually adding new applications and replacing older ones. This inevitably obliges them to get involved in new IT ecosystems, both on premises and in the cloud. Even if they can avoid this, the technology they have sunk their investment in changes beneath their feet. This can happen in a gradual way, but sometimes it’s a step change.
There is a definite cycle here. It begins with innovation, which means differentiation into new areas of technology or its application. The integration problems created soon give rise to standardization efforts or de facto commercial standards. As the technology stack ages its limitations become increasingly visible and onerous. However, this does not necessarily lead to its decline. That is usually provoked when a wholly new platform emerges and the cycle repeats.
This happened with Windows, which gradually evolved from the desktop to the server. It never really challenged the IBM mainframe, but it never blended with it either. The Internet revolution was spawned not by Microsoft but by Unix, principally driven by Sun Microsystems. It led to a web software stack that cared little for Windows or the mainframe. This was in turn usurped by Linux and the open-source style of software it inspired. And recently Hadoop and its rapidly evolving ecosystem stepped into the picture.
The pendulum swings from one technology ecosystem to another and technology users swing with it.
So what can a business do to buffer itself from these technology swings and disparate architectures? A company may not be able to insulate itself completely, but it can certainly invest in researching software architectures that have remained unified and survived across generations of technology as their functionality grew. One example is SAP with its ERP system, which is both remarkably popular and seriously long in the tooth. The SAP ERP architecture was well engineered from the get-go, and SAP never rushed to adopt new ideas (such as data warehouses and data marts). It could not afford to, simply because it had to support such a large customer base. But it eventually did integrate such capabilities.
For the Business Intelligence and Analytics software markets, BOARD International’s unified platform for BI, Analytics and CPM offers another example of this phenomenon. It too has proved its longevity, and right now its broad set of BI and analytical capabilities sports a very modern in-memory architecture. Besides a unified architecture that enables its enterprise-wide functionality, BOARD takes an easy-to-model, programming-free toolkit approach that is proving quite popular, particularly in the Cloud.
In my view, that’s the nature of the IT game. None of these technology disruptions must inevitably lead to terminal entropy. But they can, if technology choices are made too hastily, without due consideration of how all the software components will integrate – not just immediately, but in the long run.
Admittedly, for the likes of BOARD in Business Intelligence and analytics and SAP in ERP, software is their business, and they would be foolish not to focus on the long term. However, there are many examples of software companies that have neglected the long term, and continue to do so, in favor of quick profits and rapid market-share gains. As for the IT user, it pays to think long and hard about technology adoption; there’s a great deal of sense in focusing on integration and a unified architecture. It offers a less painful path to the future.