The triumphant advance of the mainframe began in the 1950s; a milestone was the IBM System/360, introduced in 1964. When client-server technologies emerged a few decades later, some observers believed the end of the mainframe had come: it was rumored that the mainframe would play no role at all in the 21st century.

True, columnists and IT analysts kept heralding the end of the mainframe; but the technology proved to be as robust as it is versatile, all prophecies of doom notwithstanding. Today, if you use a bank account, the health care system, government services, or an insurance policy, there is a good chance the underlying transactions are handled by a mainframe.

This blog post answers three questions: What relevance does the mainframe still have today? What are its specific strengths and weaknesses? And how future-proof is mainframe technology?

By the way, when we talk about the mainframe, we are usually referring to the mainframe from IT pioneer IBM. Companies such as Tandem, Fujitsu, Siemens, Unisys, Hitachi, and NEC also offer mainframes, but the US company IBM clearly dominates the mainframe market with a share of around 90%.

The Relevance of the Mainframe

The mainframe is particularly relevant in industries that need to process a high volume of transactions, among them banking and finance, retail, utilities, healthcare, and government. About 87% of all credit card transactions are processed on mainframes. So it is no surprise that 92 of the world’s 100 largest banks run their core business on mainframes; across industries, the same is true for 71% of Fortune 500 companies.

There are more than 10,000 mainframe computers in use worldwide, primarily at banks and insurance companies. To put the volume of processed transactions into perspective: while Google handles about 68,542 transactions (search queries) per second, the mainframes in operation today process about 1.3 million transactions per second.
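A quick back-of-the-envelope calculation puts that comparison in proportion; note that both figures are the rough estimates quoted above, not precise measurements:

```python
# Rough throughput comparison based on the estimates quoted above.
google_tps = 68_542        # Google search queries per second (estimate)
mainframe_tps = 1_300_000  # transactions per second across all mainframes (estimate)

ratio = mainframe_tps / google_tps
print(f"Mainframes handle roughly {ratio:.0f}x Google's query volume")  # ~19x
```

In other words, the installed mainframe base processes roughly nineteen times Google’s search volume per second.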

By the way: thanks to its superior capacity for (mass) transaction processing, the mainframe is also used for machine learning and blockchain workloads.

Advantages of the Mainframe

The main strength of the mainframe is that it processes large amounts of data highly efficiently, considerably more so than cloud or traditional servers. Conventional servers have kept gaining computing power, but so far the mainframe has maintained its lead.

The IBM mainframe also has the edge when it comes to security. Security in particular is becoming increasingly important in view of rapidly rising cyber attacks and the damage caused by cyber crime (see also the blog post: Chronicle of Cyber Attacks). IBM claims that its mainframes can encrypt data 18 times faster than x86 platforms (the conventional chip architecture).

IBM also claims that the mainframe has a competitive edge when it comes to cost. According to the company’s own data, mainframe systems account for only 6% of IT costs worldwide, even though they handle 68% of IT workloads in production. And the larger the workload, the more cost-effective they are: mainframe technology typically operates at close to 100% CPU utilization, while most Intel-based server farms run at less than 60% utilization, which yields a better return on capital expenditure. Nevertheless, it is true that some IT processes run more cost-effectively on server farms once IBM’s licensing and pricing policies are taken into account; a blanket cost advantage for the mainframe cannot be claimed for all use cases.
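Taken at face value, IBM’s percentages imply a striking cost-per-workload gap. The following sketch only rearranges the two quoted figures; it is an illustration of the claim, not an independent cost analysis:

```python
# Cost per unit of workload, taking IBM's quoted percentages at face value.
mainframe_cost_share, mainframe_workload_share = 0.06, 0.68  # 6% of costs, 68% of workloads
other_cost_share, other_workload_share = 0.94, 0.32          # the remainder

mainframe_cost_per_workload = mainframe_cost_share / mainframe_workload_share
other_cost_per_workload = other_cost_share / other_workload_share
print(f"{other_cost_per_workload / mainframe_cost_per_workload:.0f}x")  # ~33x
```

If the input figures were accurate, a unit of mainframe workload would cost about one thirty-third of a unit elsewhere, which is exactly why the subsequent caveat about licensing and pricing matters.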

Disadvantages and Challenges of the Mainframe

The #1 challenge around the mainframe, as I see it, is not one that affects the technology itself, but one that stems from the legacy software running on mainframes. Specifically: in 1959, the quite powerful COBOL programming language (COmmon Business-Oriented Language) was developed, with IBM among the contributors. Today, mainframes run COBOL programs with an estimated 220 billion lines of code (LOC). Printed double-sided on A4 paper at 30 lines per page, this would correspond to roughly 300 shelf kilometers of files.
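The shelf-kilometer figure can be sanity-checked with a quick calculation. The sheet thickness of about 0.1 mm (typical for standard 80 g/m² A4 paper) is my assumption, not a figure from the original estimate:

```python
# Back-of-the-envelope check of the "300 shelf kilometers" figure.
lines_of_code = 220e9        # estimated COBOL LOC running on mainframes
lines_per_page = 30          # printed lines per page
sheet_thickness_m = 0.0001   # ~0.1 mm per A4 sheet (assumption: 80 g/m2 paper)

pages = lines_of_code / lines_per_page
sheets = pages / 2                            # double-sided printing
shelf_km = sheets * sheet_thickness_m / 1000  # stacked sheet thickness in km
print(f"about {shelf_km:.0f} shelf kilometers")  # ~367 km
```

The result lands in the same order of magnitude as the quoted 300 km; the exact number depends mainly on the assumed paper thickness.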

The problem: the number of mainframe/COBOL developers is continuously decreasing, and in recent years some universities have dropped COBOL courses from their curricula. IBM has therefore launched several initiatives to close this skills gap. In collaboration with the Linux Foundation’s Open Mainframe Project, for example, it promotes interest in and access to COBOL among new programmers. As part of the IBM Z Academic Initiative, the company works with more than 120 U.S. schools (located near IBM customers) to integrate key enterprise computing content into the curriculum; these courses often include an introduction to COBOL.

But the challenge around legacy code remains enormous despite these initiatives: the skills gap is the #1 challenge for companies running mainframes.

Compared with this key challenge, the complaint about slow development cycles appears rather minor. “Development cycles are slow – usually measured in quarters or years,” says Jedidiah Yueh, founder and CEO of Delphix. “Tech giants, on the other hand, release new software thousands of times a year.” A look at the upgrade cycles and months of testing before new software goes live in the financial industry certainly puts this into perspective: the update cycle of an eCommerce website is simply not comparable with the release rhythm of the core applications of corporate IT. Nevertheless, it is true that DevOps and similar practices are becoming increasingly common in the mainframe environment as well.

Conclusion and outlook: The Future of the Mainframe

The continuing high relevance of the mainframe (despite all prophecies of doom) has good reasons, and these reasons will foreseeably remain valid in the future. So much can be revealed in advance: the mainframe’s prognosis is positive. Let’s look at the reasons in detail.

First, as pointed out above, the mainframe has continuously defended its lead in the efficient processing of large amounts of data; it is plausible to assume that it will continue to do so in the future.

Second, despite the aforementioned issue of legacy COBOL software, the mainframe is gradually adapting to new use cases and modern technologies. It can be used for artificial intelligence (e.g. with TensorFlow) and for blockchain. Alongside classic mainframe operating systems such as z/OS, Linux also runs on the mainframe, as do Docker containers. Applications such as Red Hat OpenShift and other open source software are available on mainframes, and development is possible in modern open source languages such as Python, Scala, JavaScript, and C++, as well as with frameworks like Apache Spark ML. This makes the mainframe more attractive to young programmers.

Third, in numerous companies that have grown over decades, the mainframe and its applications form the core of IT and business processes of enormous complexity. “We like to say that the mainframe, legacy, or back-office applications hold an accumulation of 30 to 40 years of business process and regulatory compliance evolution that is nearly impossible to replace,” said Lenley Hensarling, chief strategy officer at Aerospike. “Why would you? Put your money into driving new capabilities that tie into those systems and add real value in terms of customer satisfaction, customer understanding, and increased efficiency in sales, supply chain, and product innovation.”

Fourth, the sales trend as well as analysts’ forecasts suggest that the mainframe has by no means been sidelined. Following the launch of the new IBM z15 mainframe in September 2019, IBM reported revenue growth on the order of 60% for the subsequent quarters. And Gartner predicted that installed mainframe capacity will even increase in the coming years. John McKenny, senior vice president and general manager at BMC, illustrated these numbers with a customer example: “One of our customers in the financial services industry shifted their mainframe outlook from cost reduction to creating a long-term strategy for the mainframe. Realizing the platform’s strength in resiliency, they added a new mainframe for their environment and expanded overall capacity to handle their business-critical applications, while leveraging the cloud to support their front-end applications.”

Anyone asking about the future of the mainframe will also make the following observation: the future of IT infrastructure is hybrid. Companies will neither run software exclusively on-premises nor push all applications to the cloud; they will also use different cloud providers in parallel (hybrid cloud strategy). Public cloud providers such as AWS are gearing up for this hybrid future with offerings such as AWS Outposts. For the mainframe, this means it will continue to handle select IT workloads (transaction processing, blockchain, machine learning, etc.) alongside client-server technology.

The bottom line: no one can predict the future of the mainframe with certainty; the world of technology is too dynamic for that, and this caution applies to more than mainframe technology alone. However, in view of the assessments of numerous experts and observers, one may state: the mainframe will remain part of the IT infrastructure in many large enterprises for (at least) the next decade.


The author is a manager in the software industry with international experience: authorized officer at one of the large consulting firms, responsible for setting up an IT development center at the Bangalore offshore location, and Director M&A at a software company in Berlin.