The success story of the cloud is impressive: key technological foundations were laid in the late 1990s with the so-called multi-tenant software architecture. It made it possible for multiple companies to use a single software application via the browser without being able to spy on each other’s data. A pioneer in commercializing this principle was Salesforce. Only shortly thereafter, in the early noughties, the buzzword Cloud Computing appeared. The next milestone followed in 2006: Amazon Web Services started offering cloud infrastructure services. AWS is one of the four hyperscalers (next to Azure, Google and IBM).
Cloud today is a 214 billion USD market (as of 2019). However, the market potential is far from exhausted. In an interview with the business newspaper Handelsblatt in July 2019, the CEO of AWS, Andy Jassy, claimed that only 3 percent of all IT tasks are handled in the cloud: “In ten or 20 years, most companies will no longer have their own data centers. Only tasks that require proximity will be handled locally in the future – for example in a factory.” Market developments prove him right: while Daimler, for example, migrated its Big Data Analytics to the Azure Cloud, the company keeps data centers at the edge in its factories to guard against failure scenarios. Many companies already rely on Cloud Only instead of Cloud First; DB Systel, for example, has been implementing such a strategy for several years.
Need to refresh your memory on some buzzwords around the cloud? Check the Glossary: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), Software-as-a-Service (SaaS).
Cloud: The advantages
When Cloud Computing emerged, one of the core (marketing) arguments was this: using the cloud is cheaper than operating your own data center. Based on today’s experience, however, this holds true only in special cases. Those who use the cloud should expect the costs across the entire life cycle to be rather higher. The exception: companies with a very uneven distribution of IT workload, which forces them to hold a lot of overcapacity for the peaks. Under the following link you will find two comprehensible cost comparisons between the cloud and a data center. The first example refers to a research cluster, the second describes the use case of a SaaS provider: Cost comparison cloud vs. data center.
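To make the “uneven workload” argument concrete, here is a minimal back-of-the-envelope sketch in Python. All figures (server cost, instance price, peak-to-average ratio) are illustrative assumptions, not numbers from the linked comparison.

```python
# Illustrative cost sketch: when does the cloud win on pure operating cost?
# All figures below are assumptions for demonstration, not real price quotes.

HOURS_PER_YEAR = 24 * 365
cost_per_server_per_year = 3_000     # assumed on-premise cost (amortized CAPEX + OPEX)
price_per_instance_hour = 0.40       # assumed cloud on-demand price per instance hour

def yearly_costs(avg_servers, peak_servers):
    """On-premise must be sized for the peak; cloud pays only for average use."""
    on_prem = peak_servers * cost_per_server_per_year
    cloud = avg_servers * HOURS_PER_YEAR * price_per_instance_hour
    return on_prem, cloud

# Uneven workload: average load of 20 servers, peaks of 100 servers.
print("bursty :", yearly_costs(avg_servers=20, peak_servers=100))
# Even workload: the capacity is fully used around the clock.
print("steady :", yearly_costs(avg_servers=100, peak_servers=100))
```

Under these assumed numbers, the bursty profile clearly favors the cloud, while the fully utilized profile favors the data center; that is exactly the distinction drawn above.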
There is, however, a financial advantage for companies: the costs and investments are distributed differently over time. CAPEX becomes OPEX. High initial investments are eliminated; instead, monthly usage fees are paid, which is convenient for companies in a weak cash-flow situation (this includes, as is well known, start-ups).
The key driver for today’s growth of the cloud market: the cloud provides the advantage of agility, or in other words scalability and flexibility. This is particularly relevant for companies that need to scale up their IT infrastructure in fast-growing markets, e.g. the smartphone bank N26, which at times has activated up to 10,000 new customers per day (sic!). The shortage of skilled IT professionals is another reason for companies to shift their IT infrastructure into the cloud.
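As a purely illustrative example of what “scalability” means in practice, the sketch below attaches a target-tracking scaling policy to an existing AWS Auto Scaling group using boto3. The group name and the CPU target are assumptions for demonstration.

```python
# Minimal sketch: let AWS adjust the number of instances automatically so that
# average CPU utilization stays around a target value. Assumes an Auto Scaling
# group named "web-tier" already exists and AWS credentials are configured.
import boto3

autoscaling = boto3.client("autoscaling")

autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-tier",          # assumed, pre-existing group
    PolicyName="cpu-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization",
        },
        "TargetValue": 60.0,                  # assumed target: 60 % average CPU
    },
)
```

With a policy like this, a sudden wave of new customers simply triggers additional instances, and they are released again once the load drops.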
Here’s yet another reason: if sunk costs of investments in IT hardware and software are eliminated by using the cloud, then time-consuming investment requests and lengthy decision-making processes become obsolete. This may inspire a more innovative culture; it prepares the ground for a “fail fast, fail cheap” corporate culture (or: a start-up culture). Many companies are eager to establish precisely such a “mindset of innovation”, as they are facing the challenge of digital transformation.
SaaS business models also reduce the lock-in effect. Let’s assume a company detects severe issues with a SaaS CRM software one year into using it. There are no sunk license costs in the scenario of changing the provider, so this client company is more likely to look for an alternative than in a scenario where it has purchased a software license. Admittedly, there are still sunk costs for configuration, data migration, training, etc. But you get my point.
A further effect of cloud migration projects: they enforce (at least to a certain extent) the use of automation technologies. If, for example, only little scripting has been used in a company’s data centers so far, this changes immediately when the IT infrastructure is rebuilt in the cloud. Migration to the cloud is therefore an efficient measure to break up encrusted structures and processes in IT departments – and we can assume that companies use cloud migration projects to instill a momentum of modernization and automation into their IT operations.
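To illustrate what “scripting the infrastructure” looks like in practice, here is a minimal sketch that provisions a single virtual machine on AWS with boto3; the AMI ID, instance type and tag are placeholder assumptions.

```python
# Minimal infrastructure-as-code sketch: start one virtual machine via the AWS API
# instead of filing a ticket with the data-center team. The AMI ID below is a
# placeholder; AWS credentials and region are assumed to be configured.
import boto3

ec2 = boto3.client("ec2")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "project", "Value": "cloud-migration-demo"}],
    }],
)
print("Started instance:", response["Instances"][0]["InstanceId"])
```

The same idea scales up to declarative tools such as Terraform or CloudFormation, but even a small script like this replaces a manual, ticket-driven process with a repeatable one.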
Cloud: Costs and disadvantages
I have already covered the cost and commercial aspects of operation in the previous chapter. Let me complete the picture: especially companies with only narrow network bandwidth should be prepared to invest in better connectivity when switching to cloud-based services.
Anyone migrating an IT infrastructure to the cloud naturally needs a sufficient number of infrastructure specialists. In addition, adaptations to the core applications, that is, to the programming code, are also required. This includes removing hard-coded paths, changing the connectivity to external systems, and adjusting the process for upgrading libraries. If a software is migrated to the cloud with only a minimum of changes, we talk about Lift and Shift (or: Rehosting). The so-called Replatforming, also known as Lift, Tinker and Shift, involves higher costs: some optimizations and adjustments are made to the software, but the core architecture of the application is not changed.
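As a small illustration of the first point, removing hard-coded paths: the sketch below replaces a fixed local file path with configuration read from the environment, a common first step in a rehosting or replatforming effort. The variable names and defaults are hypothetical.

```python
# Before: a hard-coded path that only exists on one specific server.
# EXPORT_DIR = "D:\\apps\\crm\\exports"
#
# After: the location is injected via configuration, so the same code runs
# on-premise, in a container, or in the cloud. Names below are hypothetical.
import os

EXPORT_DIR = os.environ.get("EXPORT_DIR", "/var/data/exports")
DB_URL = os.environ.get("DATABASE_URL", "postgresql://localhost:5432/crm")

def export_path(filename: str) -> str:
    """Build an output path from configuration instead of a hard-coded drive."""
    return os.path.join(EXPORT_DIR, filename)

print(export_path("report.csv"))
```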
If an existing software is optimized for the cloud – or even rewritten – it is called Re-Architecting. The result is a Native Cloud Application (in short: NCA), which enables optimal scaling and easier upgrades. The prerequisite for this is a software architecture based on Microservices. That means: the software application is highly modularized. Each module constitutes a small service that runs as a separate process (with its own database and storage). The services communicate with each other via APIs. Since these microservices are independent of each other, they can run in distributed environments. In addition, each microservice can be written in its own programming language. Microservices can easily be scaled in a container infrastructure. If one of them fails, this usually affects only that module, not the entire system.
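To make the idea tangible, here is a minimal, hypothetical microservice: a single module running as its own process and exposing a small HTTP API, sketched with Flask. Endpoint names and port are assumptions.

```python
# Minimal sketch of one microservice: a self-contained process exposing a small
# HTTP API. In a real system it would own its data store and be packaged in a
# container; endpoint names and port are hypothetical.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/customers/<customer_id>", methods=["GET"])
def get_customer(customer_id):
    # In a real service this would query the service's *own* database.
    return jsonify({"id": customer_id, "status": "active"})

@app.route("/health", methods=["GET"])
def health():
    # Health endpoint used by the container platform for liveness checks.
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

Other services (billing, notifications, and so on) would be separate processes with their own APIs, which is what allows them to be scaled, deployed and even rewritten independently.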
The re-architecting of a formerly monolithic application (in short: the migration to a microservice-based architecture) involves considerable effort. Although a monolithic architecture also has advantages (e.g. performance), it comes with clear disadvantages with regard to continuous integration and delivery. Anyone switching to microservice-based software architectures usually has to overcome further challenges: an adapted development process, accompanied by a new communication culture. Testing also becomes more complex.
Last but not least, the usage of the cloud in a company comes with the emergence of a shadow IT. Before the era of Cloud Computing, any IT requirement forced a business department to go through the company’s internal IT department. Today, a marketing department, for example, can use SaaS software such as Salesforce or SugarCRM without having to rely on the IT department. Companies have to find a practicable answer to this dynamic: they need some flexibility on the one hand, but they also need to ensure a certain level of compliance with group guidelines on the other. And as we are entering the era of Big Data, there are huge benefits in having a consolidated data pool rather than many scattered IT applications with disconnected databases. One possibility is, for example, “Rapid Reaction Teams” of specialists from the IT department who work together with a business department.
Cloud: Security
Last but not least, here are a few thoughts on security. From a purely technological perspective, migrating data into the public cloud of a hyperscaler should increase security. Why?
Firstly, as a specialist for this service, a hyperscaler will generally have more security experts on its payroll than its customers, not least because a successful hacker attack or security incident directly threatens its business model. In practice, this means a very strong incentive to maintain high security standards.
Secondly, the multi-tenant architecture actually makes it more difficult to find data: when data is stored on-premises, the IP address serves as a simple pointer for hackers to locate it. In the public cloud, on the other hand, there is no such fixed point of reference.
Thirdly, about 60 percent of all data leaks are committed by insiders. The public cloud has the advantage here that data taps always leave traces, which serves as a deterrent or at least allows easy tracking.
This comparison naturally only holds true if you compare a private cloud (i.e. hosting a web-based application in your own data center) with a public cloud. There is of course also an (albeit declining) number of on-premise applications that are not remotely accessible via the Internet; these naturally have a much lower vulnerability to cyber attacks. Migrating these applications to the cloud (whether to the private or the public cloud) obviously increases the security risk significantly.