11 Feb The Next Generation of Enterprise Infrastructure (Hint: It’s no longer about infrastructure)
As the cliché goes, software is eating the world. In its wake, every business, whether high tech or not, is ultimately a software business. The rate at which a business can deploy and operate software applications is a direct indicator of its growth and competitiveness in delivering new products and services.
Enterprise software and its life cycles are complex. A few high-tech firms in Silicon Valley have built out software capabilities, with modern applications and environments, that allow them to dominate their respective industries. While those practices can be replicated using open software and modern models, most enterprises are too deeply tied to legacy applications and environments to adopt them.
The world is more interesting than the well-behaved 12-factor apps of Silicon Valley. There is a whole swath of companies out there representing traditional industries, such as finance and health care, with a zoo of existing application monoliths that long predate modern microservices. They come in a variety of forms (e.g., Linux, Windows) and are packaged into a variety of form-factors (e.g., bare metal, virtual machines, containers). Operating those suckers is fraught with inefficiencies. Just keeping the lights on takes up most of the budget.
Disaggregating monolithic systems has been the dominant approach to addressing the stubborn challenges of legacy applications and infrastructure over the last couple of decades. Well before the paradigm of breaking up software into multiple loosely coupled microservices, virtual machines drove the first wave of disaggregation, which occurred between software and infrastructure.
Virtual machines introduced a layer of separation between software applications and the underlying physical machine infrastructure. That separation brought unprecedented levels of utilization and flexibility to otherwise rigid infrastructure. Virtual machines carried a substantially larger footprint than the applications they hosted, but applications deployed on virtual hardware could be brought up and moved from host to host with relative ease.
A similar disaggregation soon extended to storage and network with software-defined storage and software-defined networking paradigms. With those, software was decoupled from infrastructure on all three fronts — compute, storage and network. To some extent, that completed the first level of infrastructure reformation.
We are now amid the next level of transformation, with the focus shifting to application-aware primitives such as containers. Containers decouple applications from infrastructure at just the right layer in the software stack to be transparent both to applications running on top and to infrastructure at the bottom. They maintain the right balance, staying simple and lightweight while remaining application-agnostic. They operate at a higher level than virtual machines, which removes the need to carry a guest operating system within the abstraction; at the same time, they sit strictly below the application, so no change to the application, or even its configuration, is needed. There is charm in being able to rapidly bring up an application with all its dependencies on any infrastructure target literally with one command.
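That one-command experience can be sketched with a container runtime such as Docker (an illustrative example; the image name and port mapping here are hypothetical, not something the article specifies):

```shell
# Hypothetical sketch: bring up an application, with all its
# dependencies packaged into the image, on any Docker-capable host
# using a single command. No guest OS is carried along, and the
# application itself needs no changes.
docker run -d --name web -p 8080:80 nginx:1.25
```

The same command works unchanged on a laptop, a bare-metal server, or a cloud virtual machine, which is precisely the infrastructure transparency described above.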
Read the Full Article at Forbes!