Juan Diego Echeverri Mesa
As the title suggests, the purpose of this article is to give a brief introduction to the famous software containers and the reason for their well-earned fame. Before entering the world of containers, however, I want to recall some of the attributes by which quality software has been judged over time.
The first of these is Flexibility, defined as the ability to add, modify, or remove functionality without damaging the system. Change is inevitable in software development; shifting requirements and the iterative growth of software solutions guarantee it, which makes this one of the most relevant properties of quality software. Very close to Flexibility is Maintainability, although the latter focuses more on resolving incidents than on product changes.
On the other hand, we find Performance, defined in terms of the software's response time, which in turn must account for the cost of resources and therefore rest on their appropriate use. Hand in hand with Performance comes Scalability: the ability of the software to adapt to performance needs as its load (users and/or transactions) grows. Scalability can be addressed by adding hardware (vertical scaling), which generally provides only a temporary solution, or, in the ideal scenario, by establishing a network of nodes working as a whole (horizontal scaling).
Then there is Availability: as its name says, the software must remain available and be able to recover even if some of its components fail (Fault Tolerance), and its information and behavior must remain consistent (Reliability).
Finally, Compatibility and Portability appear, which hold that quality software should work on as many platforms as possible. Quality software also needs to be verifiable: the possibility of testing it under conditions close to the production environment must be as high as possible (Testability), and in those environments the software must be easy to administer and configure after deployment (Manageability).
Beyond the aspects above, there are other attributes of quality software; I have decided to skip them here because they do not apply as directly to the world of containers. Later we will see why this article starts from this perspective.
What are Software Containers?
To explain what software containers are at a simple level of abstraction, the clearest real-world analogy is, in fact, that of the shipping containers we normally see transported by boat from one place to another. What matters most in this analogy is their modular form, which makes them easy to store and move. Can you imagine the cost of moving all the contents that normally travel inside a container from one ship to another every time some eventuality arose? Very high, surely!
Currently, in the world of computing, and specifically in software development, there is a lot of talk about software containers. A software container is a logical package of elements (files, variables, libraries, and even code) that allows an application to run correctly regardless of the operating system on which it is deployed, including the scenario in which its runtime environment changes.
All of the above is achieved through a configuration file containing a lightweight description of the container's execution characteristics, with the added advantages of being easily versioned, reused, and replicated. That configuration file is enough to tailor the runtime environment and configure the server where the application will be deployed; from it, an image can be generated and deployed to a server in seconds.
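A minimal sketch of such a configuration file, assuming Docker and a hypothetical Node.js application (the file names and port are illustrative, not from the article), might look like this:

```dockerfile
# Hypothetical example: packaging a Node.js app as a container image.
# The base image provides the runtime, so the host OS no longer matters.
FROM node:20-alpine

# Everything the app needs (files, libraries) travels inside the image.
WORKDIR /app
COPY package*.json ./
RUN npm install --production
COPY . .

# Execution characteristics: exposed port and startup command.
EXPOSE 3000
CMD ["node", "server.js"]
```

Because this is a plain text file, it can be versioned, reused, and replicated like any other source file; `docker build -t my-app .` turns it into an image, and `docker run -p 3000:3000 my-app` deploys that image in seconds.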
As you can see, the concept is very similar to virtualization, although not identical: there are some very significant differences. Virtual machines virtualize the computing infrastructure (resources), while containers virtualize the software infrastructure for applications. Moreover, containers use the operating system of the machine they run on instead of providing their own, which makes them more efficient, lightweight, and versatile.
A quick look from the perspective of Software Architecture
Until a few years ago, the world of applications was dominated by monoliths: a set of elements (UI, business layer, data access, etc.) that had to be developed, deployed, and managed as a single component; in other words, they were fully coupled. This made scalability difficult, or even impossible; a developer had to install and test the entire solution for a minimal change, and going to production was inevitably a headache.
A little less than a decade ago, the concept of microservices-based software architecture appeared: small applications, each with a particular responsibility, that communicate with one another to offer a complete solution. This allows lighter environment configuration, development, and deployment, and it provides greater reliability and availability, since when the software must be modified, only the microservice responsible for the affected functionality is changed and deployed. This brings us closer to the concept of containers we have been discussing.
You may wonder: what do microservices have to do with containers? A lot! The different components of an application built under a microservices architecture must be orchestrated, that is, connected to one another so that, as is their purpose, they can act as a whole.
The way of developing and releasing applications naturally changed with the arrival of microservices, which in turn changed how the infrastructure for each component is managed. Previously, a single component accessed the information in the database, processed it, integrated it with other applications, presented it to the user, and so on; now several components are in charge of carrying out each of those chores, or several of them. Given this, containers provided the possibility of encapsulating the microservices, simplifying the configuration, parameterization, maintenance, and administration of the infrastructure of different components and applications with similar characteristics. In summary, containers let you create infrastructure templates and minimize the configuration effort for each group of applications, even automating some tasks.
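As a sketch of what this orchestration can look like in practice, here is a hypothetical Docker Compose file; the service names and images are illustrative assumptions, not taken from the article:

```yaml
# Hypothetical example: three microservices orchestrated as a whole.
services:
  api:                          # component that presents data to the user
    image: my-company/api:1.0
    ports:
      - "8080:8080"
    depends_on:
      - orders
  orders:                       # component in charge of business processing
    image: my-company/orders:1.0
    depends_on:
      - db
  db:                           # component that stores the information
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example
```

A single `docker compose up` brings the whole group online in the right order, which is exactly the "infrastructure template" idea described above.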
As you can see, containers today play a very important role in the architectural evolution of software and in how its development and deployment are carried out.
The concept of containers is not new, although in our environment it may still seem so. So why has it caught on so strongly in recent years? Because software containers are very useful from application development onward, but their value is not limited to that: they also have an enormously positive impact on the efficiency of migrating from one platform to another, on administration before, during, and after deployment, and on the transition from a development environment to a production one.
In a software development setting far from the world of containers, there is a host of additional challenges that the development and operations teams must contend with throughout a project and even after its completion. One of them is maintaining a standard development environment that works well across different machines, and, surprisingly, even on the same machine when a reinstallation or reconfiguration is required for some reason.
Thanks to software containers, developers can isolate the execution environment of their application from their machine's operating system configuration, which is also a valuable aid in scenarios such as onboarding new members to the development team. The long times previously needed to set up a development environment (days or weeks) are now reduced to minutes.
On the other hand, the transition between environments is one of the biggest headaches for any development and operations team. Generally, the developer has a local environment suited to the correct operation of the application, but over time some of its specific configurations may be forgotten, which represents a high risk when promoting to a second environment, whether quality assurance or, worse, production. That is where our popular battle cry usually appears: "it works on my machine."
Some companies have a large number of environments for the same application (laboratory, development, testing, certification, QA, UAT, pre-production, production, among others). Can you imagine the effort each transition takes on the way to production? All of this disappears with software containers, since they let you replicate an application anywhere without requiring the target environment to comply with the application's particular configurations. This also allows more applications to run at the same time on the same server, each with its own configuration, without creating contention between them.
Advantages of Software Containers: What Are They For?
As I have already mentioned, software containers bring a very interesting portability component: once the application is inside a container, you can move it easily, quickly, and almost transparently to whichever operating system is considered most convenient at any given time. The differences between operating systems, which cause problems for applications when changing environments, almost entirely disappear.
Containers are a great bet for how applications are deployed on the technological infrastructure, regardless of the environment in which it is configured, and they are very useful for automating deployment processes, which in turn is an extremely important trend in today's world of software development.
Environment configuration time is drastically reduced from the development phase onward. Additionally, developers work in environments close to QA and production, which even lets them test the application's behavior under features provided by the container and anticipate certain conditions or behaviors of the deployment environment. Similarly, applications can be distributed as separately deployable components, allowing developers to work on individual components rather than on the whole solution.
On the other hand, thanks to the isolation containers provide, each container is separately responsible for the deployment and configuration of the software packaged inside it, which reduces dependence on operating system administration.
From the server's point of view, there is greater visibility and manageability of physical resources, since applications running inside a container can only see the information and devices enabled for them. This gives system administrators greater resource control and the ability to scale rapidly, as well as a significant reduction in server load, making it possible to deploy a greater number of applications if required.
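This resource control can be expressed in the container configuration itself. As a hedged sketch, assuming Docker Compose (the service name and image are hypothetical, and `deploy.resources` support varies by Compose/Swarm setup), a service's CPU and memory can be capped like this:

```yaml
# Hypothetical fragment: capping a service's resources so the
# administrator decides exactly what each container may consume.
services:
  my-app:
    image: my-company/my-app:1.0
    deploy:
      resources:
        limits:
          cpus: "0.5"     # at most half a CPU core
          memory: 256M    # at most 256 MB of RAM
```

With limits like these in place, many applications can share one server without any of them starving the others.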
Finally, the operations and development teams have independent, well-defined responsibilities. Development teams can focus on the application and its dependencies, while operations teams focus on managing the infrastructure without worrying about the applications. That responsibilities are defined does not mean the teams work in complete isolation; in fact, in methodologies such as DevOps, for which containers are of great value, the intention is precisely that they work together, each with clear duties, so that they spend less time identifying differences between the multiple deployment environments and more time delivering new functionality.
Almost by definition, software containers provide many significant benefits based on isolation, portability, performance, scalability, and control throughout the entire application life cycle. Not only do they make life easier for us in development, but they also help us achieve, almost implicitly, the software quality attributes mentioned at the beginning.
I invite you to learn a little more about the world of software containers in practice and to see for yourself the value they add to the development and deployment of applications, and therefore why they are so well received.
See you in the next installment of this and other topics of interest.