Stefan Lieser

DevOps CI/CD with Docker

Update 18.11.2024: I have added dependencies between the containers to the diagram for the third model "Multiple containers with external dependencies". Thanks to Ralf Westphal for the hint.

Throw it over the fence

As a developer, I can hand over the results of my work to operations, who then take care of everything else. That's great, because I no longer have to deal with the annoying stuff. The picture: I throw my stuff over the fence and operations does the rest. Cool? Not at all!

By DevOps I mean close cooperation between development and operations. As a developer, I also take the operation of the software into consideration. In an agile environment, operation doesn't only begin after months or years of development, but with the first thin end-to-end slice. It is important to stay as close as possible to the real operation of the software right from the start and to develop this aspect along with everything else. This allows potential operational problems to be identified at an early stage. It also enables the product owner to provide feedback. And operations is put in a position to do its part of the work, such as setting up the monitoring of the software.

DevOps

Initially, the software system may only be put into operation internally. However, everything should be as close to the future "real" operating environment as possible. If you're wondering how this works in practice, the key element is called Docker.

Docker and the monolith

Before we go into further detail: Docker is often mentioned in connection with microservices. But monoliths can also be deployed as Docker containers. This holds regardless of whether the monolith is well structured internally or whether it is a "chaotic" legacy system. Not every software system should be split up into microservices. The necessity must arise from the requirements. Just because we developers think something is cool is no reason to try out all the buzzwords on the real system.

If you create a Docker image for your monolith (or microservices), this can drastically simplify deployment. The days when we copied directories to deploy an application are over. Deployment must be possible at the push of a button and, above all, must be continuous. If you don't "believe" in the great benefits of continuous deployments: the book Accelerate gives sound advice!
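To make this concrete, here is a minimal Dockerfile sketch for packaging a monolith as a single image. The base image, jar name, paths and port are assumptions for illustration; your own build artifact will look different.

```dockerfile
# Hypothetical Dockerfile for a Java monolith packaged as one container.
# Base image, jar name and port are illustrative assumptions.
FROM eclipse-temurin:21-jre

WORKDIR /app

# Copy the build artifact produced by the CI pipeline into the image.
COPY target/shop-monolith.jar app.jar

# The port the application listens on inside the container.
EXPOSE 8080

ENTRYPOINT ["java", "-jar", "app.jar"]
```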

Another advantage: you can simplify automated testing with a Docker image of the application. The application can thus also be executed on the developer machines. This is also an important step on the way to continuous integration, as integration tests can then also be executed on the CI server. These run against the same container that is later deployed. This means that the integration tests are as close as possible to the real deployment.

Why is Docker so well suited for CI/CD?

  • Containers bring along all of their dependencies within the operating system environment.
  • A reproducible environment is achieved. This makes it very easy to move to another server.
  • The integration of external dependencies such as MySQL, PostgreSQL, RabbitMQ etc. is simplified.
  • Deployment of the containers on the server(s) is significantly simplified.
  • With TestContainers, automated unit and integration tests can be created (see the sketch after this list).
  • The automated tests can also be executed on the Continuous Integration System.
  • Docker containers start very quickly.
  • Thanks to their layered structure, Docker images remain comparatively small, because unchanged layers are shared and reused.
  • Docker containers can also run on the developer machines so that the developers can work very close to the real environment.
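
As an illustration of the TestContainers point above, here is a minimal sketch of an integration test that starts a throwaway PostgreSQL container. It assumes Java with JUnit 5 and the Testcontainers PostgreSQL module; the class and method names are invented for the example (the same idea exists for other platforms such as .NET).

```java
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;

@Testcontainers
class DatabaseSmokeIT {

    // Testcontainers starts this PostgreSQL container before the tests run
    // and disposes of it afterwards; the version tag is pinned deliberately.
    @Container
    private static final PostgreSQLContainer<?> postgres =
            new PostgreSQLContainer<>("postgres:16-alpine");

    @Test
    void canConnectToDatabase() throws Exception {
        // Testcontainers maps the container port to a random host port
        // and provides the matching JDBC URL and credentials.
        try (var connection = java.sql.DriverManager.getConnection(
                postgres.getJdbcUrl(), postgres.getUsername(), postgres.getPassword())) {
            Assertions.assertTrue(connection.isValid(2));
        }
    }
}
```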

The different models

When building a software system, we can distinguish between three models based on the number of containers:

  • 1 container without external dependencies
  • 1 container with external dependencies
  • Multiple containers with external dependencies

External dependencies are containers that our software system uses but that we do not create ourselves. These include, for example, MySQL, PostgreSQL, RabbitMQ, Keycloak and many others. The big advantage of using Docker is that we can provide these dependencies easily, because they are also available as containers. The desired version is also selected explicitly, so that the software system does not behave unexpectedly because a different version is "accidentally" used. As you can already see, Docker is all about the reproducibility of the environment.

In the following, we take a closer look at the three models.

1 container without external dependencies

[Diagram: 1 container without external dependencies]

In this model, our software system consists of a single container. We create the corresponding image in the build process. This container is self-contained, i.e. it has no dependencies on other containers, neither its own nor external ones.

Such simple systems are the exception rather than the rule, but they are a very good starting point because they are very easy to handle during deployment. The issue of persistence could be solved via the file system, for example: the application simply saves its data in files. To ensure that this data is not lost when the container is restarted, the corresponding directories are mapped to the host operating system via volumes.
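
As a sketch of such a volume mapping, the following command starts the container with a host directory mounted into it. The image name and paths are illustrative assumptions.

```bash
# Map a host directory into the container so that the file-based data
# survives container restarts and re-deployments (names are made up).
docker run -d \
  --name shop \
  -p 8080:8080 \
  -v /srv/shop/data:/app/data \
  shop-monolith:1.0
```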

It is advisable to start with such a simple solution and gain experience with it. As it is less complex, not all Docker concepts need to be mastered.

1 container with external dependencies

[Diagram: 1 container with external dependencies]

The next step is a software system that still consists of only one container, but has external dependencies on other, third-party containers. In this way, persistence, for example, can be realized via an SQL or NoSQL solution.

When deploying such a model, Docker Compose comes into play. It allows several containers to be deployed together. The dependencies between the containers are also defined, to ensure that they are started in the correct order. In addition to the volumes that were already needed, networks now appear as a new topic: they enable the communication between the containers.
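
A minimal docker-compose.yml sketch for this model might look like the following. Service names, credentials and version tags are illustrative assumptions.

```yaml
# One custom application container plus PostgreSQL as an external dependency.
services:
  app:
    image: shop-monolith:1.0        # the image built by our pipeline
    ports:
      - "8080:8080"
    depends_on:
      - db                          # ensures the database is started first
    networks:
      - backend
  db:
    image: postgres:16              # version pinned deliberately
    environment:
      POSTGRES_PASSWORD: example
    volumes:
      - db-data:/var/lib/postgresql/data
    networks:
      - backend

volumes:
  db-data:

networks:
  backend:
```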

Multiple containers with external dependencies

[Diagram: Multiple containers with external dependencies]

The final step is to assemble your own software system from several containers. Here too, Docker Compose is used to define which containers are to be started. Since all custom images now have to be provided, the build process must take this into account, which makes it more demanding.
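
A sketch of what this could look like in Docker Compose, assuming a hypothetical split into a frontend and a backend service plus a message broker; the directory layout and names are invented for the example.

```yaml
# Two custom services built from local Dockerfiles plus RabbitMQ as an
# external dependency; the build process must produce both custom images.
services:
  shop-frontend:
    build: ./frontend               # Dockerfile in ./frontend
    ports:
      - "8080:8080"
    depends_on:
      - shop-backend
  shop-backend:
    build: ./backend                # Dockerfile in ./backend
    depends_on:
      - rabbitmq
  rabbitmq:
    image: rabbitmq:3-management    # version pinned deliberately
```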

A small side note: not all software needs to be split across several containers. A well-structured monolith is still a possible solution for many problems.

Scaling

Another advantage of Docker is that we can also start multiple instances of an image. On the external dependencies side, for example, several instances of PostgreSQL can be started, even in different versions. This is necessary if an external dependency itself consists of several containers. For example, Keycloak requires a PostgreSQL database. If we use Keycloak in our software system and also want to use PostgreSQL ourselves, it makes sense to work with two instances of PostgreSQL. Then, on the one hand, the data is stored separately and, on the other hand, different versions can be used.
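
A sketch of the Keycloak scenario with two separate PostgreSQL instances could look like this; image versions, names and credentials are assumptions for illustration.

```yaml
# Keycloak with its own PostgreSQL instance next to the application's database.
services:
  keycloak:
    image: quay.io/keycloak/keycloak:25.0
    command: start-dev
    environment:
      KC_DB: postgres
      KC_DB_URL: jdbc:postgresql://keycloak-db:5432/keycloak
      KC_DB_USERNAME: keycloak
      KC_DB_PASSWORD: secret
    depends_on:
      - keycloak-db
  keycloak-db:
    image: postgres:15              # dedicated instance just for Keycloak
    environment:
      POSTGRES_DB: keycloak
      POSTGRES_USER: keycloak
      POSTGRES_PASSWORD: secret
  app-db:
    image: postgres:16              # separate instance (and version) for our own data
    environment:
      POSTGRES_PASSWORD: example
```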

Multiple instances of a container can also be used for load balancing. Such horizontal scaling can be carried out with Kubernetes, for example.
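
For a first impression of horizontal scaling without a full orchestrator, Docker Compose can start several instances of a service; the service name is taken from the earlier sketches and is an assumption.

```bash
# Start three instances of the "app" service; a reverse proxy or an
# orchestrator such as Kubernetes would then distribute the load.
# Note: the service must not publish a fixed host port for this to work.
docker compose up -d --scale app=3
```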

Conclusion

Docker is a very powerful tool that has become indispensable in the areas of continuous integration and continuous deployment. Docker also offers many advantages for the automation of tests, as I have shown in this blog post about the use of TestContainers.

However, I don't want to hide the fact that CI/CD with Docker comes with a certain learning curve. This is where our training courses can help. If you are interested, please contact us.

Our seminars

Clean Code Developer Basics

Principles and tests - The seminar is aimed at software developers who are just starting to deal with the topic of software quality. The most important principles and practices of the Clean Code Developer Initiative are taught.

Clean Code Developer Trainer

Conducting seminars as a trainer - This seminar is aimed at software developers who would like to pass on their knowledge of Clean Code Developer principles and practices or Flow Design to others as a trainer.

