Docker Enterprise on Tour


A couple of days ago I attended the Docker Enterprise on Tour event at the Hofspielhaus in Munich. It was an unusual but good venue for such an event, and everything was very well organized. I have been interested in container technology, and Docker in particular, for some time. I had heard a lot about it and wanted to find out whether it is just another overhyped buzzword or a real game-changer for development and operations. After the event I reached out to two experts in this field to separate marketing from reality. My conclusion: Docker is not an overhyped buzzword; it can improve operational efficiency considerably.

Let us take a deeper look and explain it in simple terms by drawing an analogy to logistics, an industry that adopted the concept of containers around sixty years ago. This illustrates one of BrazzoTech’s core beliefs: we should always look at other fields, because there we will find concepts that can be applied to our own. In this case it was IT that learned from a well-established concept in logistics.

 

Industry Experts

Thorsten Aye and Reiner Rottmann are both experts in DevOps, container technology and Docker in particular, but they come from different backgrounds: Thorsten from software development, Reiner from a broad infrastructure and operations perspective. That is interesting because it shows that Docker benefits software development as well as operations, which is crucial for implementing DevOps. Here at BrazzoTech we aim to explain technology in simple terms, so this article will not go into every detail. If you are interested in learning more about Docker or plan to implement it, please reach out to Thorsten Aye and Reiner Rottmann for more information.

 

A Bit of History

The concept of containerization is not new and was definitely not invented by Docker. As Reiner put it: “IT archaeologists found evidence as early as the 1970s.” This is actually true: the roots of containerization go back to the chroot mechanism in late-1970s Unix, and operating systems such as Sun Solaris later turned the idea into fully fledged operating-system-level containers. However, as with all inventions, timing is crucial, and in the 1970s the IT ecosystem was not ready for containerization. This changed roughly ten years ago. IT operators struggled with different hardware, different operating systems, different databases and so on. One answer was virtualization, which brought tremendous savings and, to some extent, independence: it enabled IT operators to run different virtual machines with different operating systems, databases and applications on the same infrastructure.

 

Dependencies as Boundaries

However, some dependencies remained. For example, every virtual machine runs exactly one operating system, so the applications inside it had to be developed for that operating system, and there were further dependencies of this kind. Developers had to be aware of them and account for them when writing software, which obviously took more time and therefore cost more money. Ideally, software should run on a developer’s PC in exactly the same way as on a bare-metal server, a virtualized server or a cloud server. This is where Docker came into play in 2013.

 

The User Perspective

Docker addressed the topic of platform independence and therefore tore down several boundaries for software developers. Thorsten put it as follows:

“Docker gives the ability to quickly build and ship applications. Since Docker containers are portable and can run in any environment (with Docker Engine installed on physical, virtual or cloud hosts), developers can go from dev, test, staging and production seamlessly, without having to recode. This accelerates the application lifecycle and allows to release applications more often. IT ops teams have the ability to manage, deploy and scale their ‘Dockerized’ applications across any environment. The portability of Docker containers allows you to migrate workloads running in AWS over to MS Azure, without having to recode and with no downtime. You can also migrate workloads from their cloud environment, down to their physical datacenter, and back.”
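
To make the “build and ship” idea a bit more concrete, here is a minimal sketch of a Dockerfile for a small Python web service. The file names app.py and requirements.txt and the port number are hypothetical placeholders for illustration, not something shown at the event:

    # Minimal Dockerfile sketch for a small Python web service
    # (app.py, requirements.txt and port 8000 are illustrative placeholders)
    FROM python:3.11-slim

    # working directory inside the container
    WORKDIR /app

    # install dependencies first so this layer is cached between builds
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt

    # copy the application code into the image
    COPY . .

    # document the port the service listens on and define the start command
    EXPOSE 8000
    CMD ["python", "app.py"]

Built once with “docker build -t my-service .” and started with “docker run -p 8000:8000 my-service”, the resulting image runs the same way on a laptop, a virtual machine or a cloud host, as long as the Docker Engine is installed.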

Reiner explains the advantages in a very similar way:

“Docker made it very easy to use and thus it became a hot topic for DevOps people. In my opinion, the reasons for Docker’s success can be found in two things: First of all, Docker standardizes the process to ship code together with runtime modules, system tools and libraries. It runs on your notebook? Great! With Docker it will most certainly run everywhere. Secondly, Docker enables developers to achieve more in less time. For example, there is a huge collection of pre-made containers available on Docker Hub. A developer now finds ready-to-use software products and does not need to dig too deep into installation and configuration manuals. This improves the speed to market!”
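
Reiner’s point about pre-made images on Docker Hub is easy to try out. As a sketch (the container name and password below are made-up placeholders), a single command pulls the official PostgreSQL image and starts a ready-to-use database, with the configuration passed as environment variables instead of working through an installation manual:

    # Pull the official PostgreSQL image from Docker Hub and start it in the background;
    # container name and password are illustrative placeholders
    docker run --name demo-postgres \
        -e POSTGRES_PASSWORD=example-secret \
        -p 5432:5432 \
        -d postgres:16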

An Analogy

So far we have seen that Docker definitely has its place in today’s IT ecosystem. But would it not be good to explain Docker’s functionality in a less technical way, one that people outside of IT can relate to? Indeed, Docker, or containerization, can be explained with the analogy that gave it its name: containers, used everywhere in logistics. Imagine a port around 100 years ago. At that time big trading companies were already shipping all kinds of goods around the world. Depending on the type of goods, different types of vessels and different types of cranes were needed to load and unload them. To transport different kinds of goods, the trading companies had to operate different types of vessels and the ports had to operate different types of cranes. This was neither efficient nor cheap, and it was not easy to move the goods onward from the ports to other destinations.

 

The Intermodal Shipping Container

To solve this logistics problem, Malcolm McLean developed the intermodal shipping container in the late 1950s. The system soon became an international standard. Its overwhelming success rests on factors similar to those Reiner and Thorsten described above.

  1. Because the containers were standardized, they could be transported by vessel, train or truck; it simply became irrelevant what type of vehicle carried them. In the same way, it is irrelevant for IT containers whether they run on a developer’s PC, a bare-metal server, a virtual server or in a cloud environment.
  2. Another advantage of the standardized dimensions is that containers can be relocated quickly and easily from one environment (e.g. a vessel) to another (e.g. a train). For IT containers this corresponds to what Thorsten mentioned above: “developers can go from dev, test, staging and production seamlessly”.
  3. The container system enabled trading companies to use larger vessels far more efficiently; modern vessels can carry up to 20,000 containers at once, which was not possible before containerization. Again, the same applies to IT containers: a much higher workload can be run on the same infrastructure than before, even compared with virtualization.

Challenges with Docker

The analogy may not be accurate in every detail, but I am convinced it illustrates the advantages of containers in a very simple way. So, all good? Does containerization offer only advantages? Like every technology, containerization has its limitations. I asked the experts what they believe potential users should be aware of. Reiner outlines some security and architectural challenges:

“Docker is also a two-edged sword. While the creation of a container is quite easy, there are several pitfalls to avoid. By using pre-made or by building upon existing containers, you have to trust its maintainer that the application got configured with security in mind and that identified vulnerabilities get fixed fast. Then there is the ‘Not Invented Here’ anti-pattern. Sometimes companies neglect that in the Cloud Age you already have a plethora of services to choose from, and try to use their own containers for all dependencies (often because they are afraid of vendor lock-in). They waste a lot of time when they have to scale their applications, and complexity increases even for seemingly easy topics. For example, the persistence of a database gets very interesting when you need a highly scalable and/or highly available infrastructure. Some companies have been there and done that and now even offer their solution as a ready-to-use Cloud product.”
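
Reiner’s database example touches a real pitfall: by default, data written inside a container disappears when the container is removed. A common single-host way to address this is to mount a named volume for the database files. Below is a minimal sketch with made-up names, deliberately ignoring the high-availability concerns Reiner mentions:

    # create a named volume and mount it at the path where the official
    # PostgreSQL image stores its data, so the data survives container removal
    docker volume create pgdata
    docker run --name demo-postgres \
        -e POSTGRES_PASSWORD=example-secret \
        -v pgdata:/var/lib/postgresql/data \
        -d postgres:16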

Thorsten states that expectations, especially in terms of speed, should be managed:

“Containers do not run at bare-metal speeds. Containers consume resources more efficiently than virtual machines. But containers are still subject to performance overhead due to overlay networking, interfacing between containers and the host system and so on. If you want 100 percent bare-metal performance, you need to use bare metal, not containers.”

Virtualization vs. Containerization

During my time as an IT architect I worked heavily with different virtualization concepts, and I have always been a strong supporter of virtualization because it helped us reduce costs tremendously. When getting into containerization I immediately asked myself: where is the difference? Separating the actual workload from the bare-metal infrastructure sounded familiar. After reading many whitepapers, I have to say there is one video on YouTube that explains it perfectly. Instead of repeating Nick, I recommend having a look at his short and easy-to-understand video:

An Outlook

I do not believe that containerization, and Docker’s solution in particular, is just another hype that will pass quickly. The advantages are simply too big to ignore. I believe many companies will use Docker in some way: perhaps not directly, but in combination with a move towards public cloud solutions such as Microsoft Azure or Amazon Web Services, or when transforming IT teams into DevOps teams, containerization remains a hot topic. Thorsten thinks we have not seen the peak yet:

“I do not think Docker usage has already reached its peak. Containers, virtual machines and serverless functions will have their place in future data centers.”

Reiner believes that containers will remain relevant for the near future, but that they will not necessarily be called Docker:

“One might ask what the future will bring for Docker. I think that container solutions will remain relevant in the next couple of years. Will they still be called Docker? I am not so sure. I foresee that the technology will be abstracted away from the user. Cloud services already take care of that with Container as a Service (CaaS). But there is still too much handiwork. For some time, Serverless Computing seemed to supersede containers. But in every hype cycle there comes the Trough of Disillusionment, and the inherent latency issue might be a show-stopper for a lot of use cases. I definitely see a growing need for containerization with Industry 4.0. Container technologies will become key ingredients for Edge Computing and connectivity to IoT devices.”

Last but not least, I want to thank Reiner and Thorsten for sharing their input. If you have any questions about Docker or containerization, feel free to reach out to them directly, and feel free to share this article with anyone who might be interested.
