
Docker

iRonin IT Team

Want to build software that works on any machine, any configuration? Use Docker. Docker is a containerization management tool that puts a new twist on traditional virtualization, and it bridges the gap between programmers and the other stakeholders involved in producing a software product for desktop, enterprise, web, or mobile: testers, system admins, the DevOps team, management, and even customers.

Today’s article gives you a brief introduction to Docker.

Docker counts heavyweights such as PayPal, Indiana University, Business Insider, eBay, and General Electric amongst its userbase, so you know that if it’s working for them, then it might just work for you, too.

It’s a relatively new tool that reduces the resources required for software production, testing, management, and deployment through the use of containers. Containerization has quickly become a popular alternative to traditional virtualization, for several reasons.

Why containerization over virtualization*?

Virtualization runs a guest operating system on top of a host operating system. This means that you can run multiple different operating systems at the same time, on one machine. Binaries and libraries are then loaded into each of these virtual machines. However, this costs performance and efficiency, because every operation passes through the hypervisor, which mediates between the host machine and the virtual machines.

Containers do away with the hypervisor and the virtual machines, replacing them with a container engine. This makes for a leaner, more efficient setup. Unlike in virtualization, each container's binaries and libraries run directly on the host kernel, which keeps processing speed high. You cannot, however, run containers built for a different platform, for instance, a Windows application on a Linux box. Docker also requires the underlying architecture to be the same (e.g. x86, ARM), unless a virtual machine is also running.

*You should note that containerization is not completely different from virtualization; it is a new way to do virtualization that is more portable and lightweight than spinning up full virtual machines.

What is Docker?

Docker is a way to package your application, its dependencies, and structure into a container, so that it can run, be developed, or be tested in any environment. Each container may contain multiple applications, provided those applications share the same environment. Docker packages a runtime environment into one neat little container, for use almost anywhere.
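As a rough sketch of what that packaging looks like, a container image is usually described in a Dockerfile. The example below assumes a hypothetical Ruby web application; the base image and commands would change with your own stack:

    # Minimal sketch of a Dockerfile for a hypothetical Ruby web application

    # Base image that provides the language runtime
    FROM ruby:2.4

    # Working directory inside the container
    WORKDIR /app

    # Install the application's dependencies into the image
    COPY Gemfile Gemfile.lock ./
    RUN bundle install

    # Copy the application source and define the default command
    COPY . .
    CMD ["bundle", "exec", "rails", "server", "-b", "0.0.0.0"]

Anyone with Docker installed can then build and run the exact same environment with docker build and docker run, no matter what is installed on their own machine.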

Docker is considered a DevOps tool, as it is used throughout the continuous testing and continuous deployment phases of DevOps.

How does Docker work?

Docker works as a client-server setup: a daemon runs the Docker host (which hosts the containers), and a Docker client talks to that daemon through a REST API, over a local UNIX socket or TCP. On Linux systems this can all be run and configured from the terminal; on Mac and Windows you’ll need to use the extra Docker tools that come with the Docker installation to set everything up.
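A quick way to see the split between client and daemon is shown below (the remote address is purely illustrative):

    # The docker CLI is the client; the daemon is the server side.
    docker version        # prints both the Client and the Server (daemon) versions

    # By default the client talks to the local daemon over a UNIX socket,
    # but it can also be pointed at a remote daemon over TCP:
    export DOCKER_HOST=tcp://192.168.99.100:2376
    docker info           # now queries the remote daemon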

To create containers, you use a Docker image, or multiple images, which are templates from which a Docker container can be created. You can either use someone else’s image or build your own. Images are stored in Docker registries, which can be public (such as Docker Hub) or private. The images your containers need are pulled from the registry to the Docker host, where they are run as containers.
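In practice, the pull-and-run cycle looks something like this (using the public nginx image purely as an example):

    docker pull nginx                  # pull an image from the registry (Docker Hub)
    docker images                      # list the images now stored on the Docker host
    docker run -d -p 8080:80 nginx     # start a container from the image
    docker ps                          # list the running containers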

How does Docker benefit non-developers?

For testers, the DevOps team, management, and customers, Docker is a great way to distribute applications that have not yet been packaged into an installable form.

* No need to install dependencies, set up environment variables, etc.

* Saves time and effort in setting up

* Evaluate software quickly, without overheads (see the example after this list)

* Application environment the same for all stakeholders in the project

* Can be deployed on a large number of systems quickly

* Can be integrated with other DevOps tools, such as Jenkins, Vagrant, Ansible, AWS, and more
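For instance, evaluating a piece of server software can be a single command; when the container stops, it is removed and nothing is left installed on the machine. The image, version, and password below are just an example:

    # Try out PostgreSQL without installing anything on the host itself
    docker run --rm -it -e POSTGRES_PASSWORD=example postgres:9.6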

How does Docker benefit developers?

While the advantages of Docker for non-developers carry over to developers, too, there are some distinct advantages of developers using Docker:

* Can easily work on projects on any machine (a quick sketch follows this list)

* Can share workload with others, without sharing machines or setups

* No need to think about how you are going to deploy the code on other machines, just write it
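One common pattern, sketched below with a hypothetical project layout and the stock Ruby image, is to mount your source code into a container that already has the toolchain, so any machine with Docker can work on the project:

    # Mount the current project directory into a throwaway container
    # that provides the development toolchain
    docker run --rm -it -v "$(pwd)":/app -w /app ruby:2.4 bash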

Other benefits of Docker

There are several other advantages of using Docker that benefit everyone from a wider perspective:

* Docker will allocate memory on the fly, unlike virtual machines, where memory needs to be allocated before the virtual machine is started (and unused memory within the VM cannot be reused elsewhere)

* Docker deployment is easily scalable

* Docker is fast to boot

* Docker can run on all major cloud computing platforms

* Docker has built-in version control for images

* Docker isolates resources (per-container limits can be set at run time, as shown below)

* Docker containers can run multiple instances of tools like Jenkins and Puppet, or share them easily with other containers
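As an illustration of resource isolation, limits can be set per container when it starts; the values and image here are arbitrary, and the flags assume a reasonably recent Docker version:

    # Cap a container's memory and CPU so it cannot starve its neighbours
    docker run -d --memory=512m --cpus=1 nginx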

Looking to integrate Docker into your development cycle? Let us help you set up Docker infrastructure and provide the training needed to take full advantage of this powerful, resource-saving tool. At iRonin, we provide quality, cutting-edge web and mobile application development, as well as DevOps resourcing and management to drive your business further.
