How to Install Docker on Ubuntu: A Practical Guide

Have you ever wondered how to streamline your application development and deployment process? Docker is here to revolutionize the way we package and run applications. Think of Docker as a shipping container system for your applications. Just as a shipping container can carry a variety of goods, Docker packages your application together with everything it needs – libraries, dependencies, system tools, and runtime – and ensures it runs smoothly, regardless of the environment.

Docker leverages the power of containers, providing an isolated, resource-friendly environment that can run multiple instances simultaneously on a single host. This isn’t just any environment; it’s a highly efficient one that doesn’t compromise on performance or security. In this guide, we’ll delve into how to install and use Docker on Ubuntu 20.04 or Ubuntu 22.04.

Whether you’re a seasoned developer or a beginner looking to dip your toes into the world of Docker and Ubuntu, this guide will serve as a comprehensive resource. So, are you ready to uncover the power of Docker and supercharge your application management process? Let’s dive in!

TL;DR: What is Docker and why should I use it?

Docker is a platform that uses containerization to package and run applications consistently across different environments. It’s like a shipping container for your applications – packing all the parts an application needs into a standardized unit. This ensures your applications run smoothly, whether on your local machine, a physical server, or the cloud. For more in-depth information, advanced methods, and tips, continue reading the article.

For more information on all things Kubernetes, Docker, and containerization, check out our Ultimate Kubernetes Tutorial.

At its core, Docker is a powerful open-source platform that simplifies the process of creating, deploying, and running applications. But how does it achieve this? The answer lies in containers. If you imagine your application as a ship, containers are the cargo boxes. They package up the application along with all its parts – libraries, dependencies, and runtime – into a single unit. (Containers share the host’s kernel rather than bundling a full operating system, which is what keeps them lightweight.) This container can then be shipped off to any environment, ensuring your application runs smoothly, whether it’s on your local machine, a physical server, or the cloud.


The Benefits of Docker

The benefits of Docker are manifold. Docker’s containerization approach ensures application consistency across multiple environments, making your applications more secure and manageable. Each Docker container runs in its own isolated environment: containers share the host’s kernel, but each has its own filesystem, process space, and network stack. This isolation enhances security, as each container is separated from the others and issues in one container don’t affect the rest. Plus, Docker containers are lightweight and start up quickly, making Docker a resource-friendly choice.

Docker’s Versatility

Docker’s versatility is one of its standout features. It can be used for a wide range of applications, from small personal projects to large-scale enterprise applications. Docker containers can run anywhere, on any machine that supports Docker, without worrying about differences in OS distributions and underlying infrastructure. Docker is also widely used in continuous integration/continuous delivery (CI/CD) pipelines, microservices architecture, and more, making it an excellent choice for developers and businesses looking to streamline their application development and deployment process.

Docker’s Client-Server Architecture

Docker operates on a client-server architecture. The Docker client communicates with the Docker daemon, which does the heavy lifting of building, running, and managing Docker containers. The Docker client and daemon can run on the same host, or you can connect a Docker client to a remote Docker daemon. The Docker client and daemon communicate using a REST API, over UNIX sockets, or a network interface.

Example of the Docker client communicating with a remote Docker daemon (note that plain TCP on port 2375 is unencrypted; production setups should use TLS, conventionally on port 2376):

docker -H tcp://[IP address of Docker daemon]:2375 ps

Prerequisites for Installing Docker on Ubuntu

Before you can install Docker and start deploying your applications, there are a few prerequisites you need to have in place. You’ll need:

  • Ubuntu 20.04 or Ubuntu 22.04 system – the operating system for running Docker
  • A user account with sudo or root privileges – to perform administrative tasks
  • Command-line/terminal access – to execute commands

Docker Engine and Docker Desktop: A Comparative Overview

As you embark on your Docker journey, you’ll encounter two key terms: Docker Engine and Docker Desktop. Though they might appear similar at first, they cater to different user experiences. Docker Engine, the backbone of the Docker platform, is a server-side application responsible for creating and managing Docker containers. It’s ideally suited for production environments and is typically run on servers.

In contrast, Docker Desktop is a desktop application tailored for developers using Mac and Windows machines. It offers an intuitive graphical interface and bundles Docker Engine, the Docker CLI client, Docker Compose, Docker Machine, and Kitematic (the latter two have since been deprecated in newer releases). Docker Desktop is the perfect solution for developers aiming to build, share, and run containerized applications directly from their machine.

                   Docker Engine              Docker Desktop
User experience    Server-side application    Desktop application
Ideal for          Production environments    Developers
Includes           Docker Engine              Docker Engine, Docker CLI client, Docker Compose, Docker Machine, and Kitematic

Enhancing Security with Docker’s Non-Privileged Mode

A key feature that sets Docker apart is its support for running containers with reduced privileges. Containers can be started without unnecessary Linux capabilities, and Docker also offers a rootless mode in which the daemon itself runs without root access. Both bolster the security of your applications: running with the least necessary privileges minimizes the risk of a container impacting the host system or other containers, and Docker makes this best practice straightforward to follow.

Example of running a container with all capabilities dropped except the one it needs:

docker run --cap-drop all --cap-add net_bind_service -p 80:80 -d nginx
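Dropping capabilities restricts what a container can do, but the Docker daemon above still runs as root. Docker also supports a fully rootless mode, where the daemon itself runs as an unprivileged user. A minimal sketch, based on the package and script names in Docker’s documentation (verify against your Docker version; rootless mode has extra requirements, such as the uidmap package):

```shell
# Install the rootless extras for Docker CE, plus uidmap for user namespaces
sudo apt-get install -y docker-ce-rootless-extras uidmap

# Run the setup tool as your regular (non-root) user, NOT with sudo
dockerd-rootless-setuptool.sh install

# Point the Docker CLI at the rootless daemon's socket
export DOCKER_HOST=unix:///run/user/$(id -u)/docker.sock

# Ports below 1024 are unavailable by default in rootless mode, so map a high port
docker run -d -p 8080:80 nginx
```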

The Appeal of Ubuntu for Docker Container Management

When it comes to managing Docker containers, Ubuntu holds a special place among developers. This can be attributed to its stability and reliability, which are indispensable when handling containerized applications. Furthermore, Ubuntu’s vibrant and active community ensures easy access to comprehensive documentation and community support. Being free and open-source, Ubuntu is a cost-effective solution for managing Docker containers.

Exploring Docker Engine Installation Methods

Installing Docker Engine on your Ubuntu system can be accomplished in several ways. You can opt to install Docker from the official Docker repository, which provides the most recent version of Docker. This method is generally recommended for most users as it guarantees access to the latest features and security updates. An alternative method is to install Docker from the default Ubuntu repositories. While this method is simpler, the Docker version in the Ubuntu repositories may not be the most current. In the following section, we will delve into both installation methods, enabling you to make an informed decision based on your requirements.

Why Choose Ubuntu for Docker or Kubernetes Containers?

Ubuntu is a preferred platform for managing Docker or Kubernetes containers for several reasons. Its stability, robust security features, and compatibility with the latest versions of Docker and Kubernetes make it an ideal choice for hosting containerized applications. Moreover, the ability to run Docker in non-privileged mode on Ubuntu enhances the security of your containers. Ubuntu’s large and active community ensures that you’ll always have access to extensive documentation and community support, contributing to its popularity in the Docker community.

Options for Installing Docker on Ubuntu

When it comes to installing Docker on Ubuntu, you have two options: installing Docker from the official Docker repository or from the default Ubuntu repositories. The official Docker repository offers the latest version of Docker, ensuring you have access to the newest features and security updates. The Ubuntu repositories may not have the latest version, but the installation process is simpler.

Installing Docker from the Official Docker Repository

Here’s a step-by-step guide to install Docker from the official Docker repository:

  1. Update your existing list of packages:
sudo apt-get update
  2. Install a few prerequisite packages which let apt use packages over HTTPS:
sudo apt-get install apt-transport-https ca-certificates curl software-properties-common
  3. Add the GPG key for the official Docker repository to your system (apt-key is deprecated on recent Ubuntu releases; it still works here, but Docker’s current documentation recommends storing the key under /etc/apt/keyrings instead):
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
  4. Add the Docker repository to APT sources:
sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
  5. Update the package database with the Docker packages from the newly added repo:
sudo apt-get update
  6. Make sure you are about to install from the Docker repo instead of the default Ubuntu repo:
apt-cache policy docker-ce
  7. Finally, install Docker:
sudo apt-get install docker-ce
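After installation, it’s worth confirming that the daemon is running and, optionally, letting your user run docker without sudo. The service name, group name, and hello-world image below are the standard ones on Ubuntu:

```shell
# Check that the Docker service is active
sudo systemctl status docker

# Allow your user to run docker without sudo (log out and back in for it to take effect)
sudo usermod -aG docker $USER

# Verify the installation end to end with Docker's test image
docker run hello-world
```

Adding your user to the docker group is convenient but grants root-equivalent access to the daemon, so weigh this against your security requirements.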

Installing Docker from the Ubuntu Repository

If you prefer to install Docker from the Ubuntu repository, follow these steps:

  1. Update your existing list of packages:
sudo apt-get update
  2. Install Docker:
sudo apt-get install docker.io

Importance of Removing Older Docker Installations

Before installing a new Docker version, it’s important to remove any older Docker installations. This is because older versions can cause conflicts and result in Docker not functioning properly. You can remove older versions with the command:

sudo apt-get remove docker docker-engine docker.io

Choosing Ubuntu for managing Docker containers is a decision backed by its robustness, security, and extensive community support. Its stability and consistency, compatibility with Docker, and support for running Docker in non-privileged mode all contribute to its popularity.

Docker Images: The Building Blocks of Containers

Docker images serve as the foundation for Docker containers. They’re read-only templates, housing the instructions necessary for creating a Docker container. Each image is composed of several layers, each layer representing instructions in the image’s Dockerfile. Docker utilizes these layers to construct and run a container.

Example of creating a Docker image:

docker build -t my_image .

An image can be built upon another image, allowing for additional customization. For instance, you might create an image based on the Ubuntu image, but with added elements like the Apache web server and your application, along with the necessary configuration details to run your application.
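These layered instructions live in a Dockerfile. A minimal sketch of the Ubuntu-plus-Apache example just described (the apache2 package name is real; the site path is illustrative):

```dockerfile
# Start from the official Ubuntu base image
FROM ubuntu:22.04

# Each instruction below adds a new layer on top of the base image
RUN apt-get update && apt-get install -y apache2

# Copy your application's files into Apache's document root (path is illustrative)
COPY ./site/ /var/www/html/

# Document the port the web server listens on and start Apache in the foreground
EXPOSE 80
CMD ["apachectl", "-D", "FOREGROUND"]
```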

What are Docker Images?

Docker images are read-only templates used for creating Docker containers. They contain the necessary instructions and dependencies to build a container. Docker images ensure consistency in application deployment across different environments.

Searching and Pulling Docker Images

Docker offers several commands for interacting with images. The docker search command, for example, allows you to search for an image from the Docker CLI. Running docker search ubuntu will return a list of available Ubuntu images.

To download an image, you can use the docker pull command. For instance, docker pull ubuntu will download the latest Ubuntu image. Docker images are normally tagged with versions. If you need a specific version, you can specify it, such as docker pull ubuntu:18.04.

Example of searching and pulling Docker images:

docker search ubuntu
docker pull ubuntu:18.04

Viewing Docker Images

You can view the images downloaded to your machine using the docker images command. This will return a list of images, along with details like the image ID, creation date, and its size.
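For example:

```shell
# List locally stored images; columns include REPOSITORY, TAG, IMAGE ID, CREATED, and SIZE
docker images

# Narrow the listing to a single repository
docker images ubuntu

# Show only image IDs (useful in scripts)
docker images -q
```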

Docker Hub: A Repository of Docker Images

Docker Hub is a cloud-based repository where Docker users and partners can create, test, store, and distribute container images. It houses images for a wide array of applications, ranging from databases to web servers, and much more. Docker Hub also features official images, which are Docker images that have been curated and optimized by the Docker team.

The Variety of Docker Images

Docker Hub hosts a diverse assortment of images. You can find images for different Linux distributions, databases, web servers, and more. Images for different application frameworks and even for building and deploying machine learning models are also available. This variety of images simplifies the process of building your Docker containers.

The Role of Docker Images in the Docker Ecosystem

Docker images play a pivotal role in the Docker ecosystem. They allow you to package your application and its dependencies into a standardized unit for software development. Docker images ensure that your application operates identically, regardless of the environment in which it is run. This consistency simplifies the development, testing, and deployment of applications. Docker images also facilitate the sharing of your application and its deployment at scale.

Docker Containers: The Heart of Docker

Docker containers are the running instances of Docker images. They encapsulate everything needed to run an application, including the code, runtime, system tools, libraries, and settings. Containers are lightweight and fast, as they run directly on the host machine’s kernel without the need for a hypervisor.

Running, Viewing, Starting, Stopping, and Removing Docker Containers

Interacting with Docker containers is done using Docker commands. To run a container from an image, you can use the docker run command followed by the name of the image. For example, docker run ubuntu will start a new container using the latest Ubuntu image. (Because the base Ubuntu image has no long-running process, that container exits immediately; use docker run -it ubuntu bash to get an interactive shell inside it.)

To view the running containers, you can use the docker ps command. If you want to see all containers, not just the ones that are running, you can use docker ps -a.

Starting and stopping Docker containers is as simple as using the docker start and docker stop commands, followed by the name or ID of the container. For example, docker start my_container or docker stop my_container.

To remove a container, you can use the docker rm command, followed by the name or ID of the container. Be careful, as this will permanently delete the container and its data.

Example of running, viewing, starting, stopping, and removing Docker containers:

docker run ubuntu
docker ps
docker ps -a
docker start my_container
docker stop my_container
docker rm my_container

Docker Containers: Isolation and Benefits

Each Docker container runs in its own isolated environment. This means that it has its own file system, its own networking, and its own isolated process space. It’s like running a virtual machine, but without the weight of the entire operating system. This isolation guarantees that each container will always run the same, regardless of where it’s run.

Interacting with Docker Containers

Interacting with a running Docker container is done using the docker exec command. This command allows you to run a command in a running container. For example, docker exec my_container cat /etc/hosts will run the cat /etc/hosts command in the my_container container. You can also use it to enter a running container with a shell, like docker exec -it my_container /bin/bash.

Example of interacting with a running Docker container:

docker exec my_container cat /etc/hosts
docker exec -it my_container /bin/bash


Container IDs and Names: Key Identifiers

Each Docker container has a unique ID and a name. The container ID is a unique identifier for the container, and is used in commands like docker stop and docker rm. The container name is a human-friendly string that you can choose when you start the container. If you don’t specify a name, Docker will generate a random one for you. Both the ID and the name can be used to identify a container when running Docker commands.
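For example (the container name and image are illustrative):

```shell
# Start a container with an explicit name
docker run -d --name web_server nginx

# The name can now stand in for the container ID in any command
docker stop web_server
docker rm web_server
```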

Docker Containers: Impact on Application Deployment and Scalability

Docker containers have revolutionized application deployment and scalability. By packaging applications and their dependencies into containers, developers can ensure that their applications will run the same way, regardless of where they’re deployed. This eliminates the ‘it works on my machine’ problem and makes it easy to scale applications across multiple servers or cloud providers.

Committing Changes in a Docker Container

Understanding the Process

When you make changes to a Docker container, those changes are not automatically saved to the Docker image that the container was started from.

Executing the Commit

To save the changes, you can use the docker commit command. This command creates a new Docker image from the changes in the container. For example, docker commit my_container my_new_image will create a new image called my_new_image with the changes from my_container.

Example of committing changes in a Docker container:

docker commit my_container my_new_image

Docker Volumes and Networks: Enhancing Container Functionality

Docker volumes and networks are two powerful features that enhance the functionality of Docker containers. Docker volumes are the preferred mechanism for persisting data generated by and used by Docker containers. They are completely managed by Docker and are stored in a part of the host filesystem that’s managed by Docker (/var/lib/docker/volumes/ on Linux).

Docker networks, on the other hand, enable communication between Docker containers and between Docker containers and other network endpoints. Docker networks provide isolation and security for containers, allowing you to control which containers can communicate with each other and how they do so.

Docker Volumes: Creation, Removal, and Inspection

Docker provides commands to create, remove, and inspect Docker volumes. Here are some examples:

  • To create a volume named my_volume, you can use the docker volume create command:
docker volume create my_volume
  • To remove the my_volume volume, you can use the docker volume rm command. Be careful, as this will permanently delete the volume and its data:
docker volume rm my_volume
  • To inspect a volume and display detailed information about it, you can use the docker volume inspect command:
docker volume inspect my_volume

Docker Volumes and Data Persistence

Docker volumes are crucial for data persistence. When a Docker container is deleted, any data written to the container’s filesystem is lost. Docker volumes ensure that your data persists even after the container is deleted. This is particularly important for applications such as databases that need to store data across multiple runs of a container.

Additionally, Docker volumes can be used to share data between containers. By mounting the same Docker volume into multiple containers, you can share files and other data between those containers.
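A quick sketch of sharing data this way, with illustrative volume and container names:

```shell
# Create a named volume and mount it into two containers at the same path
docker volume create shared_data
docker run -d --name writer -v shared_data:/data ubuntu sleep infinity
docker run -d --name reader -v shared_data:/data ubuntu sleep infinity

# A file written by one container is immediately visible to the other
docker exec writer sh -c 'echo hello > /data/msg.txt'
docker exec reader cat /data/msg.txt
```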

Docker Networks: Creation, Removal, and Inspection

Docker provides commands to create, remove, and inspect Docker networks. Here are some examples:

  • To create a network named my_network, you can use the docker network create command:
docker network create my_network
  • To remove the my_network network, you can use the docker network rm command:
docker network rm my_network
  • To inspect a network and display detailed information about it, you can use the docker network inspect command:
docker network inspect my_network

Docker Networking and Running Containers

Docker networking plays a crucial role in running Docker containers. Docker provides several network drivers: bridge (the default), host, none, overlay, and macvlan. Each driver offers different capabilities and is suited to specific use cases.
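As a quick illustration, containers attached to the same user-defined bridge network can reach each other by container name (the network and container names below are illustrative):

```shell
# Create a user-defined bridge network (bridge is the default driver when none is given)
docker network create app_net

# Start a container on that network
docker run -d --name web --network app_net nginx

# From another container on the same network, the name "web" resolves automatically
docker run --rm --network app_net alpine ping -c 1 web
```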

Docker’s Integrated Networking and Volume Management System

Docker’s integrated networking and volume management system is one of its most powerful features. It allows you to manage your containers’ networking and storage needs using a single, consistent set of commands. This system simplifies management tasks and enhances the security and isolation of your Docker containers. By using Docker’s networking and volume management features, you can build applications that are secure, scalable, and easy to manage.

Docker and Ubuntu: A Powerful Combination for Streamlined Application Deployment

Our journey through the Docker landscape has been enlightening. We’ve discovered the transformative potential of Docker, a platform that is redefining how we build and deploy applications. We’ve explored the unique advantages of Docker’s isolation and resource-efficiency and how they make application management a breeze.

We’ve navigated the installation process of Docker on Ubuntu from both the official Docker repository and the Ubuntu repositories. We’ve ventured into the realm of Docker images, grasping how to search, pull, and inspect them, and appreciating the central role Docker Hub plays in offering a diverse range of Docker images.

We’ve mastered the art of managing Docker containers, learning how to initiate, view, start, stop, and delete them as required. We’ve understood the crucial role of Docker’s container isolation and its profound impact on application deployment and scalability.

Lastly, we’ve delved into Docker volumes and networks, understanding their creation, removal, and inspection processes, and appreciating their critical role in ensuring data persistence and facilitating container communication.

Armed with this knowledge, you’re all set to harness the power of Docker on Ubuntu, revolutionizing your application development and deployment process, and optimizing your server resources. So, what are you waiting for? It’s time to embark on your Docker journey and explore the numerous possibilities it offers!