
How To Live Without Docker Desktop — A Developer’s Perspective

Technology

Dec 1, 2021 - 8 minute read


What’s the Fuss All About?

According to the new Docker Desktop Licence Agreement, professional use of Docker Desktop in large organisations requires a paid Docker subscription. Basically, if your company has 250+ employees or makes more than $10 million in annual revenue, you will not be able to use Docker Desktop without the paid subscription. It remains free for smaller companies, private use, educational purposes and open-source projects.

It’s important to mention that this change applies only to Docker Desktop; neither Docker Engine nor the Docker CLI is amending its licence. This means that you can still use Docker for development and for all types of environments, including production. Therefore, this change shouldn’t impact your company’s business in any way.

If you’re a developer, and you worry that this change will affect your daily work, then fear no more! This article will show you how to achieve the same level of productivity without Docker Desktop. What’s more, there are additional benefits!

Prerequisites

  • You need to have a WSL2 compatible workstation: Windows 10 version 2004 and higher (Build 19041 and higher) or Windows 11. To check your current Windows version, press the Windows logo key + R, type winver and click OK. If your system doesn’t meet this criterion, you will have to update your Windows to the latest version.

  • If you already have Docker for Windows installed, you need to uninstall it (especially if it's connected with the local WSL distribution).

  • You need to have the Windows Subsystem for Linux feature enabled. You can check that by pressing the Windows logo key, typing “Turn Windows features on or off” and opening the entry that appears. Scroll down the list to “Windows Subsystem for Linux” and select it if it’s unselected. Please note that this step requires a system restart. You can also turn the feature on with the following PowerShell command (a quick command-line check for both prerequisites is sketched after this list):

Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux
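If you prefer to verify both prerequisites from the command line, a quick PowerShell sketch (standard cmdlets only; run it with administrative privileges, as the feature query requires them) could look like this:

# Check the Windows build number (19041 or higher is required for WSL2)
[System.Environment]::OSVersion.Version

# Check whether the Windows Subsystem for Linux feature is enabled
Get-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux | Select-Object State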

WSL2 Installation

For this process, you’ll need to run PowerShell with administrative privileges. To set WSL 2 as the default version and install the Ubuntu distribution, type:

wsl --set-default-version 2
wsl --install -d Ubuntu

NOTE:

Feel free to use a different Linux distribution; however, remember to use the commands appropriate for the distribution you’ve installed.

This can take some time, but eventually you should see a confirmation. Check whether Ubuntu was installed with WSL version 2 using this command:

wsl -l -v
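The output should look roughly like this (the distribution name and its state may differ on your machine; this is just an illustration):

  NAME      STATE           VERSION
* Ubuntu    Running         2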

You should see your installed Ubuntu with its WSL version. If by any chance it was installed as version 1, you can change it with this command:

wsl --set-version Ubuntu-XX.YY 2

XX.YY is the installed Ubuntu release version (currently 20.04).
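One more side note: the wsl <command> prefix used later in this article runs commands in your default WSL distribution. If you have more than one distribution installed, you may want to make Ubuntu the default (use the name reported by wsl -l -v):

wsl --set-default Ubuntu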

Docker Installation

Basically, you just need to follow the official steps for your particular distribution. For Ubuntu: https://docs.docker.com/engine/install/ubuntu/.

You can also copy the following script into your Ubuntu console. Paste it either into a file such as install-docker.sh and run it (as shown after the script), or paste it directly into the console. Alternatively, you can paste each command separately, in the same order:

#!/bin/bash 

# 1. Required dependencies 
sudo apt-get update 
sudo apt-get -y install apt-transport-https ca-certificates curl gnupg lsb-release 

# 2. GPG key 
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg 

# 3. Use stable repository for Docker 
echo \ 
  "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu \ 
  $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null 

# 4. Install Docker 
sudo apt-get update 
sudo apt-get -y install docker-ce docker-ce-cli containerd.io 

# 5. Add user to docker group 
sudo groupadd docker 
sudo usermod -aG docker $USER 
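If you went with the file option, you can run the saved script like this (install-docker.sh being the name suggested above):

bash install-docker.sh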

Here’s a detailed list of the necessary actions:

  • Step 1: Update the system and install the required dependencies. This will provide your freshly installed system with a general update and some tools required by Docker and needed in the next steps.

  • Step 2: GPG Key. To use a stable Docker repository, you have to tell your Linux environment that you trust this Docker repository.

  • Step 3: Add a stable repository for Docker. The Ubuntu system uses Debian-like repositories with software. You can add many repositories from various vendors to install their respective software.

  • Step 4: Install Docker. Finally!

  • Step 5: Add a user to the docker group. If you ever need to use Docker directly from the Linux distribution, you’ll want your local user to have the privileges to do so (a note on when this change takes effect follows the hello-world output below).

  • You will be asked for the Linux root (administrative) password. Note that the entire process will take a while, so it may be a good time to grab a coffee!

  • To start the Docker service and check if everything went smoothly:

sudo service docker start
sudo service docker status
docker run hello-world

At the end of the script execution, you should see that the Docker image ran smoothly:

Hello from Docker! 
This message shows that your installation appears to be working correctly. 

To generate this message, Docker took the following steps: 
1. The Docker client contacted the Docker daemon. 
2. The Docker daemon pulled the "hello-world" image from the Docker Hub. 
    (amd64) 
3. The Docker daemon created a new container from that image which runs the 
    executable that produces the output you are currently reading. 
4. The Docker daemon streamed that output to the Docker client, which sent it 
    to your terminal. 

To try something more ambitious, you can run an Ubuntu container with: 
$ docker run -it ubuntu bash 

Share images, automate workflows, and more with a free Docker ID: 
https://hub.docker.com/ 

For more examples and ideas, visit: 
https://docs.docker.com/get-started/ 
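If docker run fails with a permission error instead, the group membership added in step 5 most likely hasn’t taken effect yet. A minimal workaround is to refresh it in the current shell (or simply close and reopen the WSL terminal):

# pick up the new "docker" group membership without logging out
newgrp docker

# then retry
docker run hello-world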

Kubernetes (Minikube) Installation — Optional

If you need a k8s environment, you can install everything using the official minikube tutorial: https://minikube.sigs.k8s.io/docs/start/. Currently, for Ubuntu, it’s as easy as pasting these two commands into your shell:

curl -LO https://storage.googleapis.com/minikube/releases/latest/minikube-linux-amd64
sudo install minikube-linux-amd64 /usr/local/bin/minikube

To start your minikube instance, type (this step can also take a while):

minikube start
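Minikube should detect the Docker engine installed above and use it as the cluster driver. If it picks a different driver, you can request the Docker one explicitly (--driver is a standard minikube flag):

minikube start --driver=docker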

You should see minikube’s cluster start-up log.

Finally, to check if your cluster is running correctly, type:

minikube status

You should see something similar to:

minikube
type: Control Plane 
host: Running 
kubelet: Running 
apiserver: Running 
kubeconfig: Configured
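Minikube also bundles its own kubectl, so a separate kubectl installation isn’t strictly required. With the cluster up, you can already query it through the bundled wrapper, for example:

# list all pods in all namespaces via minikube's bundled kubectl
minikube kubectl -- get pods -A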

Using Docker and Kubernetes From Your PowerShell Command Line

At this moment you should have:

  • WSL2 Linux subsystem installed.
  • Installed & configured Docker within your selected Linux distribution.
  • (Optionally) Installed & configured minikube cluster within your selected Linux distribution.

This means that you can use all Docker, minikube and kubectl commands inside your Linux terminal. You can also use all of these commands from PowerShell by preceding them with “wsl”. For example:

wsl docker ps
wsl minikube status
wsl minikube kubectl -- get pods -A

It would be much more convenient if you could use these commands without prefixing each of them with wsl. There is a simple workaround: create aliases in your PowerShell profile. Please note that this step requires a PowerShell profile script (you can read more about it here), so you need to have the privileges to run such scripts. There’s a chance that you have a limited ExecutionPolicy and you won’t be able to run the script. To check it, type in your PowerShell (with administrative privileges):

Get-ExecutionPolicy -List

The result of this command should look like this:

Scope                    ExecutionPolicy
-----                    ---------------
MachinePolicy            Undefined
UserPolicy               Undefined
Process                  Undefined
CurrentUser              AllSigned
LocalMachine             Undefined

We’re interested in the LocalMachine scope. If it’s Undefined or Restricted, you will need to allow RemoteSigned scripts to run. You can achieve this by typing:

Set-ExecutionPolicy RemoteSigned

Now we are ready for the final touch! We need to change the content of the profile start script. To find the path to this script, type in your PowerShell (without administrative privileges this time, as we’re interested in your own user profile):

echo $PROFILE

In return, you will get a path to your start script. Something like this:

C:\Users\dpokusa\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1
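If the profile file doesn’t exist yet, you can create and open it straight from PowerShell first (a small convenience sketch using standard cmdlets):

# create the profile file if it's missing, then open it in Notepad
if (!(Test-Path $PROFILE)) { New-Item -ItemType File -Path $PROFILE -Force }
notepad $PROFILE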

Open this file in your favourite editor and paste this content:

Function Start-WslDocker {
    wsl docker $args
}
 
Function Start-WslMinikube {
    wsl minikube $args
}
 
Function Start-WslKubectl {
    wsl minikube kubectl -- $args
}
 
Set-Alias -Name docker -Value Start-WslDocker
Set-Alias -Name minikube -Value Start-WslMinikube
Set-Alias -Name kubectl -Value Start-WslKubectl

As you can see, there are three aliases for:

  • Docker
  • minikube
  • kubectl

You can create more of them by applying the same approach (an example follows the commands below), giving you a tighter integration between your PowerShell and your local Linux distribution. Save this file and restart your PowerShell terminal. Now, you can use the exact same commands as you would on Linux:

docker images
kubectl get all
minikube dashboard
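Following the same pattern, you could add further aliases, for example one for docker-compose (an illustrative sketch only, assuming docker-compose is installed inside your distribution):

Function Start-WslDockerCompose {
    wsl docker-compose $args
}
Set-Alias -Name docker-compose -Value Start-WslDockerCompose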

Summary

Here we are! The process may seem complicated at first glance, but if you look closely, you’ll find that it’s rather simple and can be described in three basic steps:

  1. Install a WSL2 Linux distribution of your choice.
  2. Install Docker and, optionally, a local Kubernetes cluster such as minikube or another solution.
  3. Prepare a profile script for your local PowerShell.

What’s next? This setup is certainly not as simple as Docker Desktop, so if you’re considering buying a subscription, our advice is to weigh it against your own use case. And there’s nothing stopping you from installing additional tools that will simplify your day-to-day development experience with k8s and Docker. For example, you may consider using https://www.portainer.io/ (the Community Edition can be used for business purposes) to manage your images, volumes and containers without the hassle of a CLI. If you are a more terminal-oriented person, you can use Docker directly or via tools such as Lazydocker (you can add it to your PowerShell profile script too!). The sky is the limit!
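As a rough sketch of what that could look like (the exact ports and image tag depend on the current Portainer release, so double-check the Portainer documentation), Portainer CE itself runs as a container on top of the Docker engine you’ve just installed:

# create a volume for Portainer's data and start the Portainer CE container
docker volume create portainer_data
docker run -d --name portainer --restart=always \
  -p 9000:9000 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v portainer_data:/data \
  portainer/portainer-ce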

Let’s go through the pros and cons of the described solution to make these considerations easier:

Advantages

  • The Docker engine is free, so the entire solution is available without cost and licence management.
  • You can use any available software to manage your Docker and Kubernetes cluster like Portainer or Lazydocker.
  • It’s easy to start using a native CLI and learn Docker basics, so you can gain more DevOps competences!
  • A lot of production environments run on Linux, so using Linux Docker installation will make the development environment more like production in such cases.

Disadvantages

  • It requires configuration and a little Linux knowledge.
  • A PowerShell profile script is also needed (if your organisation doesn’t allow you to run such scripts, you won’t be able to create the PS aliases).
  • Even if you use additional tools to maintain your Docker/k8s cluster, it will be your responsibility to update and maintain the entire environment.

Happy coding!

About the Authors:

Daniel Pokusa
Technical Architect

A Technical Architect fascinated with automation, lean methodologies, quality and efficiency in everyday work. He works with JVM languages on a daily basis and is a regular speaker at IT conferences and IT-related events (such as Confitura, 4Developers, Java Developer Days, Boiling Frogs, QualityExcites and more). He believes that the most important things in software development are good communication, cooperation and knowledge sharing.

Patryk Lotzwi
Senior DevOps Engineer

DevOps, programmer, fan of automation. An active member of tech communities who participates in conferences. He’s the DevOps Practice Lead at Objectivity. In his free time Patryk plays video games, watches superhero movies or develops yet another smart home project.
