Jul 08 2022
Why use Docker?
An end-to-end application (or any other service) is usually built from multiple technologies, and each of them comes with its own requirements and may be compatible with a different operating system.
Their compatibility with the underlying OS can be a challenge for any developer when taking the application into production. It is not only the OS: certain services are compatible only with certain versions of libraries and dependencies installed on it.
Every time the application's architecture changes, these compatibility requirements change as well, and the whole setup process has to be repeated over and over again. Especially for new developers joining the project, going through such a huge setup can be daunting, and this is where Docker comes in.
What can it do?
Docker can run each of these services in its own container, with its own dependencies and its own libraries, which makes things a lot easier.
![[Pasted image 20220921212231.png]]
Each container can have its own separate dependencies and libraries to support the service running inside it, and all of this can be done on the same VM and the same OS.
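As a minimal sketch of the idea (the container names are arbitrary; mongo, redis and nginx are just examples of publicly available Docker Hub images), each service gets its own container on the same host:
$ docker run -d --name app-db mongo        # database in its own container
$ docker run -d --name app-cache redis     # cache in its own container
$ docker run -d --name app-web nginx       # web server in its own container
Each of these containers ships its own libraries and dependencies, while all of them run on the same host OS.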
Containers
They are completely isolated environments that have their own processes/services, their own network interfaces, and their own mounts:
- Processes
- Network Interfaces
- Mounts
All these containers share the same OS kernel. Containers were not first introduced by Docker; other container technologies existed well before it. Some of them are:
- LXD
- LXC
- LXCFS
- distrobuilder
If you want to read more about them, see the Linux Containers link below.
Docker was originally built on top of LXC containers.
These are all very low-level tools and are therefore hard to set up. Docker provides a high-level interface on top of them, making our lives easier.
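As a small illustration of that high-level interface, a single command is enough to get an isolated container running, and a couple more let you peek at its own processes, network interfaces and mounts (the container name here is just an example):
$ docker run -d --name demo nginx
$ docker top demo                      # processes inside the container only
$ docker exec demo cat /proc/net/dev   # the container's own network interfaces
$ docker exec demo cat /proc/mounts    # the container's own mounts
$ docker rm -f demo                    # tear it down again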
Deeper Dive into Docker
Operating System
A little about operating systems first.
![[Pasted image 20220921214014.png]]
There are two major things to keep in mind about an operating system:
- Software
- Kernel
Distros like Ubuntu, Fedora, and CentOS are all operating systems that share a common Linux kernel; it is the system software on top that differentiates them.
The OS kernel is responsible for interacting with the underlying hardware of the system.
Docker containers share this underlying OS kernel.
Docker
When the underlying operating system where Docker is installed is, say, Ubuntu, the Docker containers can run Debian, Fedora, CentOS, or any other distribution that relies on the same Linux kernel.
![[Pasted image 20220921214101.png]] Each Docker container only needs to ship different system software to behave like a different distro, since the underlying OS kernel is the same for all of them.
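A quick way to see this in practice (using publicly available base images) is to run a couple of different distro containers and compare their kernel versions with the host's:
$ uname -r                                     # kernel version of the host
$ docker run --rm ubuntu uname -r              # same kernel, Ubuntu userland
$ docker run --rm fedora uname -r              # same kernel, Fedora userland
$ docker run --rm fedora cat /etc/os-release   # but the distro software differs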
So what is an OS that doesn't share a common kernel with the distros in the example above?
Windows!
![[Pasted image 20220921214225.png]] You won’t be able to run a Windows container if the underlying OS on the system where Docker is installed is a Linux distribution.
But there’s a catch:
If you try to run a Docker container with a Linux OS on a Windows machine, it works. How?
![[Pasted image 20220921214240.png]] What’s actually happening is that when you run such a Linux container, it runs inside a Linux virtual machine on your Windows operating system.
This may seem like a disadvantage, but unlike hypervisors, Docker is not meant to virtualize and run different operating systems and kernels on the same hardware.
Use of Docker Containers
The main use of Docker is to containerize applications so you can ship them and run them anywhere, anytime, as many times as you want.
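As a rough sketch of that workflow (the image name and registry account are hypothetical), you build an image once, push it to a registry, and then run it on any machine with Docker installed:
$ docker build -t myuser/myapp:1.0 .   # package the app and its dependencies
$ docker push myuser/myapp:1.0         # ship it to a registry such as Docker Hub
$ docker run -d myuser/myapp:1.0       # run it anywhere Docker is installed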
Containers vs Virtual Machines
| Container | Virtual Machines |
|---|---|
| ![[Pasted image 20220921214255.png|250]] | ![[Pasted image 20220921214651.png|250]] |
| Lower resource utilization, since the underlying OS is shared | Higher resource utilization, since each VM has its own operating system |
| Docker images can be only a few MB in size | VM images are typically several GB |
| Boot-up time is very fast | Boot-up time is relatively slow |
| Less isolation, since more resources are shared between containers | VMs are completely isolated |
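As a rough illustration of the size and startup difference (using the publicly available alpine image, which is only a few MB):
$ docker pull alpine
$ docker images alpine                        # the SIZE column shows only a few MB
$ time docker run --rm alpine echo "booted"   # the container starts in well under a second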
But for massive applications it is usually not a choice between containers and virtual machines; rather, both are used together.
![[Pasted image 20220921214739.png]]
How is Docker used?
There are a lot of containerized applications publicly available on Docker Hub or the Docker Store.
You can find images of most operating systems, databases, and service tools there. You can run instances of these services on any system with Docker installed. You can even run multiple instances of the same service, adding as many as you want and putting a load balancer in front of them. If something goes wrong, just remove the instance and spin it up again.
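A minimal sketch of running multiple instances of the same service (the names and ports are arbitrary examples, with nginx standing in for the service):
$ docker run -d --name web1 -p 8081:80 nginx   # first instance
$ docker run -d --name web2 -p 8082:80 nginx   # second instance behind the same load balancer
$ docker rm -f web2                            # if something goes wrong, remove it...
$ docker run -d --name web2 -p 8082:80 nginx   # ...and spin it up again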
Traditionally, developers build an application and hand it over to the Ops team for production. The Ops team doesn’t really know the application’s internals, so they collaborate with the Dev team to make sure all the required configuration and dependencies are present when deploying it. But tools like Docker bring about a new culture: DevOps.
When developers build their application in their own Docker container, all the dependencies and configuration are already set, and the same Dockerfile can be used for deployment without any additional setup work in production. This combines the developer and operator sides of things, which is where the term DevOps comes from.
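A minimal, hypothetical Dockerfile for a Node.js application, just to illustrate how the dependencies and configuration get captured alongside the code (the base image and file names are assumptions, not something from this post):
# base image that already contains the Node.js runtime
FROM node:16
WORKDIR /app
# install the app's dependencies inside the image
COPY package.json .
RUN npm install
# copy the application code and define how to start it
COPY . .
CMD ["node", "server.js"]
The same file is then used by the developer locally and by the Ops team in production.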
Using Docker
Installation
I would recommend installing Docker Engine using the repository method (or Docker Desktop on Windows); you can find the instructions somewhere within the link bookmarked below.
Install Docker Engine on Ubuntu
Error #1
Cannot connect to Docker Daemon at unix://var/run/docker.sock?
- This error generally happens when installing docker on WSL
- If you encounter this error even when you’re not on WSL, just restart your system.
- Open a terminal and start the Docker daemon using the command:
$ sudo dockerd
- Open another terminal and do all your work there. You can test whether Docker is working using:
$ sudo docker run docker/whalesay cowsay hello-world
Error #2
docker: Error response from daemon: Get "https://registry-1.docker.io/v2/docker/whalesay/manifests…": net/http: TLS handshake timeout.
- If you encounter this error, try again after a few seconds; it is probably a problem with your internet connection.
- You could also create a daemon configuration file:
$ sudo touch /etc/docker/daemon.json
and add the code below inside the JSON file:
{
"registry-mirrors": ["<https://registry.docker-cn.com>"]
}
- Then try:
$ sudo service docker restart
$ sudo docker run docker/whalesay cowsay hello-world
Error #3
permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Get “http://%2Fvar%2Frun%2Fdocker.sock/v1.24/images/json”: dial unix /var/run/docker.sock: connect: permission denied
You may encounter this problem, for example when using autocomplete. To solve it, add your user to the docker group:
$ sudo usermod -a -G docker [user]
Insert your user name where [user] is.
You can check that it was successful by running grep docker /etc/group, where you should see something like this:
docker:x:998:[user]
After this, restart the Docker service (and log out and back in so the group change takes effect):
$ sudo systemctl restart docker.service
You should then get a result something like this.
![[Pasted image 20220921215337.png]] If something still doesn’t work, you should also check out: Post-installation steps for Linux | Docker Documentation