Published May 23, 2023, 5:20 p.m. by Courtney
Docker is a tool that enables you to create, deploy, and run applications by using containers. Containers allow you to package an application with all of its dependencies and ship it as one package. By using containers, you can run multiple applications on a single host without the need to worry about conflicting dependencies.
Docker is used by many organizations, including Facebook, Google, and Microsoft, to run their production workloads. In this tutorial, you will learn the basics of Docker by creating and running a simple container.
Containers are a way to package software in a format that can be run on any machine that has a container runtime installed. A container runtime is a piece of software that enables you to create and run containers.
Docker is the most popular container runtime. Other popular container runtimes include rkt and LXC.
A Dockerfile is a text file that contains instructions for how to build a Docker image. A Docker image is a packaged template that contains an application together with all of its dependencies.
In this tutorial, you will learn how to create a simple Dockerfile and build a Docker image from it.
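As a sketch of what that looks like (the alpine base image and the hello.sh script here are illustrative choices, not requirements), a minimal Dockerfile might be:

    # Dockerfile: start from a small Linux base image
    FROM alpine:3.19
    # Copy a (hypothetical) application script into the image
    COPY hello.sh /hello.sh
    # Command the container runs when it starts
    CMD ["sh", "/hello.sh"]

You would then build an image from it with a command along these lines, run from the directory containing the Dockerfile:

    # Build the image and tag it as hello-app (an illustrative name)
    docker build -t hello-app .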
Docker uses a client-server architecture. The Docker client is used to interact with the Docker daemon. The Docker daemon is responsible for creating, running, and managing containers.
The Docker client and daemon can be run on the same host or on different hosts. When they are run on the same host, the Docker client and daemon communicate via a Unix socket. When they are run on different hosts, the Docker client and daemon communicate over TCP.
The Docker client and daemon communicate using the Docker API. The Docker API is used by the Docker CLI and by third-party tools to interact with the Docker daemon.
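For example, on a Linux host where the daemon listens on its default Unix socket, you can query the API directly. This is a sketch that assumes a default installation (socket at /var/run/docker.sock):

    # Ask the Docker daemon for its version over the REST API
    curl --unix-socket /var/run/docker.sock http://localhost/version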
Docker is available for a variety of operating systems, including Windows, macOS, and Linux. You can install Docker on your host by following the instructions in the Docker documentation.
Once you have installed Docker, you can run the docker command to interact with the Docker daemon.
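For example, a common first check uses hello-world, the official test image published by Docker:

    # Run a container from the official hello-world image;
    # Docker pulls the image automatically if it is not present locally
    docker run hello-world
    # List all containers, including ones that have exited
    docker ps -a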
In this tutorial, you learned the basics of Docker. You learned how to create a simple Dockerfile and build a Docker image from it. You also learned how to run a container from a Docker image.
Hello, this is Matthew from Simplilearn, and today we're going to cover what Docker is and why it should be of value to you as somebody who works in DevOps. So let's take a scenario between a developer and a tester. Before the world of Docker, a developer would build their code and then send it to the tester, but the code wouldn't work on the tester's system due to the differences between the two computer environments. So what could be the solution to this? Well, you could create a virtual machine so the environment is the same in both places, but Docker is an even better solution.

So let's break out the main differences between Docker and virtual machines. Comparing the left- and right-hand sides, both look very similar; the big difference you'll see on the Docker side, however, is that the guest OS for each container has been eliminated. Docker is inherently more lightweight but provides the same functionality as a virtual machine. So let's step through some of the pros and cons of a virtual machine versus Docker. First of all, a virtual machine occupies a lot more memory space on the host machine.
In contrast, Docker occupies significantly less memory space. The boot-up time between the two is very different: Docker just boots up faster. The performance of the Docker environment is actually better and more consistent than the virtual machine. Docker is also very easy to set up and very easy to scale. The efficiencies are therefore much higher with a Docker environment versus a virtual machine environment, and you'll find it is easier to port Docker across multiple platforms than a virtual machine. Finally, the difference in space allocation between Docker and a virtual machine is significant: when you don't have to include the guest OS, you eliminate a significant amount of space, and the Docker environment is just inherently smaller. So with Docker, as a developer you can build out your solution and send it to a tester, and as long as you're both running in the Docker environment, everything will work just fine.

So let's step through what we'll cover in this presentation. We're going to look at the DevOps tools and where Docker fits within that space, we'll examine what Docker actually is and how Docker works, and then finally we'll step through the different components of the Docker environment.

So what is DevOps?
DevOps is a collaboration between the development team and the operations team, allowing you to continuously deliver applications and services that both delight your customers and improve their efficiency. If you look at the Venn diagram we have here, on the left-hand side we have development, on the right-hand side we have operations, and there's a crossover in the middle: that's where the DevOps team sits. If we look at the areas of integration between both groups, developers are really interested in planning, coding, building, and testing, and operations want to be able to efficiently deploy, operate, and monitor. When you have both groups interacting with each other on these seven key elements, you can have the efficiency of an excellent DevOps team. For planning and coding we use tools like Jira and Git, for building we use Gradle and Maven, for testing we use Selenium, the integration between dev and ops is done through tools such as Jenkins, the deployment and operation are done with tools such as Docker and Chef, and finally Nagios is used to monitor the entire environment.
So let's step deeper into what Docker actually is. Docker is a tool used to automate the deployment of applications in lightweight containers, so that an application can work efficiently in different environments. It's important to note that a container is a software package that consists of all the dependencies required to run the application. Multiple containers can run on the same hardware, the containers are maintained in isolated environments, they're highly productive, and they're quick and easy to configure.
So let's take an example of what Docker is by using a house that may be rented out on Airbnb. In the house there are three rooms but only one cupboard and kitchen, and the problem we have is that none of the guests are really ready to share the cupboard and kitchen, because every individual has a different preference for how the cupboard should be stocked and how the kitchen should be used. This is very similar to how we run software applications today: each of the applications could end up using a different framework, such as Rails or Flask, and you may want to have them running for different applications in different situations. This is where Docker will help you run each application with its suitable framework.

So let's go back to our Airbnb example: we have three rooms and one kitchen and cupboard. How do we resolve this issue? Well, we put a kitchen and cupboard in each room. We can do the same thing for computers: Docker provides the suitable framework for each different application, and since every application has its own framework at a suitable version, the space that was previously reserved for a shared framework can be used for something else. Now we can create a new application, in this instance a fourth application, that uses its own resources. With these kinds of abilities to free up space on the computer, it's no wonder Docker is the right choice.
how docker actually works so when we
look at docker and we call something
Dokka we're actually referring to the
base engine which actually is installed
on the host machine that has all the
different components that run your
docker environment and if we look at the
image on the left-hand side of the
screen you'll see that docker
has a client-server relationship there
is a client installed on the hardware
there is a client that contains the
docker product and then there is a
server which controls how that docker
client is created the communication that
goes back and forth to be able to share
the knowledge on that docker client
relationship is done through a REST API
this is fantastic news because that
means that you can actually interface
and program that API so we look here in
the animation we see that the docker
client is constantly communicating back
to the server information about the
infrastructure and it's using this REST
API as that communication channel the
dock a server then we'll check out the
requests and the interaction necessary
for it to be the docker daemon which
runs on the server itself will then
check out the interaction and the
necessary operating system pieces needed
to be able to run the container okay so
that's just an overview of the docker
engine which is probably where you're
going to spend most of your time but
there are some other components that
form the infrastructure for docker let's
dig into those a little bit deeper as
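You can see this client-server split from your own terminal; for instance, docker version prints separate Client and Server sections, with the Server section describing the daemon:

    # Show version details for both the client (CLI) and the server (daemon)
    docker version
    # Show broader daemon details, such as how many containers and images it manages
    docker info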
So what we're going to do now is break out the four main components that comprise the Docker environment. The four components are as follows: the Docker client and server, which we've already done a deeper dive on, Docker images, Docker containers, and the Docker registry. If we look at the structure we have here on the left-hand side, you see the relationship between the Docker client and the Docker server, with the REST API in between. If we start digging into that REST API, particularly the relationship with the Docker daemon on the server, we find the other elements that form the different components of the Docker ecosystem.

The Docker client is accessed from your terminal window: if you are using Windows this can be PowerShell, and on a Mac it's going to be your Terminal window. It allows you to talk to the Docker daemon and the registry service, so you can use your terminal window to issue instructions on how to build and run your images and containers.

If we look at the images part of our registry, we see that an image is really just a template with the instructions used for creating the containers you use within Docker. The Docker image is built using a file called the Dockerfile, and once you've created that image you store it in Docker Hub or a registry, which allows other people to access the same structure of the Docker environment that you've created.
The syntax for creating the image is fairly simple, and it's something you'll be able to get your arms around very quickly: essentially, you're defining what a new container will be created from, identifying what the image will look like, what commands are needed, and the arguments for those commands. Once you've done that, you have a definition for what your image will look like.
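As an illustrative sketch of that definition (the Python base image and app.py are hypothetical choices used only for this example), a Dockerfile maps directly onto those pieces:

    # What the image will look like: the base image it builds on
    FROM python:3.12-slim
    # Files the image should contain
    COPY app.py /app/app.py
    # The command that runs, with its arguments, when a container starts
    CMD ["python", "/app/app.py"]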
So if we look at what the container itself looks like, the container is a standalone executable package that includes applications and their dependencies: it's the instructions for what your environment will look like, so you can be consistent in how that environment is shared between multiple developers, testing units, and other people within your DevOps team. Now, the thing that's great about working with Docker is that it's so lightweight that you can run multiple Docker containers on the same infrastructure and share the same operating system. This is its strength: it allows you to create the multiple environments that you need for the multiple projects you're working on. Interestingly, though, each container creates an isolated area for its applications to run in, so while you can run multiple containers on one infrastructure, each of those containers is completely isolated and protected, so that you can control how your solutions work.
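For example, you might start two containers from the same image; each gets its own isolated filesystem and process space while sharing the host's OS kernel (the names and the nginx image are illustrative):

    # Start two isolated containers in the background from the same image
    docker run -d --name web1 nginx
    docker run -d --name web2 nginx
    # List running containers: each shows up as a separate, isolated instance
    docker ps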
Now, as a team you may start off with one or two developers, but when a project starts becoming more important and you start adding more people, you may have 15 people offshore, 10 people local, and 15 consultants working on your project. You then have a need for each person on your team to have access to that Docker image, and to get access to that image we use a Docker registry, an open-source server-side service for hosting and distributing the images you have defined. You can also use Docker's own default registry, Docker Hub. Something to be very mindful of is that alongside publicly shared images you may want to have your own private images, in which case you would host them in your own registry. So, once again, public repositories can be used to host Docker images that can be accessed by anyone, and I really encourage you to go out to Docker Hub and see the Docker images that have been created, because there may be tools there that you can use to speed up your own development environments. You will also get to a point where you start creating environments that are very specific to the solutions you are building, and when you get to that point you'll likely want to create a private repository, so you're not sharing that knowledge with the world in general.
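As a sketch, the open-source registry server is itself distributed as a Docker image, so one way to host a private registry is a single command like this (the port and container name are illustrative):

    # Run a private registry, reachable on port 5000 of the host
    docker run -d -p 5000:5000 --name my-registry registry:2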
Now, the way in which you connect with the Docker registry is through simple pull and push commands that you run from the terminal window. If you want to build your own container, you'll start by using the pull command to pull an image from the Docker repository. The command line for that is fairly simple: in your terminal window, once you've connected to your Docker environment, you would write docker pull followed by the image name and any tags associated with that image. What that will then do is pull the image from the Docker repository, whether that's a public repository or a private one.
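For example (ubuntu and its 22.04 tag are just an illustrative image name and tag):

    # Pull an image, with an optional tag, from the repository
    docker pull ubuntu:22.04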
Now, in reverse, if you want to update the Docker repository with new information, you do a push: you take the image you've built from the Docker container definition you wrote and push it to the repository. As you can imagine, the command for that is also fairly simple: in the terminal window you would write docker push with the image name and any associated tags, and that will push the image to the Docker repository, again either a public or a private repository.
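For example, assuming a hypothetical Docker Hub account named myuser and a local image named hello-app:

    # Tag the local image with the repository name, then push it
    # (pushing to Docker Hub requires docker login first)
    docker tag hello-app myuser/hello-app:1.0
    docker push myuser/hello-app:1.0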
So if we recap: the Dockerfile creates a Docker image using the build command, and the Docker image contains all the information necessary for you to execute the project. Using the Docker image, any user can run the code to create a Docker container, and once the Docker image is built, it's uploaded to a registry or to Docker Hub, where it can be shared across your entire team. From Docker Hub, users can get access to the Docker image and build their own new containers.
So, the five key takeaways here. With a virtual machine, you're able to create a virtualized environment to run an application on an operating system; Docker lets you focus on just running the application, and doing it consistently. It improves the ability of teams to share environments that are consistent from team to team. It's highly productive, and it's really quick and easy to configure. The architecture of Docker is primarily built out of four components, of which the one you'll use the most is the client-server environment, where as a developer you have a client application running on your local machine and you connect with a server environment to get the latest information about the container you're building a solution for. And finally, what we see with the workflow improvements from Docker is that the goal is to be more efficient, to be more consistent with your development environments, and to be able to push out those environments, whether to a test person, a business analyst, or anybody else on your DevOps team, so they have a consistent environment that looks and acts exactly like your production environment and can eventually be pushed out to a production environment using tools such as Puppet or Chef: you're creating a consistent operations environment.

I really hope you've enjoyed this presentation. As always, click like and subscribe below to get more of these presentations, and if you have any questions, please put those in the comments below.