We used to have a silo problem. Docker solved it because we're able to containerize the microservices that we're developing in the form of Docker images.
Once we run a Docker image, it becomes a container. That container is guaranteed to run on every machine because we install Docker as the platform; on top of the Docker platform, we create the Docker images and run the containers. Each container holds only the libraries and data required to run the application. This encapsulation is what makes the application work perfectly irrespective of the system: once we've encapsulated and containerized it, it's guaranteed to run on each machine.
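To make that concrete, here is a minimal Dockerfile sketch for a hypothetical Python microservice; the base image, file names, and port are assumptions for illustration, not our actual setup:

    # Hypothetical Dockerfile: package one microservice with only what it needs
    FROM python:3.11-slim
    WORKDIR /app
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt
    COPY . .
    EXPOSE 8000
    CMD ["python", "app.py"]

Building this with docker build -t myapp . produces an image that carries its own libraries, so the resulting container behaves the same on any machine running Docker.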
We're deploying the solution on the Azure cloud, where we've built a CI/CD pipeline. In the pipeline's YAML code, we write the steps for building the Docker image; once the image is built, the pipeline pushes it to the container registry.
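As a sketch of what that pipeline YAML can look like in Azure DevOps (the service connection name 'my-acr-connection' and the repository 'myapp' are placeholders, not our real values):

    # Azure Pipelines sketch: build the Docker image and push it to a registry
    trigger:
      - main
    pool:
      vmImage: ubuntu-latest
    steps:
      - task: Docker@2
        inputs:
          command: buildAndPush
          containerRegistry: my-acr-connection  # service connection to the registry
          repository: myapp
          dockerfile: Dockerfile
          tags: |
            $(Build.BuildId)

The Docker@2 task's buildAndPush command combines the build and push steps, and tagging with the build ID means each pipeline run produces a traceable image.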
Every person on the DevOps team is using this Docker tool.
We have plans to increase usage because it's a great tool and it's current technology. We're no longer developing monolithic architectures; everyone is building applications as microservices, and Docker is the best tool for containerizing and encapsulating them.
We used to develop applications on a monolithic architecture, and it took a huge amount of time to get them to the production environment. Now we're using a microservices architecture. Rather than creating the application as a whole, we divide it into small services. Each microservice is loosely coupled, and we can develop and containerize each one in the form of a Docker container.
If an application has a hundred microservices, as on an e-commerce portal, then the login is usually one application, the catalog is one application, and the cart is one application. Each can be considered one service, and for each microservice we can develop the code and containerize it.
Containerization is the most valuable feature. Containers can communicate with other containers. The data volume feature is also helpful because if a container dies, we won't lose the data; it's retained because the volume is mounted to a shared system or shared folder.
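As an illustration of both points, with 'app-net', 'app-data', and the image names as hypothetical examples:

    # User-defined network: containers on it can reach each other by name
    docker network create app-net
    docker run -d --name api --network app-net myapp:latest

    # Named volume: the data survives even if the container dies
    docker volume create app-data
    docker run -d --name db --network app-net -v app-data:/var/lib/data mydb:latest

If the db container is removed and recreated with the same -v flag, the data in app-data is still there.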
I would like to see a more UI-based tool so that students can understand it easily rather than memorizing all of the Docker commands. Some people use Docker Desktop to work with containers graphically.
Docker Swarm could have more advanced features like Kubernetes has, such as auto-scaling and self-healing capabilities.
I have used Docker for four years.
I would rate the stability as seven out of ten.
I would rate the scalability as seven out of ten.
I would rate technical support as seven out of ten.
Setup is simple; we can install Docker with just one command. Running the pipeline takes no more than three to four minutes, and a Docker image is built in that time frame.
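That one-command install, on a typical Linux host, can be Docker's official convenience script (one common approach; teams may prefer their distribution's packages instead):

    curl -fsSL https://get.docker.com | sh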
We run the pipeline on either a self-hosted agent or a Microsoft-hosted agent.
Docker Compose can be installed easily and allows you to run multiple containers at a time. Docker Swarm can also be installed easily.
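A minimal Compose sketch of running multiple containers at once; the service names and images here are assumptions for illustration:

    # docker-compose.yml: two hypothetical services started together
    services:
      web:
        image: myapp:latest
        ports:
          - "8080:8000"
      cache:
        image: redis:7

With this file in place, docker compose up -d starts both containers with a single command, and docker compose down stops and removes them.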
Docker is open source. To use Docker's enterprise offering, we would need to pay for it.
I would rate this solution as seven out of ten.
My advice is to create an account on Docker Hub, where there are free Docker images available for practice. We use GitHub to manage how we build our Docker images and then push them to the public Docker Hub. I have an account on Docker Hub with images there that I can reuse later. I would also advise completing a Udemy course on Docker or watching YouTube videos about it; that will make Docker easier to understand.
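A sketch of that practice workflow, with 'myuser' as a hypothetical Docker Hub account name:

    docker login                        # authenticate with your Docker Hub account
    docker pull hello-world             # pull a free practice image
    docker tag myapp:latest myuser/myapp:1.0
    docker push myuser/myapp:1.0        # publish the image so it can be reused later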