Senior Architect at an engineering company with 10,001+ employees
Real User
Top 10
May 31, 2024
In our line of work, we've been involved in various sectors, such as industrial products, transportation, client engineering, telecom, and medical. For instance, we're currently developing an IT platform. One key use case we're tackling is device management. We're looking at managing devices within our setup. These devices send data or signals, which then get transferred to the cloud. It's all about handling the lifecycle of these devices, deploying them, and managing non-provisioned ones, both on our end and on the client's side.
In our company, I mostly design 5G networks, and my work revolves around virtualization of the 5G core, which is known as backhaul. In our organization, we use many containerization technologies to obtain a proper ROI. We predominantly use a combination of no-code servers and other servers in our operations, which is the most relevant approach on the service provider or enterprise end. For our small-scale customers, we use a combination of Kubernetes and Docker to drive adoption. One of the main use cases of Docker is resource utilization: unlike traditional VMs, there is no need to reserve resources, and it's much easier to spin up the required instances for consumption using Docker.
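As a rough illustration of that resource-utilization point, the commands below show how a container can be started with explicit CPU and memory caps instead of pre-reserving a full VM's worth of resources; the image name and limit values are only placeholders for this sketch.

    # Start a small container with hard CPU and memory limits (placeholder values)
    docker run -d --name demo-api --cpus="0.5" --memory="256m" nginx:alpine

    # Check live CPU/memory consumption of running containers
    docker stats --no-stream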
Sr. HR Executive - Employee Relations at HGS - Hinduja Global Solutions
Real User
Top 5
May 29, 2024
We normally use Docker for building and testing all of our applications in a much faster way. We create the software as packages and then use Docker to create containers to build these packages. This process allows us to run and manage our applications.
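A minimal sketch of that build-and-test flow, assuming the project has a Dockerfile at its root; the image tag and test command are hypothetical and would differ per application.

    # Build an image from the project's Dockerfile
    docker build -t myorg/myapp:test .

    # Run the test suite inside a throwaway container created from that image
    docker run --rm myorg/myapp:test ./run-tests.sh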
We are a service provider and we have various use cases. However, the most prominent one is that instead of virtualizing the application workloads, we use Docker. Docker allows you to create small applications and containerize them. You can create multiple such application containers that can run simultaneously on the Linux operating system.
In our company, I have used Docker to launch an application publicly. The application becomes available to end users via CloudFront. Docker is also used for caching to optimize performance.
We use Docker for validation mechanisms and have built Docker images for our applications. For container orchestration, we haven't used Kubernetes instances extensively. Docker works well for us.
Instead of building images for underwriting systems, we pull pre-existing standard images. This allows us to quickly set up the necessary environment for development. For example, if I need a database instance, I simply pull the Docker image and create it, rather than going through a full installation and build process. We also leverage these images to create small, stackable components for building solutions. This streamlines our workflow and enhances our ability to upgrade and adapt quickly.
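As an example of that pull-instead-of-install approach, a standard database image can be pulled and started like this; PostgreSQL is just one possible choice, and the password and ports are placeholders.

    # Pull a standard database image instead of installing the database locally
    docker pull postgres:16

    # Start a disposable database instance for development
    docker run -d --name dev-db -e POSTGRES_PASSWORD=devpass -p 5432:5432 postgres:16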
Currently, we are creating a JAR file and using microservices. There are around 178 services in a single project. We use Docker to manage and load-balance all of the services together.
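One common way to run several copies of a service and spread traffic across them is Compose scaling; this sketch assumes a docker-compose.yml with a hypothetical 'orders' service that does not publish a fixed host port, so the replicas don't collide.

    # Start three replicas of the 'orders' service defined in docker-compose.yml
    docker compose up -d --scale orders=3

    # Other containers on the same Compose network reach the replicas via the
    # service name 'orders'; Docker's embedded DNS spreads requests across them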
I use Docker for containerization, deployment, and to create packages. Docker has a wide range of uses and integrates well with other command-line tools like Terraform. Docker is most helpful when trying to work with CI/CD pipelines.
Containerization is one of the use cases of Docker. Basically, Docker provides containers to its users, so users can build, run, and share containers among developers.
We primarily use the solution to create the node for the containers in order to deploy multiple apps. We have iOS applications in the containers, and we can build multiple microservices in the containers. It provides access to the content via public IPs. We can host it, for example, on AWS, and it can contain some instances in Azure or AWS.
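As a sketch of exposing a containerized app on a host's public IP (for example, a VM in AWS or Azure), the port mapping is the key part; the image name and ports below are placeholders.

    # Publish container port 3000 on host port 80, so the app is reachable
    # at the VM's public IP (image name is hypothetical)
    docker run -d --name web -p 80:3000 myorg/mobile-backend:latest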
We're utilizing Docker extensively as all our products and services are deployed on Kubernetes, which is based on Docker. Our reliance on it is high. We have various services, including Python, C++, and Node.js, and several applications that are deployed via Docker. Our usage of Docker is almost 100 percent across all entries.
We used to have a silo problem. Docker solved it because we're able to containerize the microservices we're developing in the form of Docker images. Once we run a Docker image, it becomes a container. That container is guaranteed to run on every machine because we're installing Docker as the platform: on top of the Docker platform, we're creating the Docker images and running the containers. Each container has only the limited set of libraries and data required to run the application. This encapsulation makes it work perfectly, irrespective of the system; once we have encapsulated and containerized the application, it's guaranteed to run on each machine.

We're deploying the solution on Azure cloud and creating the CI/CD pipeline. In the CI/CD pipeline, we're building the Docker images and pushing them to the container registry. We write the steps for how to build the Docker image as YAML code in the pipeline; once the Docker image is built, it's pushed to the container registry. Every person on the DevOps team is using this Docker tool. We have plans to increase usage because it's a great tool and the latest technology. We're no longer developing monolithic architectures, so everyone is developing applications with microservices, and Docker is the best tool to containerize and encapsulate the application.
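The registry steps described above boil down to roughly the following commands, which the pipeline's YAML steps would wrap; the registry and image names are hypothetical, and this sketch assumes an Azure Container Registry.

    # Log in to the (hypothetical) Azure Container Registry
    az acr login --name myregistry

    # Build the image, tag it for the registry, and push it
    docker build -t myregistry.azurecr.io/myservice:1.0.0 .
    docker push myregistry.azurecr.io/myservice:1.0.0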
I'm using Docker for local Kubernetes development. I'm building software that uses Docker for cloud and on-premises applications. I'm consulting for a company that provides an enterprise database solution built using Docker containers and Kubernetes, so everyone at the company is using Docker indirectly.
Tech Lead Consultant | Manager Data Engineering at Ekimetrics
Real User
Nov 7, 2022
Our primary use case for this product is for packaging our solutions. In addition, we use it for packaging our web apps and deploying them on public cloud, primarily on Azure.
Our primary use is to deploy applications in a secure environment. We prefer that our developers write the Dockerfiles that build the images. After we have created the images, we use our CI/CD tool to deploy our applications. This makes our publishing fast, and our containers are isolated from each other. We increase our security by using Docker.
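A minimal sketch of the isolation aspect: containers attached to separate user-defined networks cannot reach each other unless explicitly connected. The network, container, and image names are placeholders.

    # Create two isolated user-defined networks
    docker network create frontend-net
    docker network create backend-net

    # Containers placed on different networks cannot talk to each other by default
    docker run -d --name api --network backend-net myorg/api:latest
    docker run -d --name web --network frontend-net myorg/web:latest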
Specialist - Cloud Services and Software at NRG Energy, Inc.
Real User
Sep 28, 2022
We are using Docker in our Java pipeline which is based on DevOps. We use Docker because we do not have to set up an environment to let people try applications.
Docker is a versatile container platform used for running and deploying applications in isolated environments, ensuring consistency across development, testing, and production.
Docker offers solutions for containerizing applications, automating deployments, and managing infrastructure through its robust platform. It supports CI/CD workflows, provides a development platform for container management, and simplifies the setup by using streamlined tools. Organizations leverage Docker for...
The solution is used to run systems in small applications.
Docker helps us implement applications quickly.
Docker is a development platform for containerization.
We use the tool for some of our services. We use it for containerization.
We work with containers for forecasting.
I use the tool for SQL, MySQL, and web development.
We use Docker to build, run, and ship any application.
We use the solution to pick up Java applications and migrate them to run inside containers.
Our primary use case for Docker is local development. We use Windows for most of our use cases, which means we need to use the Docker Desktop tool.
We use this solution for data collection and transfer across applications.
Docker is an open-source container runtime for running container images. We are using Docker Swarm, which is similar to Kubernetes but comes from Docker.
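For reference, standing up a small Swarm and running a replicated service looks roughly like this; the service and image names are placeholders.

    # Initialize a single-node swarm on the current host
    docker swarm init

    # Run a service with three replicas; Swarm's routing mesh balances traffic
    # on the published port across the replicas
    docker service create --name web --replicas 3 -p 8080:80 nginx:alpine

    # Check the service and its replicas
    docker service ls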