After two years of growing interest, the reality of container technology is finally catching up to the promise. Containers, typified by Docker, are now having an effect on shaping not just applications, but the development process, too. An entire conference, Container World, running Feb. 21 to 23 in Santa Clara, Calif., is dedicating dozens of educational sessions and discussion panels to implementing container technology and understanding its impact on IT, DevOps and the business at large.
One session on Docker containers of importance to both developers and operations personnel is "A Practical Guide for Adding Docker to Enterprise Workflows," presented by Chris Ciborowski, CEO and principal consultant at Nebulaworks, an Irvine, Calif., DevOps adoption consultancy. Ciborowski said he believes Docker workflows deserve a closer look. He spoke with SearchCloudApplications to offer a preview of his session.
Your presentation is about using containers in workflows, as opposed to in applications -- specifically Docker workflows.
Chris Ciborowski: If you're focusing on the application bits going into the container, that's not what we're talking about. I'm focusing on the pipeline, not on the refactoring of an app. We talk about using containers in the software development lifecycle that you already have today. Developers are already using provisioned environments. Let's expand that with containers. And we focus on enhancing continuous integration and delivery with containers.
We tend to think of containers as holding pieces of an application or a microservice, but you are looking at the development process more than the application itself.
Ciborowski: That's right. Containers can help developers rely less on operations when doing test-driven development. Instead of developing inside a VM [virtual machine] that's provisioned by an operations team, or on a laptop that takes a long time to configure with Python and Ruby, we're discussing using containers for that development environment.
Think of it as a scratch workspace that a developer can use and then quickly reprovision when a new requirement pops up. It's about helping the developer iterate more quickly. It doesn't take much from operations to provision that host as a VM with a Docker Engine and then let the developer manage the prerequisites that go into the container.
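As a sketch of that scratch-workspace idea, a developer might define the environment in a short Dockerfile -- the base image and tool list here are illustrative, not something Ciborowski prescribes:

```dockerfile
# Illustrative throwaway dev environment: Python and Ruby are baked
# into the image, so the laptop itself never needs configuring.
FROM ubuntu:16.04
RUN apt-get update && apt-get install -y python ruby git
WORKDIR /workspace
CMD ["/bin/bash"]
```

Built once with `docker build -t devbox .`, the workspace starts with `docker run --rm -it -v "$(pwd)":/workspace devbox`; the `--rm` flag discards the container on exit, so "reprovisioning" when a requirement changes is just another `docker run`.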
A second area is continuous integration. How does that work with containerized Docker workflows?
Ciborowski: There's a natural fit for containers to run tools like the Jenkins automation server and to run the workloads you would normally run in Jenkins in an ephemeral fashion. Instead of provisioning Linux machines to run Jenkins jobs, you can keep the Jenkins environment you have today but reconfigure Jenkins to use a Docker endpoint, with containers running the jobs. You can elastically scale your Jenkins environment without having to preprovision Jenkins agents that sit statically idle, waiting to accept workloads. With containers, you can run these jobs in a more agile fashion and scale elastically.
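In a declarative Jenkins pipeline, that pattern of ephemeral build agents looks roughly like the following sketch; the image name and build command are illustrative assumptions, not details from the interview:

```groovy
// Illustrative Jenkinsfile: the stage runs inside a container pulled
// on demand and discarded afterward, so no static agent sits idle.
pipeline {
    agent {
        docker { image 'maven:3-jdk-8' }  // example build image
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B clean package'
            }
        }
    }
}
```

Because the agent is a container rather than a preprovisioned machine, concurrent jobs scale up to whatever the Docker endpoint can schedule and scale back to zero when the queue is empty.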
There's yet another area you plan to discuss, and that is use of containerized Docker workflows in prototyping. Is that targeting the business side more so than app development?
Ciborowski: Container environments let ops teams give product teams the ability to scaffold up applications and services on the fly, quickly showing the line of business a potential offering that an application team could develop. Platform as a service has typically been the workspace for rapid prototyping. Instead, we look at using containers to expand the technology you can put into these rapid prototypes, rather than being bound to what is very prescriptive within the PaaS.
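One common way to scaffold such a prototype -- not something the interview specifies, but a plausible sketch -- is a Compose file, where the stack is whatever images the team chooses rather than a PaaS catalog:

```yaml
# Illustrative docker-compose.yml: a prototype web front end plus a
# database, brought up in one command for a line-of-business demo.
version: '2'
services:
  web:
    image: nginx:stable        # stand-in for the prototype service
    ports:
      - "8080:80"
  db:
    image: postgres:9.6
    environment:
      POSTGRES_PASSWORD: example
```

`docker-compose up -d` stands the prototype up; `docker-compose down` tears it away again once the demo is over.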