Lessons in adopting Docker to decouple applications from infrastructure

The software development landscape is perpetually changing. From new programming languages and their frameworks to the adoption of DevOps and other lifecycle tools, tech companies (and software-driven organizations) must be ready to anticipate and absorb these advancements to gain any competitive edge.

At Tasktop, the ‘newest’ tool we are becoming more familiar with is Docker, which has seen steady adoption by thousands of software companies drawn to the benefits of decoupling their applications from infrastructure. Our company’s use of Docker has grown from a few developers containerizing our product to most of our developers using it to host private repository environments such as Atlassian Jira, GitLab and Bugzilla for testing their backend code changes. We have also incorporated it into our internal infrastructure, transforming our single-application VMs into Docker hosts running several applications at once.
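
To make that last point concrete, a single Docker host can run several of these repository applications side by side, each in its own container. The following is a minimal sketch; the image names, version tags and port mappings are illustrative placeholders, not our actual infrastructure:

    # One Docker host standing in for several single-application VMs.
    # Image names, tags and ports below are illustrative assumptions.
    docker run -d --name jira     -p 8080:8080 atlassian/jira-software:8.13.0
    docker run -d --name gitlab   -p 8929:80   gitlab/gitlab-ce:13.12.4-ce.0
    docker run -d --name bugzilla -p 8081:80   example/bugzilla:5.0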

Our adoption of Docker has been a slow burn over the course of a couple of years, and the process has had its share of struggles and triumphs along the way. During my time at Tasktop, I’ve had the opportunity to learn how to leverage Docker for testing and to share my experience with developers across several different teams. Throughout this process, one question has stuck in my mind: how do software companies adopt and adapt to new technologies in a way that doesn’t disrupt developer cycle time? And what’s the best way to pay down the technical debt the transition creates?

This post doesn’t seek a definitive answer to that question, since there is no single right answer. Instead, it treats our adoption of Docker as one exploration of a problem that admits many solutions.

As a co-op software engineer, I believe co-ops and interns can play an important role in helping companies adopt and transition to new technologies. I spent my first three months at Tasktop on an experimental team of fellow co-ops and a full-time engineer/mentor, with the goal of finding strategies to develop connectors faster and more effectively.

Our experience with Docker

One of the biggest challenges when developing is testing code changes in isolation without affecting other team members, since we would frequently run tests against one shared repository instance (e.g. Jira or GitLab). The lack of a versioned test environment also created problems for us and for other teams, since it was difficult to reproduce and resolve defects that arose when previous versions of our code ran against specific versions of a repository.

By creating a Docker image, each developer could run and configure their own repository instance to test against on their own machine, and our team could run tests against previous versions and configurations of a repository. Since then, our demonstrated work has encouraged other teams to take the time to learn Docker and leverage it for their own testing needs.
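
To illustrate the workflow, here is a minimal sketch assuming an off-the-shelf Jira image; the image name, version tag and port mapping are assumptions for illustration, not our actual setup:

    # Spin up an isolated Jira instance pinned to a specific version tag,
    # so a defect can be reproduced against that exact release.
    docker run -d --name jira-test -p 8080:8080 atlassian/jira-software:8.13.0

    # Tear it down when the test run is done; no shared state is left behind.
    docker stop jira-test && docker rm jira-test

Because the version is pinned in the tag, any teammate can recreate the same environment with a single command, which is exactly what our shared, unversioned instances could not offer.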

Engineering managers – for good reason – may not want to redirect their full-time engineers’ focus from fixing high-priority defects to learning something like Docker, and can instead call upon their co-op engineers to jump into uncharted territory. This scenario benefits both the students and the company as a whole: the co-op students gain valuable skills working with an in-demand technology, and the knowledge they acquire can be passed along to full-time employees to build upon in future development.

It might go without saying, but having one or two software engineers with experience in the adopted tool is necessary for incremental change. Our experimental team benefitted greatly from having a couple of engineers with Docker experience to code-review our Dockerfiles and discuss our Docker testing process. They helped us get the ball rolling, and now we are paying it forward to other teams as they ‘Dockerize’ their repositories.

Pain points

Rolling out a new technology with only our in-house experience has had its challenges. At first, it was hard to facilitate the necessary cross-team collaboration, and a disproportionate share of the review work for new Docker code fell on our few experienced Docker developers. Even documenting best practices for future Tasktop Docker developers has proved difficult, since each problem can be unique and doesn’t lend itself to a one-size-fits-all solution.

What about you?

What are your thoughts on strategies for introducing new tools and technology in a software company? Is it best to look for professional training, hire more developers with the necessary experience, or make it work with your own talented engineers? As our experience with Docker shows, there are countless ways to reach the desired result. We would love to hear about your approaches to introducing and rolling out new tools, technologies and methodologies.

Further reading

How to optimize your software development co-op/internship program