Automation improves systems quality

Photo by Chris Ford

Here’s another example of improving both speed and quality. A while back I talked about how productivity tools improve software quality. The same is true of automation and productivity tools at the system level, as a recent experience reminded me, and as anyone in the devops movement will attest.

I was speaking to a systems engineer who was working with a large estate of Apache web servers. At the time he was tasked with ensuring that the production configs for the various environments were correctly reflected in version control. They weren’t. And not only that, the config files were inconsistent in their structure and logic. To complete his task fully (update all the config files and ensure consistency) would have been a monumental exercise.

It was clear to both of us that if generation of the files had been automated, this would have been a trivial exercise. That’s not because automation makes things faster, but because specifying something to a machine encourages (if not forces) you to be mechanically regular; tools like Puppet, Chef and Ansible encourage simple specifications. There would be few inconsistencies, and differences between configurations would be explicit.
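To make that concrete, here is a minimal sketch of what generated configuration looks like, in plain Python rather than any of those tools; every name, path and value below is hypothetical. The boilerplate lives once in a shared template, and each environment contributes only its explicit parameters, so the output files are regular by construction.

```python
# Minimal sketch of config generation (hypothetical names and values;
# a real estate would use Puppet, Chef or Ansible rather than raw Python).
from string import Template

# The boilerplate lives in one shared template...
VHOST = Template("""\
<VirtualHost *:${port}>
    ServerName ${server_name}
    DocumentRoot /var/www/${app}
    LogLevel ${log_level}
</VirtualHost>
""")

# ...and each environment is reduced to its essential, explicit parameters.
ENVIRONMENTS = {
    "staging":    {"port": 8080, "server_name": "staging.example.com",
                   "app": "myapp", "log_level": "debug"},
    "production": {"port": 80, "server_name": "www.example.com",
                   "app": "myapp", "log_level": "warn"},
}

# Every generated file has identical structure; only the parameters vary.
for env, params in ENVIRONMENTS.items():
    with open(f"httpd-{env}.conf", "w") as f:
        f.write(VHOST.substitute(params))
```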

Additionally, when you specify something with those tools you tend to describe the essential elements, while boilerplate and templating are handled by the framework and are therefore consistent. In other words, it’s not just about automation, but also about abstraction.
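Continuing the same hypothetical sketch: once an environment is just data, comparing two configurations becomes a trivial computation rather than an archaeology exercise across hand-edited files.

```python
# Hypothetical environment specs, as in the sketch above.
staging    = {"port": 8080, "server_name": "staging.example.com", "log_level": "debug"}
production = {"port": 80, "server_name": "www.example.com", "log_level": "warn"}

def spec_diff(a: dict, b: dict) -> dict:
    """Return each key whose value differs, with both values side by side."""
    return {k: (a.get(k), b.get(k))
            for k in sorted(a.keys() | b.keys())
            if a.get(k) != b.get(k)}

print(spec_diff(staging, production))
# {'log_level': ('debug', 'warn'), 'port': (8080, 80),
#  'server_name': ('staging.example.com', 'www.example.com')}
```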

There are caveats to this argument, of course. For example, tools like Puppet can also get very complicated and cause their own nightmares. But I would expect those complications to be nothing compared to those of managing the same estate by hand.

Automation of environment deployments and rebuilds is another example of that counter-intuitive outcome of the lean movement: simultaneous improvements in speed and quality.
