How Enterprise Configuration Management Architecture Fits with DevOps

Summary:
When it comes to DevOps, some practitioners mistakenly forget the fundamentals of CM. DevOps tools can be strategic assets, but they are not as important as established CM standards and processes. It's up to us as practitioners to ensure that the DevOps tool chain implementation supports the corporate CM policy.

What seems like a long time ago, I was one of those IT professionals who would spontaneously build and implement a service, expecting the business to embrace whatever the new capability of the day was with open arms. Sometimes I'd take some installation notes that I'd staple together and place in a file folder, never to be seen again. Just about as often, I wouldn't even bother with that. Those were the bad old days. I was young and foolish, but I wasn't alone.

Many of us matured over time and learned the value of configuration management (CM). CM is an essential function that makes the system processes we implement repeatable and dependable. CM promises that if the worst should happen, IT technical leadership can step in, replicate the environment, and restore services relatively quickly, having captured details from the initial system implementation. This represents a key deliverable toward business continuity management efforts. Ideally, CM provides a map that associates business requirements with sets of configuration items, enhancing the support desk's ability to manage systems throughout their lifecycles. It does so by establishing a baseline against which change management can be documented and communicated at its lowest and most pragmatic level.

Some of us started to write simple shell scripts to automate parts of implementations. Borrowing from our software developer colleagues, we started checking our code into a central repository. Over time, these scripts grew more sophisticated and evolved into playbooks of a sort, although they were far from standardized.
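To make that concrete, here is a rough sketch of the kind of script I mean. The package names, paths, and repository layout are placeholders for illustration, not taken from any real project:

    #!/usr/bin/env bash
    # Illustrative ad hoc install script of the sort we used to check into
    # a central repository. Package names and paths are placeholders.
    set -euo pipefail

    # Install the prerequisite packages
    sudo apt-get update
    sudo apt-get install -y nginx openssl

    # Keep a copy of the stock configuration before changing anything
    sudo cp /etc/nginx/nginx.conf /etc/nginx/nginx.conf.orig

    # Apply our site-specific configuration, versioned alongside this script
    sudo cp ./conf/nginx.conf /etc/nginx/nginx.conf
    sudo systemctl restart nginx

Checked into a repository with a simple git add and git commit, even a script this small gives the next engineer something repeatable to start from.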

I still do this during the proof-of-concept phase of small projects. I'm currently working on an Active Directory domain controller based on Samba v4, hosted on Ubuntu on a Banana Pi platform, for my local Linux users group. I've scripted the prerequisite package installations and created diff files for the configuration changes.
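For that project, the prerequisite script looks roughly like the sketch below. The realm and domain values are examples, and the exact package list may vary by Ubuntu release:

    #!/usr/bin/env bash
    # Sketch of the prerequisite install for a Samba v4 AD domain controller
    # on Ubuntu. Realm and domain values are examples, not the real ones.
    set -euo pipefail

    sudo apt-get update
    sudo apt-get install -y samba krb5-config krb5-user winbind smbclient

    # Provisioning requires the stock smb.conf to be out of the way
    sudo mv /etc/samba/smb.conf /etc/samba/smb.conf.orig

    # Provision the domain (values below are placeholders)
    sudo samba-tool domain provision --use-rfc2307 \
        --realm=EXAMPLE.LAN --domain=EXAMPLE \
        --server-role=dc --dns-backend=SAMBA_INTERNAL

    # Configuration changes live in the repo as diff files and are applied
    # against the generated files rather than edited by hand
    sudo patch /etc/samba/smb.conf < ./diffs/smb.conf.diff

    # On Ubuntu the AD DC service ships masked, so enable it explicitly
    sudo systemctl unmask samba-ad-dc
    sudo systemctl enable --now samba-ad-dc

The point isn't the particular commands; it's that the whole install is captured as versioned artifacts rather than in my memory.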

I became aware that the open source community had organized tools such as Ansible, Puppet, SaltStack, and Chef, and that people like me were publishing their playbooks. For me, these tools represent a peer-reviewed augmentation of, and potentially a replacement for, software installation and configuration manuals, which are frequently poorly written, outdated, or absent.
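Consuming one of those published playbooks can be as simple as the commands below. The role and inventory names are hypothetical stand-ins for whatever community content you've actually reviewed and trust:

    # Install a community-published role and run the playbook that uses it.
    # Names here are hypothetical placeholders.
    ansible-galaxy install some_author.some_role
    ansible-playbook -i inventory.ini site.yml --check   # dry run first
    ansible-playbook -i inventory.ini site.yml

Even this small step moves installation knowledge out of a private binder and into something reviewable and repeatable.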

Coincidentally, this is about the same time I became aware of formal configuration management beyond source code management (SCM). Before this, my ideas of configuration management were based on common sense, intuition, and good intent, but they lacked consistency and portability. It became obvious that systems engineers must evolve or become irrelevant. I needed to embrace some of the discipline and methodologies of the software development community in order for my work to become less context-dependent. This is how I first became aware of the term DevOps.

For me, DevOps is a combination of IT management frameworks, processes, and capabilities. It appears reasonable to say that DevOps tool chains are for the most part the same tool chains we've been using for years—tools like architectural models, Gantt charts, Mercurial and Git for SCM, support desk ticketing software, and reporting tools. More recently, we've added tools like security information and event management, continuous integration, and continuous delivery, but these are still just capabilities.

The maturity of applied DevOps depends in part on how well these tools are integrated into established IT governance frameworks in support of business process requirements. This alignment may be enhanced through integration and automation via business intelligence tools, much as it is for other business processes. I would support a decision to use business analysis tools and business analysts (if available) to develop a business process engineering and improvement program specifically targeted toward applied DevOps.

I'm aware of, and becoming familiar with, a great number of tools such as Jenkins and Rundeck, which provide the DevOps environment with integration, automation, and management capabilities. These and similar tool chains provide an architecture that lets systems engineers meet increasingly demanding requirements. In isolation, these tools hold limited value, so they must be used in conjunction with other tools such as SCM and package management repositories, and they should support documented architectural models. It's up to us as practitioners to ensure that the corporate CM policy is based on EIA-649B or a similar standard and that our DevOps tool chain implementation supports that policy.
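As a sketch of what that conjunction looks like in practice, consider a hypothetical deployment job of the kind a Jenkins or Rundeck executor might run. Every name, URL, and version below is a placeholder; the point is that everything deployed is traceable back to SCM and the package repository, which is what lets the tool chain support the CM policy rather than bypass it:

    #!/usr/bin/env bash
    # Hypothetical deployment job for a Jenkins or Rundeck executor.
    # All names, URLs, and versions are placeholders.
    set -euo pipefail

    APP_VERSION="${1:?usage: deploy.sh <version>}"   # explicit, auditable input

    # Configuration comes from SCM at a known tag, never hand-edited on the host
    git clone --branch "v${APP_VERSION}" \
        https://scm.example.com/ops/app-config.git /tmp/app-config

    # Binaries come from the package repository, pinned to the same version
    sudo apt-get update
    sudo apt-get install -y "example-app=${APP_VERSION}"

    # Apply the versioned configuration and record what was deployed
    sudo cp /tmp/app-config/app.conf /etc/example-app/app.conf
    sudo systemctl restart example-app
    logger -t deploy "example-app ${APP_VERSION} deployed from app-config v${APP_VERSION}"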

All this being said, I don't foresee DevOps tools replacing established CM standards and processes, or even becoming equally important, at least not in the immediate future. DevOps tools can be a strategic asset, but in the end they are just a means to express applied DevOps, a realization of IT CM as an emerging capability. They have the potential to reduce human error and dramatically decrease time to implementation, but they do what we would otherwise do manually.

We must remain mindful that within IT, configuration management is vital, but CM as a discipline exists outside IT. The IT community is merely a consumer of this capability. I think it's more productive to get IT leadership professionals thinking about how to fulfill the role of business analyst in the applied DevOps process engineering and improvement context, and to discuss the integration of different governance frameworks—specifically, where these frameworks present conflict.
