In his CM: the Next Generation series, Joe Farah gives us a glimpse into the trends that CM experts will need to tackle and master, driven by industry developments and future technology challenges.
Software standards have two main purposes: (1) They help systems work with one another, ensuring consistent interfaces and data exchange mechanisms; and (2) They help to define the use of best practices in a process or series of processes.
As a byproduct, they reduce costs. Standards ensure that a pool of pre-trained personnel is available on the job market. They reduce risk exposure: personnel responsible for processes and for developing standard interfaces and data exchange are more easily replaced, with less risk of knowledge loss, because the requirements identified by the standards are well defined. This, in turn, generally means that test suites, test plans, and measurements are available to monitor processes and to test new components. Standards are much more rigorous on the hardware side than on the software side. If the hardware pieces don't fit together, neither does the solution. So, too, with communication and network protocols, which are very standards oriented, whether for voice, data, or converged communications.
Software has many defined standards, but for the most part, the standards focus on process more than data and interfaces, with a few notable exceptions: ODBC for data exchange, compiler languages for consistency of programming, libraries and linkers for compatibility with operating systems/host platforms, and so forth.
But when it comes down to Software Configuration Management, standards are often relegated to some fairly high-level best practices: you must be able to reproduce a build; you must be able to demonstrate that the product covers the requirements; you must be able to support the product for N years. In my November 2007 article I listed a number of CM best practices. But many of these can hardly be called standard practices or even part of a larger standard.
There are a number of CM "standards" (guidelines or directives), sometimes particular to a single organization. And there are related process/quality standards which cover CM to some extent. These have been extremely useful to the SCM community over time. They've established some basic CM requirements. They've helped to shape CM processes. However, they are often "soft standards" or project/program-specific detailed standards.
They do not allow, for example, the transfer of information from one SCM tool/process set to another. Imagine a database standard that did not allow the information in repositories to be transferred from one DB vendor's tools to another. The reports, scripts, process enforcement, etc. of databases may not be transferable (as the standards for these are less developed). But at least the data can be transferred from one database to another and standard tools can be created which operate across databases of various vendors. Not so with SCM tools. There are some tools that treat problems and features the same. Some that do not require change packages. Some that don't have directories under revision control. And so on. How is such information to be passed between tools if we don't agree on what the information is? How are we supposed to gather common metrics across tools, or share development across SCM platforms?
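To make the interchange gap concrete, here is a minimal sketch of what a vendor-neutral record for a change package might look like. This is purely illustrative: no such SCM interchange standard exists, and every field name and function below is an assumption, not part of any real tool's format.

```python
import json

# Hypothetical, vendor-neutral record for one change package.
# Field names are illustrative assumptions only.
change_package = {
    "id": "CP-1042",
    "problem_refs": ["PR-77"],   # linked problem reports, if the tool tracks them
    "feature_refs": [],          # linked feature requests
    "file_revisions": [
        {"path": "src/main.c", "revision": "1.8"},
        {"path": "src/Makefile", "revision": "1.3"},
    ],
    # Empty when a tool lacks directory versioning -- one of the
    # capability mismatches the article describes.
    "directory_revisions": [
        {"path": "src", "revision": "1.2"},
    ],
}

def export_package(pkg):
    """Serialize a change package to a neutral JSON form another tool could import."""
    return json.dumps(pkg, indent=2, sort_keys=True)

def import_package(text):
    """Parse the neutral form back into a tool-specific structure."""
    pkg = json.loads(text)
    # A tool without change packages might flatten this into per-file commits here.
    return pkg

round_tripped = import_package(export_package(change_package))
print(round_tripped["id"])  # CP-1042
```

Even this toy schema exposes the problem: a tool that treats problems and features identically, or that has no directory versioning, has no faithful place to put some of these fields, which is exactly why agreeing on what the information *is* has to come before any transfer format.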
I'm just as much at fault as the next guy. I haven't demanded that the SCM industry produce standards for information interchange. In spite of trying to work with non-standards such as Microsoft's unofficial SCC API, or the Eclipse platform, I have not tried to force the issue on anyone. Instead, I've taken my own views, shaped somewhat, perhaps, by the rest of the industry, and have marched down the road in the manner I thought best. So where to go from here?
Unfortunately, the CM technology space spans several generations, from 1st generation tools dating back to the 1970s to the odd 3rd generation tool (and perhaps even a