
Backwards Compatibility and Open Standards

There has been much discussion of late within OGC, OASIS, and ISO on the issue of backwards compatibility – but much less action.  Some initiatives, such as the attempt to harmonize UML models across the various ISO specifications, are laudable.  Others, however, such as the introduction of ebRIM 4.0 (which is not compatible with version 3.0) and the OGC ebXML RegRep geospatial extensions, or the apparent assumption within the OGC that a major revision of a standard requires the abandonment of backwards compatibility, are deeply troubling.

Backwards compatibility is not simply a nice-to-have feature; it is essential to the whole process of open standards development and adoption.  The existence of open standards implies a coupling of the development activities of all those who support the standard – software vendors, system integrators, end users, and the developers of dependent standards.  For such an enterprise to be successful, we must continuously support an implicit co-operation between all of these different actors, and without a significant measure of backwards compatibility this co-operation is in peril.  Software component and tool vendors will not be able to justify expenditures if they perceive specifications to be unstable, or if keeping pace with a changing specification constantly requires new investment.  They need stability and a consistent growth path.  Without the components and tools, systems integrators cannot deliver cost-effectively, and both they and their end-user customers will suffer as a result.  The additional training costs of not meeting backwards compatibility objectives are, of course, then borne by everyone.

Modern open standards, with the exception of some fairly low-level standards like IP (RFC XXX), do not stand alone.  Rather, they are parts of a larger web of standards.  The aviation information standard AIXM, for example, depends on GML, in which it is written.  GML, in turn, depends on XML, XLink, XML Schema, and a range of ISO standards (TC 211 19111, 19107, 19108, 19109, etc.), and these specifications, in their turn, depend on many others from ISO, the W3C, and the IETF.  Changing one standard in the web has implications for many others and, hence, for a potentially much larger community of software developers, vendors, integrators, end users, and standards developers.
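The ripple effect described above can be sketched as a small dependency graph.  The dependency lists below are illustrative and far from exhaustive – the point is only that an incompatible change to one standard transitively touches every standard (and every community) built on top of it:

```python
# Illustrative only: each standard lists some of the standards it
# directly builds on (not a complete dependency inventory).
DEPENDS_ON = {
    "AIXM": ["GML"],
    "GML": ["XML", "XLink", "XML Schema",
            "ISO 19107", "ISO 19108", "ISO 19109", "ISO 19111"],
    "XML Schema": ["XML"],
    "XLink": ["XML"],
}

def impacted_by(changed, graph=DEPENDS_ON):
    """Return every standard that transitively depends on `changed`,
    i.e. everything affected when `changed` breaks compatibility."""
    impacted = set()
    frontier = {changed}
    while frontier:
        # direct dependents of anything in the current frontier
        dependents = {s for s, deps in graph.items()
                      if frontier & set(deps)}
        frontier = dependents - impacted
        impacted |= dependents
    return impacted

# An incompatible change to XML Schema ripples up through GML to AIXM:
print(sorted(impacted_by("XML Schema")))  # ['AIXM', 'GML']
```

A change at the bottom of the graph (XML itself) would reach everything above it, which is exactly why the low-level standards have historically been the most stable.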

Of course, we cannot expect to get it right the first time, and hence revisions of standards are an essential part of the process.  Nor will we encompass all of the needed scope the first time around, and so the scope of many standards can be expected to grow over time as well.  We cannot be opposed on principle to revision for correction or revision for extension.

What we can expect, however, is that backwards compatibility be given a very high position in the list of criteria applied in the revision process.  We should almost never break compatibility unless we absolutely have to.  Sometimes a specification that is “incorrect” is better left uncorrected, as difficult as that might be for some.  Sometimes the “simplification” of a standard may only lead to an overall increase in complexity in the marketplace.

I would strongly urge those working in the various standards bodies critical to the evolution of the GeoWeb (W3C, OGC, OASIS, ISO/TC 211) to take the criterion of backwards compatibility very seriously and to enact approaches and regulations that make it difficult to put forward a standard revision that breaks this compatibility.  Breaking backwards compatibility should be considered a significant act, and considerable effort should be expended to prevent it.  Proponents of changes that break backwards compatibility should be required to show that the break is essential and that there is no alternative.  Furthermore, I believe that specification designers must seek, at all times, the means to make the transition from one revision to another as painless as possible.

The current belief in the OGC that a major revision of a specification (e.g. from GML 3.0 to GML 4.0) REQUIRES breaking backwards compatibility is a complete misunderstanding, and turns the whole process on its head.  As I understand it, what is actually intended is that, while backwards compatibility is a dominant objective for minor revisions, it CAN be relaxed (if necessary) when moving to a new major version.  This is NOT AT ALL the same thing as saying that a major revision breaks backwards compatibility.
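The distinction can be made precise with a minimal sketch.  The version tuples and function name below are my own illustration, not any OGC rule: minor revisions must preserve compatibility; a major revision merely loses the guarantee, which is not the same as being required to break it.

```python
def may_assume_compatible(old, new):
    """True when a consumer built against `old` (major, minor) is
    guaranteed, under this policy, to still work against `new`.
    A False result means "no guarantee" -- NOT "must break"."""
    old_major, old_minor = old
    new_major, new_minor = new
    # Compatibility is only guaranteed within a major version line.
    return new_major == old_major and new_minor >= old_minor

# A minor revision within the 3.x line: compatibility guaranteed.
print(may_assume_compatible((3, 0), (3, 2)))  # True
# 3.x -> 4.0: the guarantee lapses, but breakage remains optional.
print(may_assume_compatible((3, 2), (4, 0)))  # False
```

The misunderstanding I describe treats the second result as a mandate to break things, when it is only a relaxation of the first guarantee.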

The other issue in all of this is the pace of specification change.  Many people believe, I think incorrectly, that since technology changes rapidly, standards should as well.  To some extent, I think the opposite is true in the field of information technology.  Remember that, in information technology, we are building (at least in the core infrastructure) a permanent and evolving system.  There is only one Internet.  There is only one World Wide Web (of documents).  The Internet Protocol (IPv4) has remained substantially unchanged since the 1970s.  Rather than a whirligig world of constant change, we can see ourselves as building a wall – layer by layer – with the hope and intent that the substantial essence of these layers will be around for a very long time.  As in the building of any permanent structure, care and patience are at least as valuable as speed and the issues of the moment.