Managing Commodity Products And Services

It is important to understand the role and the limitations of process standards such as CMMI and ITIL, and of technical standards such as UML, XML, SOAP, and others. Adopting such standards does not automatically increase software quality or software development productivity.

The increase in IT-related standards since the invention of the Web in 1989 can be seen as an indication of the maturity of the IT industry. Ten years ago only a small fraction of personal computers were connected to the Web. Today, all kinds of devices that contain software - mobile phones, digital cameras, iPods, cars, and even washing machines - provide interfaces that allow them to communicate with other devices. Similarly, in the realm of enterprise software, today's applications are typically interconnected with numerous other systems, across organizational boundaries and across a range of implementation technologies. The development and operation of such heterogeneous systems is often performed by distributed, global teams. No wonder that more and more organizations are taking a serious interest in software, process, and quality standards to facilitate interconnectivity and communication. But the adoption and implementation of standards come at a price. Which standards should an organization embrace? The answer depends heavily on the nature of the business, and it is highly recommended to do some homework before committing to achieve compliance with specific standards.

Standardization is a process riddled with difficulties. It starts with the recognition of the need for a standard, continues with getting those with a common interest to agree on the scope of the standard, and ends with developing the standard itself. Collectively, on a global scale, users of information technology exert significant pressure on IT suppliers to develop and adhere to standards, reminding them that failure to do so will make them uncompetitive. This pressure has forced IT vendors and service providers to create and support organizations such as the World Wide Web Consortium, the Object Management Group, and the Carnegie Mellon Software Engineering Institute, which pledge to deliver standards for software interoperability and for assessing service quality.

The most successful standards are those that apply to software that is widely considered a commodity. Standards such as HTML and TCP/IP fall into this category. They provide the foundation for countless applications.

In contrast, attempts to develop useful standards for interoperability at higher levels of abstraction have either failed or have led to standards of limited usefulness: take, for example, the XMI standard for exchanging UML model information. Interoperability between enterprise software packages represents the extreme end of the spectrum, where practical interoperability is close to zero. Simply ask any IT manager in a large organization about the costs involved in integrating enterprise software packages from different vendors.

Besides technical software standards, there is a growing number of standards that relate to IT service delivery, software development processes, and quality assurance measures. Two flagship standards in this space are the CMMI standard for process improvement and the ITIL standard for IT service management. The former is popular with software development service providers, and the latter is establishing itself as a core standard for managing IT service delivery.

Standards Development

In organizations that use rather than develop information technology solutions, there is a tendency to overestimate the benefits resulting from the adoption of standards. The best way to bring expectations down to a realistic level is to mentally step into the shoes of an IT vendor and to examine the role of standards from the other side of the fence.

  • Standardization is triggered either by competitive pressure in a mature industry where the prices for basic services have declined significantly, or it is triggered pro-actively by industry consortia whose members would like to sell new products that require a new technology platform as an operational basis.
  • When standardization occurs as a reaction to market pressure, competing vendors are forced to consider the development of a common platform that provides basic functionality. Vendors cooperate on standards and compete on implementations. Typically each established vendor in a consortium-driven standardization initiative pushes an agenda that reflects its specific area of specialization. The result of such design-by-committee activities is predictable: bloated, overly complex standards, and implementations that offer minimal interoperability.
  • In the case of pro-active standardization the situation is somewhat better, provided that the new target market is perceived to be large enough to leave sufficient room for a range of vendors over a substantial period. Examples of pro-active standardization can be observed in the telecommunications industry, where the business models of network operators, hardware vendors, and software providers critically depend on collaboration on standards development.
  • The last five years have demonstrated that there are viable alternatives to consortia-led standards development. The practice of mandating Open Source reference implementations for software standards has finally given credibility to the term Open Standard. The license attached to the open reference implementations of the W3C, for example, makes implementations available to all organizations worldwide, whether or not they are W3C members, and the license is not conditioned on payment of royalties, fees, or other consideration.

Bruce Perens, who announced "Open Source" to the world, and who published his first Open Source program in 1987, sums up the problem of consortia-based standard development as follows:

In the consortium projects, there's always the handshake with one hand and a dagger in the other.

Standards by necessity are part of the public domain, and certification of standard compliance is either performed directly by a standards authority or by an appropriately qualified agent of the standards authority.

Technology standards mostly focus on interoperability, which implies a common technology platform. In theory interoperability should entail exchangeable implementations, but in practice this is not always the case. User pressure can easily lead to premature agreement on, and release of, a standard. In the interest of generating sales, vendors will happily conform to the standard in the full knowledge that it does not cover important features.

Again, the evolution of the XMI standard illustrates this point. The initial versions of the standard did not cover the preservation of UML diagram layouts. In practice this meant that XMI was only suitable for one-off data migration exercises, and impractical in an environment where different users work with different UML tools. Even today, interoperability between UML tools is not good enough for individual users in a team to work with different tools. Vendors are satisfied as soon as their product passes certification and can wear the badge of "acme standard-compliant", because the badge enables them to sell the product. Subsequent complaints from users about lack of interoperability may be ignored for years, until sagging sales finally force the key vendors to revisit the standard.

Wherever standardization is truly successful, basic functionality is provided as part of all competing offerings and becomes a non-differentiating feature. Vendors then concentrate on value-added services in one or more areas of specialization. The original playing field turns into a commodity. In other words, standardization of non-differentiating features goes hand-in-hand with a higher degree of specialization amongst vendors.

Commoditization finally brings about economies of scale in terms of producing goods that conform to the standard, and the availability of training and skilled resources.
[Figure: Standards and specialization]

Guiding principles for successful standard adoption

Standards are created with the intention of serving as a common denominator that simplifies or improves some aspect of operating a business or a technology. Only invest in adopting a standard if there is clear evidence that the standard adds value in your particular context. Consider the following factors:

  • Applicability to your industry
  • Suitability for the scale of your organization
  • Stability - is there a risk of frequently having to upgrade to a new version of the standard?
  • Maturity - (a) size of the user base, and (b) emerging competing standards
  • Financial value - can the benefit of adopting the standard be estimated in financial terms, and does the estimate compare favorably with the cost of implementing the standard?

In some cases there is no standard maintained by a dedicated standardization body, but there may be valuable de-facto standards worth adopting. In the area of IT processes in particular, the development of standards lags considerably behind the state of the art. The key to successful exploitation of IT standards lies in finding the optimum balance between the costs and benefits of compliance and the need for specialization to gain a competitive edge.

Increasing quality and productivity

Since standards live in the public domain, and are accessible to all your competitors, it is obvious that their implementation will do little to improve your competitive advantage. In areas that are close to the core of your business, compliance with relevant standards is necessary but certainly not sufficient for survival. This observation applies to developers of IT products as well as to users of IT products and services. Deep subject domain knowledge is required to optimize business processes, to exploit automation, and to increase productivity. Progress in these areas is not easily achieved via industry standards alone.

A common trap that organizations fall into is believing that CMMI maturity levels correlate with specific, benchmarked levels of productivity. CMMI can be used to increase service and product quality, and although that often also results in increased productivity, it need not be the case. In contrast, the Agile Manifesto de-emphasizes the importance of plans and formal processes, and relates directly to productivity and the ability to respond to change.

|              | CMMI                                     | Agile Manifesto                                              |
|--------------|------------------------------------------|--------------------------------------------------------------|
| Scale        | Large organizations                      | Small teams of up to ten people                              |
| Major Goal 1 | Quality                                  | Productivity                                                 |
| Major Goal 2 | Repeatability                            | Ability to respond to change                                 |
| Major Goal 3 | Scalability of processes                 | Optimal use of individual skills and experiences             |
| Major Goal 4 | Ability to perform process improvements  | Ability to deliver results that meet current customer needs  |

The table above is not intended to offer a complete comparison, but it highlights the difference in perspective. CMMI is suitable as a guide for process improvements in large organizations that require scalability. However, ultimately even large organizations can be modeled as collections of small teams. Thus the productivity of both large and small organizations can benefit from applying agile project management techniques at the level of individual projects.

In relation to software development in particular, the Agile Manifesto also contains important lessons that have nothing to do with scale, and that need to be applied in order to avoid the drawbacks of traditional methodologies:

  • Software is highly abstract, and software requirements even more so. Precise requirements are essential, but due to the abstract nature of software it is easy to reach a point where adding further detail amounts to speculation, and the level of accuracy of the specification starts to decrease. A significant number of potential misunderstandings between the client and the software development team can't be uncovered until a tangible prototype system is available for validation. Hence the time spent up-front on elaborating software requirements needs to be limited. From a certain point onwards it costs less to "burn a pancake", i.e. to build a first iteration of a system and to validate working software under construction, than to continue validating abstract requirements.
  • A large proportion of software development has the flavor of design activities rather than the flavor of construction activities. In software development, especially when using Model Driven Software Development, pattern-based activities can be easily automated (see the sketch after this list). Once automated, software construction activities don't take up significant time, and what is left consists of software design, building and validating prototypes, and project management.
  • The two points above reinforce each other. Traditional engineering disciplines deal with recurring variants of well-understood construction problems. In contrast, although basic software engineering principles are well-understood, the fast pace of evolution of hardware and software implementation technologies means that the majority of software systems are built with technologies or approaches that only have a track record of a few years at most. Typically each software project has to design the construction patterns and templates that are appropriate for a project-specific combination of implementation technologies.
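
To make the second point concrete, here is a minimal sketch of pattern-based code generation. The entity model, the template, and all names are illustrative assumptions, not part of any particular MDSD toolchain:

```java
import java.util.List;
import java.util.stream.Collectors;

// A toy model-driven generator: once a construction pattern is captured
// as a template, producing the repetitive class skeletons is automatic,
// leaving design, prototyping, and validation as the human activities.
public class EntityGenerator {

    record Field(String name, String type) {}

    static String generate(String entity, List<Field> fields) {
        String declarations = fields.stream()
            .map(f -> "    private " + f.type() + " " + f.name() + ";")
            .collect(Collectors.joining("\n"));
        return "public class " + entity + " {\n" + declarations + "\n}\n";
    }

    public static void main(String[] args) {
        // The "model": an entity name plus typed fields (hypothetical).
        System.out.println(generate("Customer", List.of(
            new Field("name", "String"),
            new Field("creditLimit", "int"))));
    }
}
```

In a real project the templates would of course cover persistence, interfaces, and other recurring construction patterns, while the model would be maintained in a dedicated modeling tool.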

Scalability of agile software development is achieved by using a divide-and-conquer technique to break up large programs of work into much smaller projects that can be managed with an agile approach. The circle closes when it comes to integrating and co-ordinating a range of relatively autonomous, agile projects. Here it is essential that well-defined work products are used to transfer deliverables between projects, and that a robust quality assurance regime ensures that the majority of defects are caught at the earliest possible stage.

The main value of agile methodologies lies in their ability to fully exploit the capabilities of the most experienced and specialized staff. This is an aspect that is somewhat orthogonal to the idea of process standardization, and hence it is at risk of being overlooked when taking a simplistic route towards CMMI maturity assessments and certification.

Recommendations

  • When considering process improvement using CMMI or a similar approach, be careful not to reach too high too fast. If increased productivity is an important objective, harvest the low-hanging fruit by using agile software project management in individual projects to build a base of expertise. Agile methodologies such as Extreme Programming are ideal to instill the level of discipline required to consistently achieve high-quality project outcomes. Subsequently the focus can shift to climbing the ladder of CMMI maturity levels and to scaling up.
  • Analyze your current IT project portfolio, and consider breaking up large projects into smaller sub-projects that can be managed using an agile approach.
  • Provide each project with sufficient autonomy and resist the urge to standardize intermediate work products and process steps. Integration across projects can be achieved by synchronizing timeboxed iterations, and by standardizing the format of final deliverables that are transferred between projects.
  • Take into account the fact that teams of fewer than ten people undertake the vast majority of software projects. If you need to scale up a project beyond a team of ten people, an agile methodology such as Extreme Programming is not enough, and needs to be complemented with techniques that specifically address the aspect of scale.

The increasingly strategic role of Open Source Software

When it comes to the design and implementation of an Enterprise Architecture, the key software acquisition decisions have traditionally centered on the build vs. buy conundrum. In the last five years, however, many Open Source infrastructure software offerings have matured to the point of being rated best-in-class solutions by experienced software professionals. The motivation to adopt Open Source components is very similar to the motivation to adopt standards, i.e. avoidance of vendor lock-in. Some Open Source components, such as the Apache web server or the JUnit tool, act as de-facto standards in specific domains. This means that build vs. buy decisions need to be extended to build vs. buy vs. Open Source decisions.

The success of OSS in the realm of infrastructure (i.e. commodity) software is best explained by comparing the traditional software product development life cycle with the evolutionary life cycle of OSS.

[Figure: Traditional software product development life cycle]

The contrast to the life cycle of OSS is significant.

[Figure: Open Source software development life cycle]

SourceForge (www.sourceforge.net), the largest repository of OSS, now hosts nearly 120,000 projects, most of which are targeted at organizations that develop software. Obviously many of these projects represent dead wood, but this still leaves a staggering number of high-quality Open Source components. For each architectural concern that needs to be addressed in an Enterprise Architecture, there is typically a range of Open Source options to choose from.

[Figure: Infrastructure software]

Evaluating and selecting Open Source components

The selection process for OSS is similar to the selection process for proprietary software products. Instead of performing due diligence on a vendor, due diligence needs to be performed on the maturity of the community that supports the OSS under consideration, to ascertain the risks regarding quality and availability of support. Matters which need to be considered are:

  • The number of downloads is an indicator of popularity and can serve as a rough indicator of quality.
  • Ideally the original contributors provided a substantial initial contribution, which has subsequently attracted a sizable group of active external contributors. Open Source projects that start with a blank sheet easily fail to generate sufficient momentum, and are subject to the risks of design by committee.
  • The online forums related to the project are a good place to obtain feedback on functionality, stability, and the quality of support available through the community. Check whether the forum is active, and look at the defect statistics: a large cumulative number of reported defects that no longer rises sharply over time typically indicates a mature project.
  • Well-established and widely used Open Source projects typically have been covered by a range of technical books written by core developers and users. The presence of related books is another good indicator of project maturity.

Raising the level of abstraction

The earliest important milestone in IT standardization was arguably the development of the ASCII standard for the digital encoding of characters in 1963. The ASCII code was originally used in teleprinters, and it survives to this day, embedded in the 128 lowest code points of the Unicode character encoding standard. ASCII enables the most basic technique for interfacing between two software systems, namely writing and reading information from text files or streams, a technique that acts as a foundation for more sophisticated communication protocols at higher levels of abstraction.
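
The following minimal Java sketch illustrates that basic technique: two systems that agree on nothing more than an ASCII-encoded byte stream can already exchange information. The message content is made up for the example:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class AsciiStreamDemo {
    public static void main(String[] args) throws IOException {
        // The sending system encodes a message as ASCII bytes; the agreed
        // character encoding is the entire "interface contract".
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (Writer out = new OutputStreamWriter(buffer, StandardCharsets.US_ASCII)) {
            out.write("GET /status\n");
        }

        // The receiving system decodes the same byte stream.
        BufferedReader reader = new BufferedReader(new InputStreamReader(
            new ByteArrayInputStream(buffer.toByteArray()), StandardCharsets.US_ASCII));
        System.out.println(reader.readLine()); // prints: GET /status
    }
}
```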

The next major steps in technical software standards occurred in the mid 1980s, with the release of the first SQL standard in 1986 and the development of the TCP/IP protocol that was adopted as the standard for computer communication by the US military in 1984, and that still powers the internet today. The work on the foundations for SQL and TCP/IP goes back to the early 1970s. Hence it took more than a decade from the initial concepts to a first version of a related standard. But the slow speed of standardization should not detract from the value of the results. The scale and reliability of systems that can be built with SQL and TCP/IP far surpasses the scale and reliability of systems that could be built previously.

The TCP/IP protocol is also a good example of the importance of raising the level of abstraction in order to manage the complexity of communication in a heterogeneous network of systems. The TCP/IP protocol stack consists of four layers (from top to bottom: application, transport, network, link), where each layer encapsulates the use of the protocol of the next lower layer.
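
The layering is visible even in a trivial client. In the sketch below, an application-level text protocol (HTTP is used as the example) rides on TCP via Java's Socket API, which in turn hides the network and link layers entirely; it assumes outbound network access to example.org:

```java
import java.io.*;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class LayeredClient {
    public static void main(String[] args) throws IOException {
        // Transport layer: the Socket API speaks TCP and encapsulates
        // segmentation, retransmission, and ordering. The network (IP)
        // and link layers are handled by the operating system below it.
        try (Socket socket = new Socket("example.org", 80)) {
            // Application layer: a line-oriented text protocol on top.
            Writer out = new OutputStreamWriter(
                socket.getOutputStream(), StandardCharsets.US_ASCII);
            out.write("HEAD / HTTP/1.0\r\nHost: example.org\r\n\r\n");
            out.flush();

            BufferedReader in = new BufferedReader(new InputStreamReader(
                socket.getInputStream(), StandardCharsets.US_ASCII));
            System.out.println(in.readLine()); // e.g. "HTTP/1.0 200 OK"
        }
    }
}
```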

The increasing level of abstraction in communication protocols occurred in parallel with increases in the level of abstraction offered by programming languages, via the transition from processor-specific assembly languages to hardware- and operating-system-independent third generation languages (3GLs) such as COBOL and C. Later, object oriented languages such as C++ were developed, which provided explicit constructs for abstraction and encapsulation. The development of 3GLs led to standards for each programming language that theoretically ensured source code could be compiled and applications could run on different platforms. In practice, however, each hardware and operating system platform typically forced software developers to work with very specific underlying infrastructure in order to deliver useful functionality. This meant that programming language standards merely ensured a standardization of language syntax and semantics; they did not lead to easily portable software, and certainly not to platform-independent code.

This lesson still applies today, as can be seen in the convoluted history of the Java EE standard. In order to facilitate unproblematic adoption across hardware and operating system platforms, Java started off as a bare object oriented language without an extensive set of well-designed high-level APIs. This approach implied a big step backwards in terms of productivity, but the enterprise market bought into the standards-based approach, hoping to shake off the technology stack lock-in imposed by traditional Computer Aided Software Engineering (CASE) tools.

Of course we have achieved operating system and hardware independence in many domains in which the Java language is used, but most enterprise applications depend on a quasi-unique combination of infrastructure components, so that reuse of individual components in a different context remains elusive, and interfacing different systems typically remains an expensive exercise.

To solve the software productivity problem, the OMG announced the Model Driven Architecture (MDA) initiative in the year 2000. Since then the OMG has worked on raising the level of abstraction of its standards beyond CORBA, via the development of MDA-related standards such as the Meta Object Facility (MOF), a standard for defining meta models, and Query/View/Transformation (QVT), a standard for expressing model transformations that is still in the process of publication. Overall, the speed of progress in the last few years in terms of interoperability standards for model driven tooling matches the speed seen earlier in the development of the SQL and TCP/IP standards. Therefore a widely adopted MDA standard is probably still another five years away.

In the meantime, Microsoft joined the model driven club in 2003 with its Software Factories initiative, and a large range of Open Source components for model driven software development have emerged and have had time to mature.

Beyond standardization

The majority of software systems are implemented in general-purpose languages such as Java and C#. Additionally, modeling languages such as the Unified Modeling Language provide a formal visual syntax to describe the structural aspect and part of the behavioral aspect of object oriented software systems.

The limitations of general-purpose languages and of standards based software development raise questions about alternative paths to increase software quality and software development productivity. This is where domain specific (special purpose) languages come in. Whenever deep domain knowledge is available, it is usually possible to design formal specification languages that directly tap into domain specific terminology. In fact all advanced applications of model driven software development approaches involve domain specific languages (DSLs), i.e. languages that incorporate domain concepts as first class language elements.

Familiar DSLs include end-user programming tools such as spreadsheets and symbolic mathematics systems. Further practical examples include languages to specify electrical networks, languages to specify user interfaces for mobile phones, and languages to express pricing methodologies for specific types of products. A good DSL minimizes the notational gap to the problem domain. Hence a DSL is close to the natural language of a domain expert, and specifications expressed in a DSL tend to be very compact. DSLs can therefore be a very effective antidote to the problems of one-size-fits-all approaches.
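
As an illustration of the pricing example, here is a minimal sketch of an internal DSL embedded in Java. The domain, the rules, and all names are hypothetical; a production DSL would more likely have its own external syntax and tooling:

```java
import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.List;

// A hypothetical internal DSL for pricing rules: domain concepts such as
// discounts and order-value thresholds are first-class language elements.
public class PricingDsl {

    record Rule(BigDecimal threshold, BigDecimal discountPercent) {}

    private final List<Rule> rules = new ArrayList<>();

    static PricingDsl pricingFor(String productCategory) {
        return new PricingDsl(); // the category is ignored in this sketch
    }

    PricingDsl discount(double percent, double aboveAmount) {
        rules.add(new Rule(BigDecimal.valueOf(aboveAmount),
                           BigDecimal.valueOf(percent)));
        return this; // returning 'this' yields the fluent, sentence-like style
    }

    BigDecimal priceOf(BigDecimal listPrice) {
        // Apply the best discount whose threshold the order value reaches.
        BigDecimal best = BigDecimal.ZERO;
        for (Rule r : rules) {
            if (listPrice.compareTo(r.threshold()) >= 0
                    && r.discountPercent().compareTo(best) > 0) {
                best = r.discountPercent();
            }
        }
        return listPrice.subtract(
            listPrice.multiply(best).divide(BigDecimal.valueOf(100)));
    }

    public static void main(String[] args) {
        // Reads close to the domain expert's own language:
        PricingDsl pricing = pricingFor("bulk-chemicals")
            .discount(5, 1_000)     // 5% off orders above 1,000
            .discount(10, 10_000);  // 10% off orders above 10,000
        System.out.println(pricing.priceOf(new BigDecimal("20000"))); // 18000.0
    }
}
```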

The use of DSLs has a long track record, in particular in the telecommunications domain. Most DSLs are not commodities; usually they are custom-built and contain core intellectual property of an organization. It will be interesting to see at what point a mature MDA standard will have a practical impact on the development of DSLs. In the absence of an industry standard, the Open Source implementation of a simplified variant of the OMG's MOF standard contained in the Eclipse EMF project offers a practical platform for DSL development. This is yet another example of an emerging Open Source de-facto standard moving faster than a consortia-based standardization process.

The value of certification

The authority that owns a standard usually also provides a certification regime that enables implementations to be validated against the specifications. In general certification is a very effective means to ensure that implementations meet the requirements laid out by the standard.

Implementations of technical software standards can easily be validated against automated tests to confirm compliance. This adds significantly to the value of the standards, and it minimizes the cost of certification. Compliance with process and service standards is much harder to assess and establish. Any standard that does not consist of a formal specification that can be validated automatically requires appropriately qualified human assessors, and ultimately verification of compliance relies on the judgments provided by these assessors.
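
As a minimal sketch of such automated compliance validation, consider a JUnit test that checks an encoding implementation against values prescribed by the ASCII standard. The test subject here is simply the JDK's own charset support; in a real certification suite, thousands of such cases would be generated from the specification:

```java
import static org.junit.Assert.assertArrayEquals;

import java.nio.charset.StandardCharsets;
import org.junit.Test;

// Compliance with a formal technical specification can be checked
// mechanically, which keeps the cost of certification low.
public class AsciiComplianceTest {

    @Test
    public void encodesLettersAccordingToTheAsciiStandard() {
        // The ASCII standard assigns 'A' the value 65 and 'Z' the value 90.
        byte[] encoded = "AZ".getBytes(StandardCharsets.US_ASCII);
        assertArrayEquals(new byte[] {65, 90}, encoded);
    }
}
```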

Yet, given that more and more software is developed and operated by global teams, some degree of process standardization is certainly desirable. Here we reach another limit of what can be achieved with standards. Standards that define activities carried out by IT professionals are only useful to a certain level of detail. Small, experienced teams in particular need only minimal process guidance. Inexperienced teams need more guidance, but if too much "process" is added in the interest of quality assurance, the productivity and motivation of staff drop noticeably.

In terms of lines of code, only a small fraction of the software developed is used in life-critical systems where large investments in process compliance and certification are essential.

The cost of increasing the level of standardization of manual business processes should always be compared to the cost of fully automating those same processes. The core value of highly qualified and talented IT staff lies in their ability to innovate and automate, not in their capacity for executing strictly prescribed sequences of micro-tasks. This is a lesson that is sometimes forgotten in large software organizations. At the opposite end of the spectrum, software start-ups often don't know at what point they should start to think about process standards, and may have difficulty settling on a level of standardization that is appropriate for the size of the business.

The optimum balance between agility and standardization depends on many factors, and there is no universal rule. The issue is complicated by the fact that organizations are not static, but are constantly evolving.

Conclusion

This chapter described the increasing relevance of IT standards and the factors that influence the rate of standardization observed in the industry. Standardization is not a silver bullet, yet skilled and selective adoption of appropriate standards provides the substrate for growing a successful IT organization. The chapter also explored the limits of standards. Building a strong competitive edge requires going beyond industry standards, and creating an environment that fosters innovation.

Overview of key standards organizations

  • Agile Alliance
  • BMI, Business Modeling & Integration Domain Task Force (DTF), previously known as BPMI
  • JCP, Java Community Process
  • ISACA, Information Systems Audit and Control Association
  • ISO, International Organization for Standardization
  • ITU-T, The Telecommunication Standardization Sector of the International Telecommunication Union (formerly known as CCITT)
  • OGC, Office of Government Commerce (UK)
  • OMG, Object Management Group
  • OSI, Open Source Initiative
  • PMI, Project Management Institute
  • SEI, Software Engineering Institute, Carnegie Mellon University
  • W3C, World Wide Web Consortium

