1 The history of object-oriented analysis and design methods

The development of computer science as a whole proceeded from an initial concern with programming alone, through increasing interest in design, and only latterly to a concern with analysis methods. Reflecting this, perhaps, interest in object-orientation also began historically with language developments. It was only in the 1980s that object-oriented design methods emerged; object-oriented analysis methods followed during the 1990s.

Apart from a few fairly obscure AI applications, up until the 1980s object-orientation was largely associated with the development of graphical user interfaces (GUIs), and few other applications became widely known. Until this period almost nothing had been said about analysis or design for object-oriented systems. In the 1980s Grady Booch published a paper on how to design for Ada, but gave it the prophetic title Object-Oriented Design. By 1991 Booch had extended his ideas into a genuinely object-oriented design method in his book of the same title, revised in 1993 (Booch, 1994) [sic].

With the 1990s came both increased pressure on business and the availability of cheaper, much more powerful computers. This led to a ripening of the field and to a range of applications beyond GUIs and AI. Distributed, open computing became both possible and important, and object technology was the basis of much development, especially with the appearance of n-tier client-server systems and the web, although relational databases played, and continue to play, an important rôle. The new applications and better hardware meant that mainstream organizations adopted object-oriented programming and now wanted proper attention paid to object-oriented design and, subsequently, analysis. Concern shifted from design to analysis from the start of the 1990s; an object-oriented approach to requirements engineering had to wait even longer.

The first book with the title Object-Oriented Systems Analysis was produced by Shlaer and Mellor in 1988. Like Booch's original paper, it did not present a genuinely object-oriented method but concentrated entirely on the exposition of extended entity-relationship models, based on an essentially relational view of the problem and ignoring the behavioural aspects of objects. Shlaer and Mellor published a second volume in 1992 which argued that behaviour should be modelled using conventional state-transition diagrams, and which laid the basis of a genuinely OO, albeit data-driven, approach that was to remain influential through its idea of 'translational' modelling, which we will discuss.

Meanwhile, Peter Coad had incorporated behavioural ideas into a simple but object-oriented method (Coad and Yourdon, 1990; 1991). Coad's method was immediately popular because of its simplicity, and Booch's because of its direct and comprehensive support for the features of C++, the most popular object-oriented programming language of the period in the commercial world. There followed an explosion of interest in, and publication on, object-oriented analysis and design. Apart from those already mentioned, among the most significant methods were OMT (Rumbaugh et al., 1991), Martin-Odell (1992), OOSE (Jacobson et al., 1992) and RDD (Wirfs-Brock et al., 1990). OMT was another data-driven method, rooted as it was in relational database design, but it quickly became the dominant approach precisely because what most programmers were forced to do at the time was write C++ programs that talked to relational databases.

OMT (Rumbaugh et al., 1991) copied Coad's approach of adding operations to entity-type descriptions to make class models, but used a notation different from all the previous methods. Not only was OMT thoroughly data-driven, it also separated processes from data, using data flow diagrams alongside the class diagrams. However, it emphasized what Coad had only hinted at and Shlaer and Mellor were yet to publish: the use of state-transition diagrams to describe the life cycles of instances. It also made a few remarks about the micro development process and offered very useful advice on how to connect object-oriented programs with relational databases. Just as Booch had become popular with C++ programmers because of its ability to model the semantic constructs of that language precisely, so OMT became popular with developers for whom C++ and a relational database were the primary tools.

Two of OMT's chief creators, Blaha and Premerlani (1998), confirm this with the words: 'The OMT object model is essentially an extended Entity-Relationship approach' (p. 10). They go on to say, in their presentation of the second-generation version of OMT, that the 'UML authors are addressing programming applications; we are addressing database applications'. Writing in the preface to the same volume, Rumbaugh even makes a virtue of the relational character of OMT. We feel, however, that a stricter adherence to object-oriented principles and to a responsibility-driven approach is a necessity if the full benefits of the object-oriented metaphor are to be obtained in the context of a fully object-oriented tool-set.

In parallel with the rise of the extended entity-relationship and data-driven methods, Wirfs-Brock and her colleagues were developing a set of responsibility-driven design (RDD) techniques out of experience gained more in the world of Smalltalk than that of the relational database. The most important contributions of RDD were the extension of the idea of using so-called CRC cards for design and, later, the introduction of the idea of stereotypes. CRC cards showed Classes with their Responsibilities and Collaborations with other objects as a starting point for design. These could then be shuffled and responsibilities reallocated in design workshops. The idea had originated from the work of Beck and Cunningham at Tektronix, where the cards were implemented using a hypertext system. Moving to physical pieces of cardboard enhanced the technique by allowing designers to anthropomorphize their classes and even consider acting out their life cycles.
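The CRC idea is simple enough to sketch as a data structure. The Python below is purely illustrative (CRC cards were physical index cards, not code), and the class names and card contents are invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class CRCCard:
    """A Class-Responsibility-Collaboration card, as shuffled in design workshops."""
    name: str
    responsibilities: list = field(default_factory=list)
    collaborators: list = field(default_factory=list)

# A designer fills in one card per candidate class.
order = CRCCard("Order",
                responsibilities=["compute total", "track line items"],
                collaborators=["Customer", "Product"])
customer = CRCCard("Customer",
                   responsibilities=["maintain contact details"],
                   collaborators=["Order"])

# Reallocating a responsibility during a workshop is just moving it between cards.
customer.responsibilities.append(order.responsibilities.pop(0))
```

The point of the physical technique was exactly this cheapness of reallocation: moving a line of text from one card to another is a design decision made visible.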

Objectory was a proprietary method that had been around much longer than most object-oriented methods. It originated in the Swedish telecommunications industry and emerged in its object-oriented guise when Jacobson et al. (1992) published part of it (OOSE) in book form. The major contribution of this method was the idea that analysis should start with use cases rather than with a class model. The classes were then to be derived from the use cases. The technique marked an important step forward in object-oriented analysis and has been widely adopted, although it is possible to make some fairly severe criticisms of it. Objectory was the first OO method to include a bona fide, although partial, development process.

OBA (Object Behaviour Analysis) originated from Smalltalk-dominated work at ParcPlace and also included a process model that was never fully published although some information was made available (Goldberg and Rubin, 1995; Rubin and Goldberg, 1992). One interesting feature of OBA was the use of stereotypical scripts in place of use cases.

Coming from the Eiffel tradition, Waldén and Nerson's (1995) BON (Business Object Notation) emphasized seamlessness and hinted at a proto-process. However, this approach (and indeed its very seamlessness) depended on the adoption of Eiffel as a specification language throughout the process. It made important contributions to the rigour of object-oriented analysis, as did Cook and Daniels' (1994) Syntropy. BON improved rigour using the Eiffel idea of class invariants, while Syntropy did this and further emphasized state machines.
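A class invariant in the Eiffel sense can be emulated even in languages without direct support. The sketch below is a minimal Python illustration with an invented Account class; where Eiffel checks invariants automatically around every exported feature, here the checks are written out by hand:

```python
class Account:
    """A bank account whose class invariant (balance >= 0) is checked
    after every state-changing operation, mimicking an Eiffel invariant clause."""

    def __init__(self, balance=0):
        self.balance = balance
        self._check_invariant()

    def _check_invariant(self):
        # The invariant must hold in every observable state of the object.
        assert self.balance >= 0, "invariant violated: negative balance"

    def deposit(self, amount):
        self.balance += amount
        self._check_invariant()

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        self._check_invariant()
```

The invariant says something no individual operation signature can: a property of the type that holds across its whole life cycle, which is why BON and Syntropy treated invariants as part of the specification rather than the implementation.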

MOSES (Henderson-Sellers and Edwards, 1994) was the first OO method to include a full-blown development process, a metrics suite and an approach to reuse management. SOMA (Graham, 1995), which appeared in its mature form roughly contemporaneously with MOSES and was influenced by it, also included all these features, as well as attempting to fuse the best aspects of all the methods published to date and to go beyond them, especially in the areas of requirements engineering, process, agent-based systems and rigour.

In 1994 there were over 72 methods or fragments of methods. The OO community soon realized that this situation was untenable if the technology was to be used commercially on any significant scale. They also realized that most of the methods overlapped considerably. Therefore, various initiatives were launched aimed at merging and standardizing methods.

Thus far, to the eyes of the developer, there appeared a veritable soup of object-oriented analysis and design methods and notations. An obvious development was to try to introduce some kind of unification, and the Fusion method (Coleman et al., 1994; Malan et al., 1996) represents one of the first attempts to combine good techniques from other published methods, although some commentators have viewed the resulting collection of techniques as poorly integrated. There is a process associated with Fusion, although published descriptions of it appear incomplete compared to the proprietary versions sold by Hewlett-Packard. The modern object-oriented developer had to find a way to pick out the noodles from this rich soup of techniques. Because of this, and because there were many similarities between methods, it began to be felt by most methodologists that some sort of convergence was in order.

The OPEN Consortium was an informal group of about 30 methodologists, with no common commercial affiliation, who wanted to see greater method integration but felt strongly that methods should include a complete process, should be in the public domain, should not be tied to particular tools and should focus strongly on scientific integrity as well as pragmatic issues. The founding members of OPEN were Brian Henderson-Sellers and myself, who began by integrating the MOSES and SOMA process models. The result was published as Graham et al. (1997b). We were soon joined by Don Firesmith, who started work on an integrated notation (OML) with the aim of exhibiting a purer object-oriented character than the OMT-influenced UML, and one that would be easier to learn and remember (Firesmith et al., 1997).

Jim Rumbaugh left GE to join Grady Booch at Rational Inc., and the two merged their notations into what became the first version of UML (Booch et al., 1999). Later they were joined by Ivar Jacobson, who added elements of his Objectory notation and began the development of the Rational Unified Process (RUP). UML was submitted to the OMG for standardization, and many other methodologists contributed ideas, although the CASE tool vendors have generally resisted both innovations that would cause them to rewrite their tools and simplifications that would make their tools less useful. The OPEN Consortium proposed the semantically richer OML, which was largely ignored despite many good ideas, probably due in large part to its over-complexity (Firesmith et al., 1997).

Real-time elements were added to UML based on the ROOM method (Selic et al., 1994), and a formal constraint language, OCL, heavily influenced by Syntropy (Cook and Daniels, 1994), was introduced. A notation for multiple interfaces to classes was based on Microsoft's work on COM+. Activity diagrams for process modelling were based on the Martin-Odell method. The idea of stereotypes adopted in UML was based on ideas proposed by Rebecca Wirfs-Brock (though much mangled in the first realizations of the standard). The struggle to improve UML continues, and we will therefore not assume a completely fixed character for it in this text.

Thus were issues of notation largely settled by the end of the 1990s, shifting the emphasis to innovation in the field of method and process. Among the most significant contributions to analysis and design methodology following the renaissance of UML was Catalysis (D'Souza and Wills, 1999), which was the first method to contain specific techniques for component-based development along with coherent guidance on how the UML should be used. Our own work showed that objects could be regarded as intelligent agents if rulesets were added to type specifications. This generalized the insistence in other methods (notably BON, Syntropy and Catalysis) that invariants were needed to specify types fully.
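The ruleset idea can be sketched roughly as follows. The Trader class, its rule and its alert mechanism are invented for illustration and are not SOMA's actual notation; the point is only that rules attached to the type, evaluated alongside an invariant, let an instance react to its own state in a simple agent-like way:

```python
class Trader:
    """A type specification extended with a ruleset: the invariant constrains
    state (exposure never exceeds the limit), while rules let the instance
    respond to state, which is what gives it an agent-like character."""

    def __init__(self, limit):
        self.limit = limit
        self.exposure = 0
        self.alerts = []
        # The ruleset: (condition, action) pairs evaluated after each update.
        self.ruleset = [
            (lambda t: t.exposure > 0.8 * t.limit,
             lambda t: t.alerts.append("near limit")),
        ]

    def book_trade(self, amount):
        self.exposure += amount
        # The invariant, checked as in the methods discussed above.
        assert self.exposure <= self.limit, "invariant violated: over limit"
        # The ruleset, fired after every state change.
        for condition, action in self.ruleset:
            if condition(self):
                action(self)
```

An invariant alone can only forbid states; a ruleset can also prescribe behaviour in response to them, which is the sense in which it generalizes the invariant.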

Figure 1 shows the relationships between several object-oriented methods, languages and notations discussed in this tutorial. See Appendix B for a discussion of these methods and others.

Figure 1 Some of the influences on UML.