5.5 Architectural Linkage to Software Development

If we apply proper architectural principles to create and maintain software structure, the potential cost savings could be 50% or greater [Horowitz 1993]. Good software structure is a function of the overall application architecture, the software interfaces (what is called the computational architecture), and the implementation itself (Figure 5.7).

Figure 5.7. Computational Specification Links Architecture and Implementation

Computational interfaces may be the key enabler for improved software structure. Software interfaces specified in IDL define the boundaries between modules of software. If the software interfaces are coordinated architecturally, it is possible to define the boundaries for application programmers so that the intended structure of the application becomes its implemented structure. In practice, the authors have found that the specification of software interfaces is a real benefit to programmers because it gives them a guideline for implementing their software independently of other application developers. When developers share the same specification, their software can interoperate even though the applications are developed separately.

Figure 5.8 describes the overall process by which these results can be achieved. Starting with a set of enterprise requirements for a community of users, a business object analysis process can define the overall structure and characteristics of the application environment. Business object analysis is an object-oriented analysis in which end users, object-oriented modelers, and architects all participate in defining new information technology capabilities that satisfy both the needs of the business and the constraints of the technology implementation process. After the business object analysis has produced object models, there is a further step, a drill-down exercise, to define the common interface definitions. The common interface definitions are the software interfaces that define the actual internal software boundaries for the system. This is a drill-down exercise because these interfaces specify the actual operations and parameters that are passed throughout the software system.

Figure 5.8. Sample Architecture-Driven Process

The common interface definitions must be coordinated with individual software projects so that the appropriate lessons learned and legacy-migration considerations are incorporated into the designs. As the common interface definitions mature and are applied across multiple projects, they can become localized standards and profiles for the community of developers. They can provide useful information for new developers and for commercial vendors that may want to participate in the interoperability solutions. It is not sufficient for interface specifications to stand alone. One lesson that has been learned repeatedly is that no matter how precise a specification is, a definition of how applications use that specification is required to ensure interoperability. This requirement is equivalent to the profiling concept introduced in Chapter 3. Figure 5.9 shows how a set of horizontal and vertical specifications can be constrained by a profile so that application developers are much more likely to achieve interoperability between separate implementations. There is a distinct difference between specifications and profiles, and it needs to be incorporated into the software process.
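As an illustration of the kind of common interface definition such a drill-down can produce, the following minimal OMG IDL sketch defines a document-management boundary that several application teams could implement and invoke independently. The module, type, and operation names are illustrative assumptions, not definitions from the text.

```
// Hypothetical common interface definition expressed in OMG IDL.
module CommonFacilities {

  // Coarse-grained record passed across the software boundary.
  struct DocumentInfo {
    string id;         // identifier assigned by the providing service
    string title;      // human-readable title
    string mime_type;  // content type, for example "text/xml"
  };

  exception DocumentNotFound { string id; };

  // Any application implementing this interface can interoperate with
  // any client written against the same specification, even though the
  // two are developed separately.
  interface DocumentService {
    DocumentInfo describe(in string id) raises (DocumentNotFound);
    void store(in DocumentInfo info, in string contents);
    string retrieve(in string id) raises (DocumentNotFound);
  };
};
```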
A specification such as an IDL definition should be designed so that it can be reused across multiple applications or families of systems. The profile information, on the other hand, should correspond to specific applications and families of systems, so that the conventions can be specialized without compromising the reusability of the overall specification. Specifications can be standardized either locally within the organization or on a more global scale through organizations like the Object Management Group. Profiles, however, should remain fluid. Profiles at their best are documented developer agreements for how standard specifications are used in specific instances.

Figure 5.9. Interoperability Profile

Identifying the appropriate categories of specifications to be standardized is a challenge that many organizations never overcome. The process that has been applied repeatedly to achieve this purpose is shown in Figure 5.10. The problem for many individual software development projects and end users is understanding the need for commonality and how that need is distinguished from the actual design and architecture of specific applications. The same problem arises in identifying common data elements when commonality of the information architecture is desired. The first step in the process is to survey the available requirements, technologies, and other inputs through which stakeholders influence the selection of common functionality. Because a broadly based survey across the full scope of the enterprise is impossible, a smaller group of architects can convene and brainstorm candidate facilities for interface coordination.

Figure 5.10. Large-Scale Architecture-Driven Process

It is essential to abstract the selected facilities into an architectural block diagram that displays how some facilities play horizontal roles in relation to the others. It is also important to define a diagram abstraction in order to communicate the structure of an architecture of this scale to the multiple stakeholders in these deliberations. In Step 4, the individual facilities identified earlier are defined and documented as to their scope and basic functionality. This definition constrains the drill-down process that follows, which drives out the details of the interface definitions or data element definitions. In Step 5, a review process allows the various stakeholders in the architecture to verify that their needs are being met and to build consensus across the enterprise for funding the reusable assets that will be defined when the interfaces are created. Step 6 in the process is to slow the pace of architectural decision making and stabilize the architecture. After multiple review iterations of the architecture document among all the potential stakeholders, the document needs to be published. It is then appropriate to use this information in tutorials and to make sure that there is a thorough understanding across the developer community. Many organizations overlook this final step of communicating the architectural vision because, once approval is obtained, many architects assume that developers will be constrained by the organizational decision and that responsibility for understanding what has been documented can simply be transferred to individual developers.
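One way to build that understanding is to publish the profile agreements alongside the specifications they constrain. As a minimal sketch, assuming the hypothetical DocumentService interface shown earlier, a family-of-systems profile might be recorded directly in IDL as agreed constants and usage conventions; the names and limits below are illustrative assumptions, not from the text.

```
// Hypothetical interoperability profile for the DocumentService
// specification sketched earlier: documented developer agreements for
// how that reusable specification is used in one family of systems.
module ImagingFamilyProfile {

  // Agreement 1: DocumentInfo.mime_type is restricted to these values.
  const string MIME_XML  = "text/xml";
  const string MIME_TIFF = "image/tiff";

  // Agreement 2: DocumentInfo.id values are assigned by the archive
  // service and are treated by clients as opaque strings.

  // Agreement 3: retrieve() is used only for documents up to this
  // size; larger documents are exchanged by bulk transfer.
  const unsigned long MAX_RETRIEVE_BYTES = 10485760;
};
```

Because these agreements live outside the DocumentService specification itself, the specification remains reusable across applications while the conventions stay specific to this family of systems.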
There is a key distinction between what happens in Steps 1 through 6 and what happens in Step 7. In Steps 1 through 6, the design of the architecture is being deliberated, and there is open discussion of potential extensions and changes, particularly among individual architects who are highly cognizant of the design implications. In Step 7, the assumption is that the architecture has been stabilized and that individual elements of the architecture are no longer the subject of debate. It is not possible to disseminate the architecture properly if the perception is that the debate is continuing. This phenomenon is the downfall of some significant and otherwise well-conceived architectures.

Figure 5.11 shows the overall process for architectural migration. The migration process starts with some preexisting software, including legacy applications, commercial software, and possibly shareware or freeware. Mixed into this is the creation of new software that will implement many new capabilities within the target system. The architectural migration process is influenced by business needs and by the definition of the enterprise architecture that we described earlier, with a focus on the computational interfaces, which are the real keys to controlling software boundaries. After the target architecture is defined, there is a continuous process of migration.

Figure 5.11. System Architecture Migration

The process of migration may take many years to realize and may never truly be completed. The kind of migration that we recommend is sometimes called Chicken Little migration because it does not assume that on a specific date the legacy system will be shut down and the new system turned on, at potentially substantial risk to the organization if the new system is not perfect. In Chicken Little migration, the capabilities of the legacy system that already provide business value can be brought online in, or transferred to, the target system as it takes form. Figure 5.12 shows one of the key concepts in how the target system is realized by leveraging legacy applications. The legacy application will have one or more mechanisms for transferring information. At a minimum, a legacy system maintains some files on disk or perhaps a database, and the legacy application may have more than that; for example, it may have some documented application program interfaces or other types of command-line interfaces.

Figure 5.12. Legacy Object Wrapping Approach

Legacy applications and the majority of commercial software have the same kinds of mechanisms available for the purpose of integration. The authors' experience with object-oriented integration points to a different set of mechanisms for virtually every legacy and commercial package encountered. The purpose of the object wrapper is to map from the preexisting interfaces to the target architectural interfaces, which may be more coarsely grained and expressed in OMG IDL or XML-based WSDL. In addition to providing a direct functional mapping, the object wrapper should take on capabilities of the target architecture. For example, a distributed-object architecture typically has one or more directory services that enable applications to dynamically discover other applications in the environment without hard-wired, point-to-point integration programming. Support for metadata directory services is one of the new functions that the object wrapper can provide.
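To illustrate the mapping, the following minimal sketch shows a coarse-grained target interface, expressed in OMG IDL, that an object wrapper might expose on behalf of a legacy records application. The names are hypothetical; behind this interface, the wrapper implementation would translate each operation into the legacy system's file, database, or command-line mechanisms and register the service with the environment's directory service so that other applications can discover it dynamically.

```
// Hypothetical coarse-grained wrapper interface in OMG IDL.
// The wrapper implementation behind this boundary adapts the legacy
// application's files, database, APIs, or command-line interface and
// registers itself with the target architecture's directory service.
module LegacyIntegration {

  typedef sequence<string> RecordList;

  exception LegacyUnavailable { string reason; };

  interface RecordArchive {
    // One coarse-grained call replaces a series of fine-grained
    // legacy file reads or command-line invocations.
    RecordList find_records(in string query) raises (LegacyUnavailable);

    // Self-describing metadata, supporting the metadata and directory
    // services of the target architecture.
    string describe();
  };
};
```

Keeping the interface coarse grained reduces the number of distributed calls that cross the wrapper boundary, which is one motivation for defining the target interfaces at a coarser granularity than the preexisting legacy mechanisms.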
Other kinds of functions in the wrapper include support for security, system management, and data interchange.

Object-oriented technology enables the creation of significant applications. Through survey research, we have discovered some of the key challenges in the migration to object technology. Overall, the key challenge is the difficulty an organization faces in establishing the enterprise system architecture. Several noteworthy quotes were provided by anonymous sources at the front lines: "People start in the middle of the software process, immediately begin development without doing their homework, and proceed until failure with no vision, no business process, and an incomplete architecture." Another challenge is the management of the object-oriented process, which differs in some fundamental ways from how software processes in previous paradigms were managed: "People are solving tomorrow's problems with today's technology and yesterday's methodology." Another frequently encountered challenge was the difficulty of sustaining the architecture during the development and maintenance phases of the software life cycle, once the enterprise architecture had been established. Common laments included: "It is easier to scrap and start over rather than to figure out what they did" and "Requirements evolve during implementation, leading to hack design." Other types of challenges were perceived as smaller obstacles than one might expect. For example, compared to architectural and management issues, technology requirements were accorded a fairly low priority in the migration to object technology.