                            eServices Capability Model (eSCM)
                                Annotated Bibliography
                            Bennet Kumar, Vivek Mahendra, Elaine Hyder,
                              Elise Nawrocki, K. Madhu, Rajesh Gupta
                                           August 2001
                                         CMU-CS-01-125

                                        School of Computer Science
                                        Carnegie Mellon University
                                        Pittsburgh, PA 15213-3890

                                                  Abstract
The eServices Capability Model is being developed to enable IT-enabled service providers to appraise
and improve their capability to provide consistently high-quality services in the Internet economy. The
framework for this model will enable service providers to establish and manage continually improving
relationships with their clients. The primary focus of existing quality models is only on the execution of a
contract. The eServices Capability Model not only addresses the contract activities related to the design
and development of an outsourced service but also asserts that successful outsourcing necessitates a focus on
(1) the activities leading to the formation of outsourcing relationships and (2) the transitioning or
termination of outsourced services. The research used to design this model is represented in this
Annotated Bibliography and is organized into the following sections, listed alphabetically: Models and
Assessment Methods (Articles and Reports [1-13], CMM-Related [14-31], and Other [32-45]);
Outsourcing [46-164]; Outsourcing (Application Service Provider [165-177], Data Capture, Integration
and Analysis Services [178-179], Engineering Services [180-197], Human Resource Services [198-211],
Multimedia and Animation Services [212-218], Remote Customer Interaction [219-225], Transcription
Services [226-240], and Call Center [241]); Standards [242-256]; Standards, Articles and Reports [257-
272]; Strategic Alliances [273-280]; and Other [281-295]. Entries are ordered alphabetically by author’s
last name. Where appropriate, copies of papers are available from: Dr. Jane Siegel, School of Computer
Science, Carnegie Mellon University, 3603 Newell Simon Hall, 5000 Forbes Avenue, Pittsburgh, PA
15213, or via e-mail at jals@cs.cmu.edu.
This research is supported by Satyam Infoway Ltd. (SIFY), the leading e-business solutions and Internet
infrastructure provider in India, and its subsidiary Satyam Serwiz.com Limited.
The views and conclusions contained in this document are those of the authors and should not be
interpreted as representing the official policies, either expressed or implied, of Satyam Infoway Ltd. or
Satyam Serwiz.com Limited.

© 2001 by Carnegie Mellon University. All rights reserved.
Keywords: Application Service Provider, Data Capture, Data Integration, Data Analysis Services,
Engineering Services, Human Resource Services, Multimedia and Animation Services, Remote Customer
Interaction, Transcription Services, Call Center Services, Strategic Alliances, Quality, Standards, CMM,
Models and Assessment Methods, Outsourcing.

MODELS AND ASSESSMENT METHODS
     Articles and Reports

[1] Das, A., Soh, C.W.L., and Lee, P.C.B. (1999). A Model of Customer Satisfaction with
     Information Technology Service Providers: An Empirical Study. Academy of Management
     Review: 190-193.
             The concept of customer satisfaction is gaining importance in the information technology
     (IT) industry because organizations have increasingly outsourced their operations and
     development activities to IT service providers. The service providers are concerned with
     improving the level of customer satisfaction for two reasons. First, it contributes to overall
     systems success in the customer organization. Second, it helps the service providers to achieve
      the goal of retaining and growing their businesses. Drawing on the resource-based view of the
      firm, which suggests that employees are a strategic resource in differentiating one service
     provider from another, the authors hypothesize that the quality of service is dependent on
     employee satisfaction and employee contextual knowledge. This study is conducted in
     collaboration with a large IT service provider. The study is divided into two stages. The first
     stage has already been completed. It involved the collection of data on customer satisfaction,
     service quality, solution quality, and price. Data were collected via a questionnaire survey
     distributed to customers of the service provider. The authors received 430 returns and are now in
      the process of analyzing the data. In the second stage of their study, the authors will collect data
      on employee satisfaction, employee tenure, and employee experience. Employee tenure and
      employee experience are surrogates for employee contextual knowledge. After all the data have
      been collected, the authors will use structural equation modeling to test the hypotheses proposed
      in this study.

[2] Herbsleb, J., Carleton, A., Rozum, J., Siegel, J., and Zubrow, D. (1994). Benefits of CMM-Based
     Software Process Improvement: Initial Results. Pittsburgh, PA. Software Engineering Institute:
     64 pgs.
             Data from 13 organizations were collected and analyzed to obtain information on the
     results of CMM-based software process improvement efforts. The authors report the cost and
     business value of improvement efforts, as well as the yearly improvement in productivity, early
     defect detection, time to market, and post-release defect reports. Improvement efforts and results
     in five organizations are reported in more depth in case studies. In addition, technical issues that
     were confronted as the authors tried to measure the results of software process improvement are
     discussed. The paper ends with conclusions about the results of SPI efforts.

[3] Humphrey, W. S. (June 1992). Introduction to Software Process Improvement. Pittsburgh, PA.
     Software Engineering Institute. CMU/SEI-92-TR-7, ESD-TR-92-007. 36 pgs.
            While software now pervades most facets of modern life, its historical problems have not
     been solved. This report explains why some of the problems have been so difficult for
     organizations to address and the actions required to address them. It describes the Software
     Engineering Institute’s (SEI) software process maturity model, how this model can be used to
     guide software organizations in process improvement, and the various assessment and evaluation
     methods that use this model. The report concludes with a discussion of improvement experience
     and some comments on future directions for this work.

[4] Kitson, D. H. An Emerging International Standard for Software Process Assessment. 8 pgs.
             In June 1993, an international effort was chartered by ISO/IEC JTC1 to develop a
     standard for software process assessment. An important goal of this effort is to harmonize
     existing process assessment approaches. One of the risks in fielding such an international
     standard is that the appearance of a new and potentially incongruous approach to assessment and
     improvement could undermine or demotivate continued investment in process improvement by
     the software acquisition and software supplier communities. If the prospective standard can
     provide an integrating framework while still advancing the state of practice in software process
     assessment, it will be a very significant and positive accomplishment. This paper describes the
     current state of this standardization effort and discusses challenges which must be overcome.

[5] Kitson, D. H., and Kitson, L.J. (1998). ISO/IEC 15504 - Overview and Status.
            Briefing objectives include: promote awareness and understanding of a key emerging
    international software standard, identify the potential risks and benefits to the software
    community, explain how it relates to key CMM®-related products, and disseminate information
    on recent developments.

[6] Minnich, I. (1996). ISO 9001 and the SE-CMM. La Mirada, CA, SECAT LLC: 8 pgs.
    http://www.csz.com/secat
            This paper provides a top-level summary of the comparison between the Systems
    Engineering Capability Maturity Model (SE-CMM) and ISO 9001, the international standard on
    Quality Systems – Model for Quality Assurance in Design, Development, Production,
     Installation, and Servicing. People from many organizations have asked for a characterization of
     the overlap between the two documents. Questions such as "If we perform systems engineering
    according to the SE-CMM, or at an SE-CMM level 'x', are we ISO compliant?" are of interest to
    organizations that are trying to implement both concepts concurrently. The answer, of course, is
    "It depends." But the bottom line is that an organization can achieve a peaceful coexistence
    between the SE-CMM and ISO 9001 by planning ahead, looking at the requirements imposed by
    each document, and folding the results into the organization's way of doing business.

[7] Paulk, M. C. (1995). How ISO 9001 Compares with the CMM. IEEE Software: 74-83.
            Organizations concerned with ISO 9001 certification often question its overlap with the
     Software Engineering Institute’s Capability Maturity Model. The author looks at 20 clauses in
     ISO 9001 and maps them to practices in the CMM. The analysis provides answers to some
     common questions about the two documents.

[8] Software Engineering Institute (2000). The Evidence for CMM®-based Software Process
     Improvement. Pittsburgh, PA. Software Engineering Institute.
             Slide presentation includes the following: The Community Maturity Profile (adoption of
      CMM-based Software Process Improvement (SPI) and high-level results); Impacts of Software
      Process Assessments (what happens after the assessment); Project Management Processes (a
      major barrier for process improvement); and Benefits of CMM-based SPI.

[9] Software Engineering Institute. SEMA Appraisal Submittal Packet.
     http://www.sei.cmu.edu/activities/sema/packet.html
              The Appraisal Submittal Packet includes the Process Appraisal Information System
      (PAIS) Record of Entry form, the Organization Questionnaire (OQ), and the Project
      Questionnaire (PQ). The PAIS Record of Entry form is designed to assist in the submittal of the
      appraisal artifacts. The Organization and Project Questionnaires are designed: (a) to assist an
      assessment team in pre-on-site activities, such as identifying software groups/projects within an
      organization (OQ) and identifying which group/projects to appraise (PQ), and (b) to collect data
      for customers to enable them to see the aggregated software community profile reports.

[10] Software Engineering Institute, Measurement and Analysis Team (2000). Process Maturity
     Profile of the Software Community 1999 Year End Update. Pittsburgh, PA. Software
     Engineering Institute.
              This briefing uses information from reports of CMM®-Based Appraisals for Internal
     Process Improvement (CBA IPIs) and Software Process Assessments (SPAs). This briefing
     includes three primary sections: Current Status (snapshot of the software community based on
     the most recent assessment, since 1995, of reporting organizations), Community Trends (global
     distribution of assessments, growth in the number of assessments performed, and shifts in the
     maturity profile over time), and Organizational Trends (analysis of Key Process Area (KPA)
     satisfaction, and time to move up in maturity).

[11] Software Engineering Institute, Measurement and Analysis Team (2000). Process Maturity
     Profile of the Software Community 2000 Mid-Year Update. Pittsburgh, PA. Software
     Engineering Institute.
              This briefing uses information from reports of CMM®-Based Appraisals for Internal
     Process Improvement (CBA IPIs) and Software Process Assessments (SPAs). This briefing
     includes three primary sections: Current Status (snapshot of the software community based on
     the most recent assessment, since 1996, of reporting organizations), Community Trends (global
     distribution of assessments, growth in the number of assessments performed, and shifts in the
     maturity profile over time), and Organizational Trends (analysis of Key Process Area (KPA)
     satisfaction, and time to move up in maturity).

[12] SPICE Team (1998). Phase 2 Trials Interim Report, Approved for Public Release, Version 1.00:
     174 pgs.
     http://www-sqi.cit.gu.edu.au/spice/
     http://www.iese.fhg.de/SPICE
             This report details the interim findings of the second phase of empirical trials conducted
     as part of the SPICE Project. The project was initiated by the International Standards group for
     Software Engineering, ISO/IEC JTC1/SC7, to develop a standard for software process
     assessment. The project is undertaking a set of trials to validate the emerging standard against the
     goals and requirements defined at the start of the SPICE Project and to verify the consistency and
     usability of its component parts. The project aims to test the emerging standard across a
     representative sample of organizations for differing scenarios of use to obtain rapid feedback and
     to allow refinement prior to publication as a full International Standard. The trials should
      determine whether the emerging standard satisfies the needs of its prospective users.
      Furthermore, the trials are intended to provide guidance on applying the emerging standard. Such
     an exercise is unprecedented in the software engineering standards community and provides a
     unique opportunity for empirical validation. An international trials team was established to plan
     and organize the trials and analyze the results. Trials are being structured into phases, each with
     its own objectives and scope. An appropriate organization and infrastructure (including
     procedures and data collection mechanisms) are established to support each phase, from
     selection and conduct of trials through to analysis of trials data and reporting of results. The
     original trials plan organized the SPICE Trials into three broad phases. The first phase is
     completed. This project is currently in the second phase. Data collection for the third phase is
     planned to start in September 1998.

[13] Tingey, M. O. (1997). Comparing ISO 9000, Malcolm Baldrige and the SEI CMM for Software,
     Prentice Hall PTR: Upper Saddle River, NJ.
            This book compares three quality management systems (QMS) assessment methods:
     Malcolm Baldrige National Quality Award, ISO 9001, and the Software Engineering Institute
     (SEI) Capability Maturity Model for Software (CMM). This book also establishes a framework
      from which to compare QMS assessment methodologies in general. This framework helps readers
      determine which methodology is best suited to their organization's QMS. It also provides a
      cross-reference among the various methodologies for specific aspects of a QMS and an overview and detailed
     analysis of each methodology. Further, it presents a complete translation of all assessment
     methodology requirements into statements of activity. The book is divided into 5 parts. Part 1
     provides an introduction and a backdrop to better understand the comparison. Part 2 provides an
     overview for each of the 3 QMS assessment methodologies. Part 3 is the core part of the book,
     and compares the three methodologies. Part 4 provides the framework used for comparing the
     QMS methodologies. Part 5 provides the detailed requirements of the 3 models.

MODELS AND ASSESSMENT METHODS
     CMM-Related Documents

[14] Byrnes, P., and Phillips, M. (1996). Software Capability Evaluation Version 3.0 Method
     Description. Pittsburgh, PA. Software Engineering Institute. Technical Report CMU/SEI-96-
     TR-002, ESC-TR-96-002. 192 pgs.
            This report describes Version 3.0 of the Software Capability Evaluation (SCE) Method.
     SCE is a method for evaluating the software process of an organization to gain insight into its
     process capability. This version of the SCE Method is based on the Capability Maturity Model
     (CMM) defined in Capability Maturity Model for Software, Version 1.1 [Paulk 93a]. It is
     compliant with the CMM Appraisal Framework (CAF) [Masters 95]. This document is an update
     to SCE Version 2.0 [CBA Project 94].

[15] CMMI Product Development Team (2000). CMMI(sm)-SE/SW, V1.0 Capability Maturity
     Model® – Integrated for Systems Engineering/Software Engineering, Version 1.0 - Continuous
     Representation. Pittsburgh, PA. Software Engineering Institute: 618 pgs.
            The CMM Integration project was formed to sort out the problem of using multiple
     CMMs. The project’s mission was to combine three source models—(1) Capability Maturity
     Model for Software (SW-CMM®) v2.0 draft C, (2) Electronic Industries Alliance/Interim
     Standard (EIA/IS) 731, and (3) Integrated Product Development Capability Maturity Model
     (IPD-CMM) v0.98—into a single model for use by organizations pursuing enterprise-wide
     process improvement.

[16] CMMI Product Development Team (2000). CMMI(sm)-SE/SW, V1.0 Capability Maturity
     Model® – Integrated for Systems Engineering/Software Engineering, Version 1.0 - Staged
     Representation. Pittsburgh, PA. Software Engineering Institute: 602 pgs.
            The CMM Integration project was formed to sort out the problem of using multiple
     CMMs. The project’s mission was to combine three source models—(1) Capability Maturity
      Model for Software (SW-CMM®) v2.0 draft C, (2) Electronic Industries Alliance/Interim
     Standard (EIA/IS) 731, and (3) Integrated Product Development Capability Maturity Model
     (IPD-CMM) v0.98—into a single model for use by organizations pursuing enterprise-wide
     process improvement.

[17] CMMI Product Development Team (2000). ARC, V1.0 Assessment Requirements for CMMI,
     Version 1.0. Pittsburgh, PA. Software Engineering Institute: 47 pgs.
     http://www.sei.cmu.edu/cmmi/
             The Assessment Requirements for CMMI (ARC) V1.0 defines the requirements
     considered essential to assessment methods intended for use with CMMI models. In addition, a
     set of assessment classes is defined based on assessment usage scenarios. These classes are
     intended primarily for developers of assessment methods to use with CMMI capability models in
     the context of the CMMI Product Suite. Additional audiences for the document include lead
     assessors, and other individuals who are involved in or may be interested in process assessment
     or improvement. The approach employed to provide guidance to assessment method developers
     is to define a class of assessment method usage scenarios (which are based on years of
     experience in the process improvement community) called assessment classes. Requirements are
      then allocated to each class as appropriate based on the attributes associated with that class.
      Thus, a particular assessment method may declare itself to be an ARC class A, B, or C
     assessment method. This designation implies the sets of ARC requirements which the method
     developer has considered when designing the method. Assessment methods which satisfy all of
     the ARC requirements are called class A methods; in addition to being used to render ratings for
     benchmarking purposes, class A assessment methods can be used to conduct 15504-conformant
     assessments.
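
      As an illustrative aside (not drawn from the ARC document itself), the allocation of
      requirements to assessment classes described above can be pictured as a simple mapping from
      classes to requirement sets; the requirement identifiers in this Python sketch are invented for
      the example.

```python
from enum import Enum

# Illustrative sketch of ARC-style assessment classes. The requirement IDs
# are hypothetical; only the full requirement set corresponds to class A.

class AssessmentClass(Enum):
    A = "satisfies all ARC requirements; ratings usable for benchmarking"
    B = "satisfies a reduced requirement set"
    C = "satisfies a minimal requirement set"

# Hypothetical allocation of requirements to classes, most to least demanding.
REQUIREMENTS = {
    AssessmentClass.A: {"R1", "R2", "R3", "R4"},
    AssessmentClass.B: {"R1", "R2", "R3"},
    AssessmentClass.C: {"R1", "R2"},
}

def satisfies(method_requirements: set, cls: AssessmentClass) -> bool:
    """A method may declare itself class A, B, or C if it meets that class's set."""
    return REQUIREMENTS[cls] <= method_requirements

# A method meeting every requirement qualifies as class A (and hence B and C).
assert satisfies({"R1", "R2", "R3", "R4"}, AssessmentClass.A)
assert satisfies({"R1", "R2"}, AssessmentClass.C)
```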

[18] CMMI Product Development Team. (October 2000). SCAMPI - Standard CMMI Assessment
     Method for Process Improvement. Pittsburgh, PA. Software Engineering Institute. CMU/SEI-
     2000-TR-009, ESC-TR-2000-009. 86 pgs.
            This document describes the Standard CMMI Assessment Method for Process
     Improvement (SCAMPI). This document explains the role of assessments in the context of the
     IDEAL (Initiating, Diagnosing, Establishing, Acting, Leveraging) approach to software process
     improvement. The SCAMPI method is based on the CMM-Based Appraisal for Internal Process
     Improvement (CBA IPI) V1.1 assessment method [Dunaway 96b] and the Electronic Industries
     Alliance/Interim Standard (EIA/IS) 731.2 Appraisal Method [EIA 98b]. SCAMPI satisfies the
      Assessment Requirements for CMMI (ARC) V1.0 [SEI 00a] and is a Class A assessment
      method. This method helps an organization gain insight into its process capability or organizational
     maturity by identifying strengths and weaknesses of its current processes relative to one or more
     of the CMMI models. Guidelines are provided for establishing resource requirements for
     conducting a SCAMPI assessment.

[19] Cooper, J., Fisher, M., and Sherer, S.W. (Eds.) (1999). Software Acquisition Capability Maturity
     Model (SA-CMM) Version 1.02. Pittsburgh, PA. Software Engineering Institute: 168 pgs.
              Government and industry have the need to improve the maturity of their internal software
     acquisition processes. In order for organizations to make improvements, they must know the
     ultimate goal and what is required to achieve that goal. Additionally, progress toward achieving
     the goal must be measurable. A capability maturity model provides the framework needed to
     facilitate the desired improvement. The Software Acquisition Capability Maturity Model (SA-
     CMM) has been developed to provide such a framework. This new version incorporates change
     requests that have been received, as well as the results of lessons learned from conducting
     appraisals and from the use of Version 1.01.

[20] Curtis, B., Hefley, W.E., and Miller, S. (1995). Overview of the People Capability Maturity
     Model. Pittsburgh, PA. Software Engineering Institute: 77 pgs.
             This document provides an overview and an introduction to the People Capability
     Maturity Model (P-CMM) [Curtis95]. Specifically, this document defines the concepts necessary
     to understand the P-CMM and the motivation and purpose behind the P-CMM. This overview
     describes the P-CMM structural components, consisting of key process areas within the five
     maturity levels of the P-CMM, and the principles that underlie each of the maturity levels.
     Finally, the document addresses potential uses of the P-CMM in assessing organizational
     practice or guiding improvement of an organization’s workforce capability.

[21] Curtis, B., Hefley, W.E., and Miller, S. (1995). People Capability Maturity Model. Pittsburgh,
     PA. Software Engineering Institute: 444 pgs.
             To provide guidance to organizations that want to improve the way they address these
      people-related issues, the SEI has developed the People Capability Maturity Model(sm)
      (P-CMM(sm)). The P-CMM is a maturity framework, patterned after the structure of the CMM, that
     focuses on continuously improving the management and development of the human assets of a
     software or information systems organization. The P-CMM provides guidance on how to
     continuously improve the ability of software organizations to attract, develop, motivate,
     organize, and retain the talent needed to steadily improve their software development capability.

[22] Dunaway, D. K., Seow, M.L., and Baker, M. (April 2000). Analysis of Lead Assessor Feedback
     for CBA IPI Assessments Conducted July 1998 - October 1999. Pittsburgh, PA. Software
     Engineering Institute. CMU/SEI-2000-TR-005, ESC-TR-2000-005. 45 pgs.
            This document consolidates and analyzes information from Lead Assessor Requirements
     Checklists that were submitted by Lead Assessors in assessments conducted using the Capability
      Maturity Model-Based Appraisal for Internal Process Improvement (CBA IPI) method. A total
     of 83 Lead Assessor Requirements Checklists were completed and submitted between July 1998
      and October 1999. This document is organized based on the format of the Lead Assessor
      Requirements Checklists, which are grouped into four major sections: (a) planning the
      assessment, (b) conducting the assessment, (c) reporting results, and (d) additional questions.
      The findings for each of these major sections are presented in separate chapters, and within each
      chapter an analysis of the results is presented for each question that is significant or meaningful.

[23] Dunaway, D. K., and Masters, S. (1996). CMM(sm)-Based Appraisal for Internal Process
     Improvement (CBA IPI): Method Description. Pittsburgh, PA. Software Engineering Institute:
     57 pgs.
               This document is a high-level overview of the CMM(sm)-Based Appraisal for Internal
      Process Improvement (CBA IPI) V1.1 assessment method. It provides a brief history of SEI
      appraisal methods, as well as establishing appraisals in the context of the IDEAL(sm) approach to
     software process improvement. CBA IPI is a diagnostic tool that supports, enables, and
     encourages an organization’s commitment to process improvement. The method helps an
     organization gain insight into its software development capability by identifying strengths and
     weaknesses of its current processes related to the Capability Maturity Model(sm) for Software
     V1.1. The method focuses on identifying software improvements that are most beneficial, given
     an organization’s business goals and current maturity level. Brief descriptions of the method
     activities, roles, and responsibilities are provided. In addition, guidelines are provided for
     establishing resource requirements for conducting a CBA IPI. The SEI Appraiser Program is
     discussed, detailing the requirements for persons qualified to lead CBA IPIs.

[24] Garcia, S. M. Evolving Improvement Paradigms: Capability Maturity Models and ISO/IEC
     15504 (PDTR). Pittsburgh, PA. Software Engineering Institute. 12 pgs.
            This paper describes the evolution of the structure and representation of Capability
     Maturity Models(sm) and various components of the ISO/IEC 15504 (PDTR) product set,
      formerly known as “SPICE”--Software Process Improvement and Capability dEtermination.
      “15504” will be used as shorthand for the product set encompassed by the 15504 project. The
     paper focuses on historical, structural, and conceptual evolution of the two product types.

[25] Hefley, W. E., and Curtis, B. (1998). People CMM®-Based Assessment Method Description
     Version 1.0. Pittsburgh, PA. Software Engineering Institute: 103 pgs.
              This document provides a high-level overview of the People Capability Maturity
      Model(sm) (People CMM®)-Based Assessment Method. It introduces the People CMM as a source of
     guidelines for improving the capability and readiness of an organization's workforce in the
     context of the IDEAL(sm) approach to process improvement. In order to measure the capability
     and maturity of an organization's workforce practices, an appraisal method has been developed
     for the People CMM. This document describes the requirements and methods for the People
     CMM-Based Assessment Method. This method is a diagnostic tool that supports, enables, and
      encourages an organization’s commitment to improving its ability to attract, develop, motivate,
     organize, and retain the talent needed to steadily improve its organizational capability. The
     method helps an organization gain insight into its workforce capability by identifying strengths
     and weaknesses of its current practices related to the People CMM. The method focuses on
      identifying improvements that are most beneficial, given an organization’s business goals and
     current maturity level. Brief descriptions of the method activities, roles, and responsibilities are
     provided. The SEI Appraiser Program is discussed, detailing the requirements for persons
     qualified to lead People CMM-Based Assessments.

[26] Humphrey, W. S., and Sweet, W.L. (September 1987). A Method for Assessing the Software
     Engineering Capability of Contractors, Preliminary Version. Pittsburgh, PA. Software
     Engineering Institute. CMU/SEI-87-TR-23, ESD-TR-87-186. 40 pgs.
             This document provides guidelines and procedures for assessing the ability of potential
     DoD contractors to develop software in accordance with modern software engineering methods.
      It includes specific questions and a method for evaluating the results.

[27] Masters, S., and Bothwell, C. (1995). CMM Appraisal Framework, Version 1.0. Pittsburgh, PA.
     Software Engineering Institute: 76 pgs.
             This technical report describes version 1.0 of the CMM Appraisal Framework (CAF).
     This framework describes the common requirements used by the CMM-Based Appraisal (CBA)
     project in developing appraisal methods based on the Capability Maturity Model (CMM) for
     Software, Version 1.1 [Paulk 93a]. The CAF provides a framework for rating the process
     maturity of an organization against the CMM. The CAF includes a generic appraisal architecture
     for CMM-based appraisal methods and defines the requirements for developing CAF compliant
     appraisal methods.

[28] Miller, S. (2000). People Capability Maturity Model ® Baseline Maturity Profile. Pittsburgh,
     PA. Software Engineering Institute.
              Slide presentation consists of the following: People Capability Maturity Model v1.0
      released in September 1995; first People CMM assessment conducted in March 1996; there are
      eight SEI-authorized People CMM Lead Assessors; and 85 individuals have applied to become
      People CMM Lead Assessors.

[29] Paulk, M. C., Curtis, B., Chrissis, M.B., and Weber, C.V. Capability Maturity Model(sm) for
     Software, Version 1.1. Pittsburgh, PA, Software Engineering Institute: 82 pgs.
             This paper provides a technical overview of the Capability Maturity Model for Software
     and reflects Version 1.1. Specifically, this paper describes the process maturity framework of
     five maturity levels, the structural components that comprise the CMM, how the CMM is used in
     practice, and future directions of the CMM.
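
      As an illustrative aside (the five level names below are those of CMM v1.1 and are widely
      documented; the code itself is not from this paper), the staged maturity framework can be
      sketched as an ordered enumeration.

```python
from enum import IntEnum
from typing import Optional

# The five maturity levels of the CMM for Software v1.1; the one-line
# descriptions are paraphrases, and this sketch is illustrative only.

class MaturityLevel(IntEnum):
    INITIAL = 1      # ad hoc, occasionally chaotic processes
    REPEATABLE = 2   # basic project management; earlier successes can be repeated
    DEFINED = 3      # documented, standardized organization-wide process
    MANAGED = 4      # process and product quality measured quantitatively
    OPTIMIZING = 5   # continuous improvement driven by quantitative feedback

def next_level(current: MaturityLevel) -> Optional[MaturityLevel]:
    """In the staged model, organizations move up one level at a time."""
    if current < MaturityLevel.OPTIMIZING:
        return MaturityLevel(current + 1)
    return None

print(next_level(MaturityLevel.REPEATABLE))  # MaturityLevel.DEFINED
```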

[30] Whitney, R., Nawrocki, E., Hayes, W., and Siegel, J. (March 1994). Interim Profile:
      Development and Trial of a Method to Rapidly Measure Software Engineering Maturity Status.
     Pittsburgh, PA, Software Engineering Institute.
             Development of an interim profile (IP) method was driven by a business need to rapidly
     measure an organization’s software engineering process maturity between organizational
     software process assessments (SPAs). This document provides information about the process
     used to develop the method and a description of the method to software engineering process
     group (SEPG) members and practitioners responsible for diagnosing software process maturity.
     This document also addresses the next steps in the further development and use of the interim
     profile method.

[31] Zubrow, D., Hayes, W., Siegel, J., and Goldenson, D. (June 1994). Maturity Questionnaire.
     Pittsburgh, PA. Software Engineering Institute. CMU/SEI-94-SR-7.
             This questionnaire focuses solely on process issues, specifically those derived from the
     CMM. The questionnaire is organized by KPAs and covers all 18 KPAs of the CMM. It
     addresses each KPA goal in the CMM but not all of the key practices. This document covers the
      software process maturity questionnaire, a placard providing instructions on the response options
      for the questions, and a glossary.

MODELS AND ASSESSMENT METHODS
     Other

[32] Abrardo, A., Caldelli, R., Cowderoy, A., Donaldson, J., Granger, S., and van Veenendaal, E.
     (1998). The MultiSpace Application Priorities, ESPRIT Project 23066: 59 pgs.
             In the context of multimedia development, the quality of multimedia systems and titles
     can be defined in terms of a series of 21 quality sub-characteristics, each of which is measurable.
     Suggestions are made of which quality sub-characteristics are likely to be important for 7
     different cases. In the context of multimedia, the interpretation of quality needs to pay special
     attention to motivation, cost and interconnectivity. Issues concerning involvement and training
     are especially important for the author and content specialists. The production of multimedia
     involves a diverse range of organizations, project types, users, and technologies. A scheme is
     proposed for classifying these. In the case of users, this needs to address both stereotyping and
     classification.

[33] Baldrige National Quality Program (2000). Criteria for Performance Excellence: 60 pgs.
     http://www.quality.nist.gov
              The Malcolm Baldrige Criteria for Performance Excellence are the basis for
     organizational self-assessments, for making Awards, and for giving feedback to applicants. In
     addition, the Criteria have three other important roles in strengthening U.S. competitiveness: to
     help improve organizational performance practices and capabilities; to facilitate communication
     and sharing of best practices information among U.S. organizations of all types; and to serve as a
     working tool for understanding and managing performance, and guiding planning and training.

[34] Chapman, P., Kerber, R., Clinton, J., Khabaza, T., Reinartz, T., and Wirth, R. (1999). The
     CRISP-DM Process Model: 99 pgs.
             The CRISP-DM data mining methodology is described in terms of a hierarchical process
     model, consisting of sets of tasks described at four levels of abstraction (from general to
     specific): phase, generic task, specialized task, and process instance. At the top level, the data
     mining process is organized into a number of phases; each phase consists of several second-level
     generic tasks. This second level is called generic, because it is intended to be general enough to
     cover all possible data mining situations. The generic tasks are intended to be as complete and
     stable as possible. Complete means covering both the whole process of data mining and all
     possible data mining applications. Stable means that the model should be valid for yet
     unforeseen developments like new modeling techniques. The third level, the specialized task
     level, is the place to describe how actions in the generic tasks should be carried out in certain
     specific situations. For example, at the second level there might be a generic task called clean
     data. The third level would describe how this task differed in different situations, such as
     cleaning numeric values versus cleaning categorical values, or whether the problem type is
     clustering or predictive modeling. The description of phases and tasks as discrete steps
     performed in a specific order represents an idealized sequence of events. In practice, many of the
     tasks can be performed in a different order and it will often be necessary to repeatedly backtrack
     to previous tasks and repeat certain actions. This process model does not attempt to capture all of
     these possible routes through the data mining process because this would require an overly
     complex process model. The fourth level, the process instance, is a record of the actions,
      decisions, and results of an actual data mining engagement. A process instance is organized
      according to the tasks defined at the higher levels, but represents what actually happened in a
     particular engagement, rather than what happens in general.
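
      As an illustrative aside (not part of the CRISP-DM documentation itself; apart from the
      "clean data" example used in the annotation above, the names are placeholders), the four-level
      hierarchy can be sketched as a simple containment structure.

```python
from dataclasses import dataclass, field
from typing import List

# Minimal sketch of the CRISP-DM four-level hierarchy described above.

@dataclass
class SpecializedTask:
    """Level 3: how a generic task is carried out in a specific situation."""
    name: str  # e.g., "clean numeric values"

@dataclass
class GenericTask:
    """Level 2: general enough to cover all data mining situations."""
    name: str  # e.g., "clean data"
    specializations: List[SpecializedTask] = field(default_factory=list)

@dataclass
class Phase:
    """Level 1: top-level organization of the data mining process."""
    name: str
    generic_tasks: List[GenericTask] = field(default_factory=list)

@dataclass
class ProcessInstance:
    """Level 4: record of actions, decisions, and results of one engagement."""
    phase: str
    task: str
    outcome: str

# The "clean data" generic task specialized two ways, as in the text above.
clean_data = GenericTask("clean data", [
    SpecializedTask("clean numeric values"),
    SpecializedTask("clean categorical values"),
])
preparation = Phase("data preparation", [clean_data])
record = ProcessInstance("data preparation", "clean numeric values",
                         "imputed missing entries in one engagement")
```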

[35] Cooper, J, Fisher, M., and Sherer, S.W. (April 1999). Software Acquisition Capability Maturity
     Model (SA-CMM) Version 1.02. Software Engineering Institute, Carnegie Mellon University,
     CMU/SEI-99-TR-002, ESC-TR-99-002, 158 pgs.
     http://www.sei.cmu.edu/pub/documents/99.reports/pdf/99tr002.pdf
              Government and industry have the need to improve the maturity of their internal software
     acquisition processes. In order for organizations to make improvements, they must know the
     ultimate goal and what is required to achieve that goal. Additionally, progress toward achieving
     the goal must be measurable. A capability maturity model provides the framework needed to
     facilitate the desired improvement. The Software Acquisition Capability Maturity Model (SA-
     CMM) has been developed to provide such a framework. This new version incorporates change
     requests that have been received, as well as the results of lessons learned from conducting
     appraisals and from the use of Version 1.01.

[36] Cowderoy, A. (1998). Final Report of the MultiSpace Project: 11 pgs.
     www.brameur.co.uk/qpi/projects/multispace/
     www.mmhq.co.uk/mmhqeurope/multispace/
             The MultiSpace project has demonstrated that practices for defining and improving
     quality developed in engineering disciplines can be adapted and applied to the production of
      multimedia. Guidelines are now publicly available, and commercial methods and instruments
      have been produced. Commercial opportunities have been created exploiting “multimedia
      quality” in the form of specification, evaluation, and management services. Early uptake of the
      quality-oriented methods gives a competitive advantage. A new marketing company
     has been established as a result of MultiSpace. Europe now has a head start in this area, but work
     is now needed on new methods, standards development and dissemination actions such as the
     new Center for MultiMedia Quality resulting from the project.

[37] Customer Operations Performance Center (2000). COPC-2000 Standard: External Customer
     Service Provider, Release 3. Amherst, Customer Operations Performance Center, Inc. 48 pgs.
             The COPC-2000® Standard is used globally by both buyers (clients) and providers of
     customer-contact and fulfillment services (collectively called customer-service providers, or
     CSPs) to improve the service quality provided to end-users and to reduce costs. Clients use the
     Standard for developing Requests for Proposals from CSPs, selecting CSPs, and managing their
     relationships with CSPs. CSPs use the Standard for assessing and improving their processes and
     performance. While not a guarantee of success, experience has shown that CSPs who adopt the
     Standard achieve higher service and quality levels, and lower costs (than competitors). CSPs
      include customer-contact centers, i.e., operations that interact with end users via phone (call
      centers), electronic means (e-commerce centers), or traditional mail or fax; and fulfillment
      centers, i.e., operations that perform assembly and pick, pack, and ship activities. The Standard is
     used by both internal CSPs (i.e., those that interact with their company’s own end users) and
     third-party CSPs (i.e., those that interact with the end users of their clients). The version
     presented here is for third-party CSPs. Additional versions exist for internal CSPs, healthcare
     CSPs, and e-commerce providers.

[38] Daily, K., and Cowderoy, A. (Eds.) (1997). The MultiSpace Quality Framework, ESPRIT
     Project 23066: 138 pgs.
             The MultiSpace Framework describes what is meant by “multimedia quality” and
     considers the philosophy used in the construction of multimedia products. By following the
     various principles and methods described in the Framework, existing and new multimedia
     producers are able to ensure that their work will achieve higher quality. Quality Management is
     considered in terms of the good practice that may be developed with objective and target setting,
     evaluation and improved working processes. The different characteristics of end-users may be
     established and described, and their different needs and expectations can be assessed. These
     provide input to the definition of a set of quality objectives that satisfy the needs of the end-user,
     the content providers, the multimedia developer and their client (the publishing entity). The
      quality targets for individual activities are expressed in terms of “external” quality
     characteristics, which may be evaluated without any reference to the internal workings of the
     product, and various internal measures of the content and functionality.

[39] Donaldson, J. (1998). Report on the Promotion of Intellectual Debate, MultiSpace Workpackage
     WP7. Technology Transfer: 7 pgs.
             As intellectual debate is needed to validate new ideas, stimulate new research and assist
      commercial exploitation, the MultiSpace project has made it a priority to ensure that its results
     and findings have been given suitable public exposure in the appropriate forums for such
     research. MultiSpace has produced 16 papers, several of which have already been published or
     presented world-wide. Workshops have been convened to allow public debate on the project’s
     findings and there is evidence of the project having had a discernible influence on experts in the
     area of cultural multimedia.

[40] Dorling, A. (1995). Software Process Assessment - Parts 1 - 9, Version 1.00 (Formerly IG
     Version 1.00).
     http://www-sqi.cit.gu.edu.au/spice/
             The SPICE Project, Version 1.0, an International Standard, provides a framework for the
     assessment of software processes and consists of the following titles: Part 1: Concepts and
     introductory guide, Part 2: A model for process management, Part 3: Rating processes, Part 4:
     Guide to conducting assessment, Part 5: Construction, selection and use of assessment
     instruments and tools, Part 6: Qualification and training of assessors, Part 7: Guide for use in
     process improvement, Part 8: Guide for use in determining supplier process capability, Part 9:
     Vocabulary.

[41] Earthy, J. (1999). Usability Maturity Model: Processes, Version 2.2 (Trump Version): 86 pgs.
            A process model for human-centred activities in the system lifecycle based on ISO
     13407, the British HCI Group ISM, the Philips HPI model and Eason and Harker's human
      system maturity model. The background to the model is described. The model describes seven
      processes, each defined by a set of base practices. The base practices are defined. A set of work
     products are given for each process. A summary is provided of the ISO 15504 scale for the
     assessment of the maturity of processes. The uses of the model are outlined. A recording form
      is supplied and its use described. Mappings of the base practices to processes in SPICE, CMM
      and SE-CMM are provided. The process model is conformant to ISO 15504. This version was
     prepared for INTERACT'99.

[42] Earthy, J. (1998). Usability Maturity Model: Human Centredness Scale, Version 1.2,
     Information Engineering Usability Support Centres: 34 pgs.
              An organizational human-centredness maturity scale based on Flanagan's Usability
      Leadership scale, Sherwood-Jones' Total System maturity model, and ISO 13407. The
     background to the scale is described. The scale has six levels defined by a set of attributes. Each
     attribute is defined by one or more management practices performed at that level. The
     management practices are defined. The uses of the scale are outlined. A recording form is
     supplied and its use described. Some indicators of personal attitude at each level are given. The
      scale is conformant to ISO 15504. This version was specially prepared for the TRUMP project.

[43] Muns, R. (2000). Certified Support Center Model and Standards Document, Draft Version 8.0,
     Help Desk Institute: 23 pgs.
     www.HelpDeskInst.com
             HDI's Certified Support Center (CSC) program has been designed to conform to existing
     international quality standards, such as the EFQM (European Foundation for Quality
      Management), the Malcolm Baldrige National Quality Award, and ISO 9000. The model is
      based upon the European Foundation for Quality Management (EFQM) framework, with
      modifications to adapt the standards to be specific to the quality standards requirements of
      support center organizations. This includes eight model elements with standards within each
      element. The CSC standards are analogous to ISO 9000 in that they require quality processes and
     procedures.

[44] Niessink, F., and van Vliet, H. (December 1999). The Vrije Universiteit IT Service Capability
     Maturity Model. Vrije Universiteit Amsterdam Technical Report IR-463, Release L2-1.0, 77 pgs.
             This document describes the Vrije Universiteit Information Technology Service
     Capability Maturity Model, or IT Service CMM for short. The IT Service CMM is a capability
     maturity model that specifies different maturity levels for organizations that provide IT services.
     Examples of IT services are the maintenance of software systems, operation of information
     systems, the management and maintenance of workstations, networks or mainframes, or the
     provision of contingency services. An important question is how these services should be defined
     and managed. The complexity of IT applications makes it difficult to properly tune customer
     requirements and service provider capabilities. Customers often cannot express their real service
     requirements and do not know the corresponding performance needs. Likewise, service providers
     often do not know how to differentiate between IT services and how to attune them to a specific
     customer. The IT Service CMM is aimed at enabling IT service providers to assess their
     capabilities with respect to the delivery of IT services and to provide IT service providers with
     directions and steps for further improvement of their service capability.

[45] Olson, T. G., Humphrey, W.S., and Kitson, D.H. (February 1989). Conducting SEI-Assisted
     Software Process Assessments. Pittsburgh, PA. Software Engineering Institute. CMU/SEI-89-
      TR-7, ESD-TR-89-07. 52 pgs.
              This report describes software process assessment as it is performed in organizations with
      the assistance of the Software Engineering Institute (SEI). A software process assessment is an
      appraisal or review of an organization’s software process (e.g., software development process).
      The main objectives of such an assessment are to understand the state of practice in an
      organization, to identify key areas for improvement, and to initiate the actions that facilitate
      those improvements. This report is specifically addressed to the organizations and assessment
      team members that may be involved in SEI-assisted software process assessments.

OUTSOURCING

[46] ---------- (November/December 1995). And now for something different…Users speak out on
      Outsourcing. InfoServer.
               Customers and potential customers attending InfoServer's 2nd Annual Conference shared
      a strong interest in what could be called the 3Ms of outsourcing: management, measurement, and
      mercy.

[47] ---------- (1996). Electronic College of Process Innovation - Index of Articles and Case Studies.
     http://www.c3i.osd.mil/bpr/bprcd/mltc040.htm
              The topic for this electronic source is Strategic Alliances and Partnering, covering the
     following subjects: Case Studies, Concepts, How-To, Interviews, Lessons Learned, Research,
     and Tools.

[48] ---------- (2000). “Managing Vendors: Tips for Success.” Harvard Business Review: 1.
     http://www.hbsp.harvard.edu/hbsp/prod_detail.asp?U0003C
              With the growth of outsourcing, more and more businesspeople are responsible for
     managing relationships with suppliers. But what happens when the vendor doesn't deliver on
     time? What happens when it screws up on quality? Disaster can be avoided by managing your
     vendor as if it were a department in your company, making sure the contract allows you to get
     the information you need to judge the vendor's performance, and monitoring the relationship and
     changing the contract as necessary. Includes a sidebar entitled "When Things Go Wrong."

[49] ---------- Outsourcing Strategies, Volumes 1 and 2.
     http://www.bestpractice.haynet.com/reports/outsrc.html
              As a follow-up to The UK Outsourcing Report, Management Today and Market Tracking
     International have added a substantial and detailed supplementary report covering all aspects of
      outsourcing IT, from the operating environment and key drivers to how to manage the IT
     outsourcing process, providing up-to-date and succinct case studies across a variety of market
     sectors. The IT Outsourcing Supplementary Report aims to build on the material in Volume One
     and provide more detailed coverage of this topical and far-reaching subject.

[50] ---------- (1997). The Decision to Outsource - An Australian Government Report, The Parliament
     of the Commonwealth of Australia, Senate Finance and Public Administration References
     Committee.
     http://www.aph.gov.au/senate/committee/fapa_ctte/outsourcing/ch1_0.htm
              The ever growing pressure on governments throughout the 1980s and 1990s to control
     public expenditure has focused attention on outsourcing. Much of the evidence to the committee
     tended to argue for or against outsourcing almost as a matter of principle. The committee
     observes that such an approach is not helpful. In practice, outsourcing is just one option available
     to managers to pursue their agencies' objectives. In the specific area of IT the Commonwealth
     already sources a significant proportion of goods and services from the private sector. Virtually
     all hardware and a large proportion of software is purchased from the private sector and a
      proportion of other services - applications development, maintenance, training, help desk
      services - are also provided by private contractors. Thus, particularly with regard to the supply of
     hardware, the government's IT outsourcing initiatives represent a change of process from a
     decentralized agency focused model to a consolidated, integrated, service wide approach rather
     than a dramatic shift from public to private provision. This is not to suggest that the changes
     underway are not significant. The scale of outsourcing is greater than anything previously
     undertaken in the Australian IT market. The provision of services to agencies by private
     contractors where they were previously provided internally by the agencies' own employees
     represents a major change in the way the public sector works.

[51] ---------- (April 1995). The Smart Vendor Audit Checklist. Inc.
              Four years ago, hot tub builder Softub contracted out the assembly of the motor, pump,
     and control unit that provides the heat and jet action for the hot tubs. Chief executive Tom
     Thornbury felt comfortable about the vendor after meeting with the owner twice and because
     other customers raved about it. It was not long before equipment failures surfaced during testing
     and in customers' homes. Two years ago, after several other incidents involving substandard
     parts, Softub decided to do things differently. The big change was an audit team, led by
     purchasing agent Gary Anderson, which goes out to grill vendor candidates. To ensure that the
     audits would be effective, Anderson designed a vendor audit survey form which acts primarily as
      a checklist. The form forces the team to focus on specific areas so the team does not forget
     anything when it is on a visit. The payoff is that Softub is recruiting a better breed of supplier.

[52] Achstatter, G.A. (August 19, 1997). Executive Update. Investors Business Daily.
             According to the May/June, 1997 issue of Banking Strategies, people, in addition to
     technology, make the difference between successful outsourcing and the also-rans. Certain
     factors are critical to the management of outsourcing companies. The company must be willing
      to invest the necessary resources to train the company's employees in the workings of the vendor's
     new system. The vendor should provide timely answers and support when the customer requires
     service. The customer or the vendor should have plans in place to conduct annual performance
     audits, or reviews, under the new system.

[53] Anthes, G.H. (April 7, 1997). Net Outsourcing, A Risky Proposition. Computerworld.
            Internet outsourcing is inherently difficult to manage because of its rapid rate of change
     and inadequately defined electronic commerce company strategies. There are several views that
     should help manage this type of contract. Sajoo Samuel, an Assistant Vice President of the First
     Chicago Trust in Jersey City, N.J., makes the outsourcing strategy work through close day-to-
     day management of the external relationships. An internal project manager works with the
     outsourcers daily while serving as liaison with internal business units. The project manager and
     outsourcer follow a schedule they have jointly committed to. When outsourcing to multiple
     vendors, policies must be in place for change management, that is, who will accept changes from
     whom. Attention must be paid to contract terms and service level guarantees.

[54] Applegate, L. M., and Collura, M. (January 3, 2001). Amazon.com—2000. Harvard Business
     Review #801194, 23 pgs.
     http://www.hbsp.harvard.edu/hbsp/prod_detail.asp?801194
             Enables a thorough analysis of Amazon.com and the company’s value proposition, in
      terms of its business concept, digital business capabilities, and community and shareholder
      value. Examines the company’s complex set of business models and web of business
     relationships, as well as Amazon’s plan to monetize (generate revenues and earnings through) its
     assets.

[55] Applegate, L. M., and Davis, K. (1995). “Xerox: Outsourcing Global Information Technology
     Resources.” Harvard Business Review 9-195-158: 31 pgs.
     http://www.hbsp.harvard.edu/hbsp/prod_detail.asp?195158
             In order to increase revenues, develop new technologies, and manage information
     technology more efficiently, Xerox decided to sign a 10-year, $3.2 billion contract with
     Electronic Data Systems (EDS). This case describes the events that preceded Xerox's decision to
     outsource information technology.

[56] Autor, D. H. (2000). Outsourcing at Will: Unjust Dismissal Doctrine and the Growth of
     Temporary Help Employment. Cambridge, MA, National Bureau of Economic Research: 51 pgs.
     http://www.nber.org/papers/w7557
             The U.S. temporary help services (THS) industry grew at 11 percent annually between
     1979 – 1995, five times more rapidly than non-farm employment. Contemporaneously, courts in
      46 states adopted exceptions to the common law doctrine of employment at will that limited
      employers’ discretion to terminate workers and opened them to litigation. This paper assesses
     whether the decline of employment at will and the growth of THS are causally related. To aid the
     analysis, the paper considers a simple model of employment outsourcing, the primary
     implication of which is that firms will respond to externally imposed firing costs by outsourcing
     positions requiring the least firm-specific skills rather than those with the highest expected
     termination costs. The empirical analysis indicates that one class of exception, the implied
     contractual right to ongoing employment, led to 14 – 22 percent excess temporary help growth in
     adopting states. Unjust dismissal doctrines did not significantly contribute to employment growth
     in other business service industries. Temporary help employment is closely correlated with union
     penetration, with states experiencing the least rapid decline in unionization undergoing
     substantially faster THS growth. The decline of employment at will explains as much as 20
     percent of the growth of THS between 1973 – 1995 and accounts for 336,000 to 494,000
     additional workers employed in THS on a daily basis as of 1999.
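
      As a back-of-envelope illustration (the growth rates are taken from the abstract above; the
      compounding arithmetic is ours, not the paper's), an 11 percent annual rate sustained over the
      sixteen years from 1979 to 1995 compounds to roughly a five-fold increase in THS employment.

```python
# Back-of-envelope check of the growth rates cited in the abstract above.
# Rates are from the abstract; the compounding arithmetic is illustrative.

ths_rate = 0.11               # THS industry: ~11% annual growth, 1979-1995
nonfarm_rate = ths_rate / 5   # "five times more rapidly" than non-farm
years = 1995 - 1979           # sixteen years

ths_multiple = (1 + ths_rate) ** years          # ~5.3x cumulative growth
nonfarm_multiple = (1 + nonfarm_rate) ** years  # ~1.4x cumulative growth

print(f"THS employment multiple over {years} years: {ths_multiple:.1f}x")
print(f"Non-farm employment multiple over {years} years: {nonfarm_multiple:.1f}x")
```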

[57] Balaguer, N. S. (1990). “Sears, Roebuck and Co.: Outsourcing Within the Company (A).”
     Harvard Business Review 9-191-015: 18 pgs.
     http://www.hbsp.harvard.edu/hbsp/prod_detail.asp?191015
             In early 1988, Charles Moran, CIO of Sears, Roebuck and Co. was reviewing the
     evolution of communications and data processing activities at Sears. Recognition of
     communications network consolidation opportunities across its diverse and highly autonomous
      business groups (Sears Merchandising, Allstate Insurance, Dean Witter, and Coldwell Banker) in
     the early 1980s had led the company to form the highly successful Sears Communication
     Network (SCN) in 1983. During the past five years, the network contributed to lowered unit
     costs, improved service, reliable and expandable capability, and timely development of new
     applications. Now, Moran and his colleagues were contemplating data processing consolidation
      opportunities across business groups. The key was whether the business groups would agree to a
      centrally-managed utility and concentrate their information technology efforts on business
     applications development activities.

[58] Balaguer, N. S. (1990). “Sears, Roebuck and Co.: Outsourcing Within the Company (B).”
     Harvard Business Review 9-191-016: 3 pgs.
     http://www.hbsp.harvard.edu/hbsp/prod_detail.asp?191016
             Describes the challenges faced by Charles Carlson, president of Sears Technology
     Services, Inc. (STS). STS was the result of Charles Moran's data processing consolidation
     activities for three of Sears' four business groups. Leaves students with two questions: How does
     one implement a corporate utility concept in a highly decentralized and autonomous
     organization; and how does one make STS a viable and valuable organization for the Sears
     family?

[59] Balaguer, N. S. (1990). “Sears, Roebuck and Co.: Outsourcing Within the Company (C).”
     Harvard Business Review 9-191-017: 1 pg.
     http://www.hbsp.harvard.edu/hbsp/prod_detail.asp?191017
             Describes the results of data consolidation efforts by Sears Technology Services, Inc.
     (STS). Provokes discussion of the drawbacks associated with concentrating one's efforts on
     consolidation activities in lieu of more strategic activities when financial pressures dictate cost
     savings. Leaves students with a series of questions: 1) How does one ensure that STS is
     recognized by the business groups as more than a corporate data processing utility? 2) How
     should Carlson balance his organization's resources in relation to the data consolidation efforts
     and the role of technology leaders? and 3) What goals should he communicate to the chairman of
     Sears for STS for the 1990s?

[60] Baldwin, C. Y., and Clark, K.B. (1997). “Managing in an Age of Modularity.” Harvard Business
     Review: 10 pgs.
     http://www.hbsp.harvard.edu/hbsp/prod_detail.asp?97502
             Modularity is a familiar principle in the computer industry. Different companies can
     independently design and produce components, such as disk drives or operating software, and
     those modules will fit together into a complex and smoothly functioning product because makers
     obey a given set of design rules. As businesses as diverse as auto manufacturing and financial
     services move toward modular designs, the authors say, competitive dynamics will change
     enormously. Leaders in a modular industry will control less, so they will have to watch the
     competitive environment closely for opportunities to link up with other module makers. They
     will also need to know more: engineering details that seemed trivial at the corporate level may
     now play a large part in strategic decisions. Leaders will also become knowledge managers
     internally because they will need to coordinate the efforts of development groups in order to
     keep them focused on the modular strategies the company is pursuing.

[61] Barrett, R. (1996). Outsourcing Success Means Making the Right Moves, Reengineering
     Resource Center, The Outsourcing Institute.
     www.reengineering.com/articles/jul96/InfoManagement.html
             Stand inside the information management department of any large organization and you'll
      hear the mantra: outsource, outsource, outsource. Depending on how you approach it, off-loading