Monday, June 25, 2007

Configuration Management: CMMI Maturity Level 2

Purpose
The purpose of Configuration Management is to establish and maintain the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration audits.


Introductory Notes
The Configuration Management process area involves the following:
· Identifying the configuration of selected work products that compose the baselines at given points in time
· Controlling changes to configuration items
· Building or providing specifications to build work products from the configuration management system
· Maintaining the integrity of baselines
· Providing accurate status and current configuration data to developers, end users, and customers

The work products placed under configuration management include the products that are delivered to the customer, designated internal work products, acquired products, tools, and other items that are used in
creating and describing these work products.

For Supplier Sourcing
Acquired products may need to be placed under configuration management by both the supplier and the project. Provisions for conducting configuration management should be established in supplier agreements. Methods to ensure that the data is complete and consistent should be established and maintained.


Examples of work products that may be placed under configuration management include the following:
· Plans
· Process descriptions
· Requirements
· Design data
· Drawings
· Product specifications
· Code
· Compilers
· Product data files
· Product technical publications


Configuration management of work products may be performed at several levels of granularity. Configuration items can be decomposed into configuration components and configuration units. Only the term "configuration item" is used in this process area. Therefore, in these practices, "configuration item" may be interpreted as "configuration component" or "configuration unit" as appropriate.

Baselines provide a stable basis for continuing evolution of configuration items.
An example of a baseline is an approved description of a product that includes internally consistent versions of requirements, requirement traceability matrices, design, discipline-specific items, and end-user documentation.
Baselines are added to the configuration management system as they are developed. Changes to baselines and the release of work products built from the configuration management system are systematically
controlled and monitored via the configuration control, change management, and configuration auditing functions of configuration management.
This process area applies not only to configuration management on projects, but also to configuration management on organization work products such as standards, procedures, and reuse libraries.
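To make the baseline idea concrete, here is a minimal sketch (in Python, purely illustrative; the class and field names are assumptions, not CMMI terminology) of configuration items grouped into a labeled, immutable baseline:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ConfigurationItem:
        """A work product (or component/unit) placed under configuration management."""
        name: str       # e.g., "requirements.doc", "module_a.c"
        version: str    # revision identifier of this item

    @dataclass(frozen=True)
    class Baseline:
        """An approved, internally consistent set of configuration item versions."""
        label: str
        items: tuple    # tuple of ConfigurationItem; frozen so the baseline cannot drift

    baseline_1 = Baseline(
        label="PRODUCT-BL-1.0",
        items=(
            ConfigurationItem("requirements.doc", "3.2"),
            ConfigurationItem("design.doc", "2.0"),
            ConfigurationItem("module_a.c", "1.7"),
        ),
    )

The point of the sketch is only that a baseline references specific versions of specific items and is never edited in place; approved changes produce new item versions and, eventually, a new baseline, while the old baseline remains available as a stable reference.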

Configuration management is focused on the rigorous control of the managerial and technical aspects of work products, including the delivered system.


This process area covers the practices for performing the configuration management function and is applicable to all work products that are placed under configuration management.

Specific and Generic Goals
SG 1 Establish Baselines
Baselines of identified work products are established.


SG 2 Track and Control Changes
Changes to the work products under configuration management are tracked and controlled.

SG 3 Establish Integrity
Integrity of baselines is established and maintained.

GG 2 Institutionalize a Managed Process
The process is institutionalized as a managed process.
(The following goal is not required for maturity level 2, but required for maturity level 3 and
above.)
GG 3 Institutionalize a Defined Process
The process is institutionalized as a defined process.


Practice-to-Goal Relationship Table
SG 1 Establish Baselines
SP 1.1 Identify Configuration Items
SP 1.2 Establish a Configuration Management System
SP 1.3 Create or Release Baselines


SG 2 Track and Control Changes
SP 2.1 Track Change Requests
SP 2.2 Control Configuration Items

SG 3 Establish Integrity
SP 3.1 Establish Configuration Management Records
SP 3.2 Perform Configuration Audits

GG 2 Institutionalize a Managed Process
GP 2.1 (CO 1) Establish an Organizational Policy
GP 2.2 (AB 1) Plan the Process
GP 2.3 (AB 2) Provide Resources
GP 2.4 (AB 3) Assign Responsibility
GP 2.5 (AB 4) Train People
GP 2.6 (DI 1) Manage Configurations
GP 2.7 (DI 2) Identify and Involve Relevant Stakeholders
GP 2.8 (DI 3) Monitor and Control the Process
GP 2.9 (VE 1) Objectively Evaluate Adherence
GP 2.10 (VE 2) Review Status with Higher Level Management
(The following goal is not required and its practices are not expected for a maturity level 2 rating,
but are required and expected for a maturity level 3 rating and above.)


GG 3 Institutionalize a Defined Process
GP 3.1 Establish a Defined Process
GP 3.2 Collect Improvement Information


Process And Product Quality Assurance: Institutionalize a Managed Process

The process is institutionalized as a managed process.

Commitment to Perform
GP 2.1 (CO 1) Establish an Organizational Policy
Establish and maintain an organizational policy for planning and performing the process and product quality assurance process.

Elaboration:
This policy establishes organizational expectations for objectively evaluating whether processes and associated work products adhere to the applicable process descriptions, standards, and procedures, and ensuring that noncompliance is addressed.
This policy also establishes organizational expectations for process and product quality assurance being in place for all projects. Process and product quality assurance must possess sufficient independence from project management to provide objectivity in identifying and reporting noncompliance issues.

Ability to Perform
GP 2.2 (AB 1) Plan the Process
Establish and maintain the plan for performing the process and product quality assurance process.

Elaboration:
This plan for performing the process and product quality assurance process may be included in (or referenced by) the project plan, which is described in the Project Planning process area.

GP 2.3 (AB 2) Provide Resources
Provide adequate resources for performing the process and product quality assurance process, developing the work products, and providing the services of the process.

Examples of resources provided include the following tools:
· Evaluation tools
· Noncompliance tracking tool

GP 2.4 (AB 3) Assign Responsibility
Assign responsibility and authority for performing the process, developing the work products, and providing the services of the process and product quality assurance process.

Elaboration:
To guard against subjectivity or bias, ensure that those people assigned responsibility and authority for process and product quality assurance can perform their evaluations with sufficient independence and objectivity.

GP 2.5 (AB 4) Train People
Train the people performing or supporting the process and product quality assurance process as needed.

Elaboration:
Examples of training topics include the following:
· Application domain
· Customer relations
· Process descriptions, standards, procedures, and methods for the project
· Quality assurance objectives, process descriptions, standards, procedures, methods, and tools

Directing Implementation

GP 2.6 (DI 1) Manage Configurations
Place designated work products of the process and product quality assurance process under appropriate levels of configuration management.

Elaboration:
Examples of work products placed under configuration management include the following:
· Noncompliance reports
· Evaluation logs and reports

GP 2.7 (DI 2) Identify and Involve Relevant Stakeholders
Identify and involve the relevant stakeholders of the process and product quality assurance process as planned.

Elaboration:
Examples of activities for stakeholder involvement include the following:
· Establishing criteria for the objective evaluations of processes and work products
· Evaluating processes and work products
· Resolving noncompliance issues
· Tracking noncompliance issues to closure

GP 2.8 (DI 3) Monitor and Control the Process
Monitor and control the process and product quality assurance process against the plan for performing the process and take appropriate corrective action.

Elaboration:
Examples of measures used in monitoring and controlling include the following:
· Variance of objective process evaluations planned and performed
· Variance of objective work product evaluations planned and performed

Verifying Implementation
GP 2.9 (VE 1) Objectively Evaluate Adherence
Objectively evaluate adherence of the process and product quality assurance process against its process description, standards, and procedures, and address noncompliance.

Elaboration:
Examples of activities reviewed include the following:
· Objectively evaluating processes and work products
· Tracking and communicating noncompliance issues
Examples of work products reviewed include the following:
· Noncompliance reports
· Evaluation logs and reports

GP 2.10 (VE 2) Review Status with Higher Level Management
Review the activities, status, and results of the process and product quality assurance process with higher level management and resolve issues.
(The following goal is not required and its practices are not expected for a maturity level 2 rating, but are required for a maturity level 3 rating and above.)

GG 3 Institutionalize a Defined Process
The process is institutionalized as a defined process.

GP 3.1 Establish a Defined Process
Establish and maintain the description of a defined process and product quality assurance process.

GP 3.2 Collect Improvement Information
Collect work products, measures, measurement results, and improvement information derived from planning and performing the process and product quality assurance process to support the
future use and improvement of the organization’s processes and process assets.

Process And Product Quality Assurance: Specific Practices by Goal SG2

Provide Objective Insight
Noncompliance issues are objectively tracked and communicated, and resolution is ensured.

SP 2.1 Communicate and Ensure Resolution of Noncompliance Issues
Communicate quality issues and ensure resolution of noncompliance issues with the staff and managers.
Noncompliance issues are problems identified in evaluations that reflect a lack of adherence to applicable standards, process descriptions, or procedures. The status of noncompliance issues provides an indication
of quality trends. Quality issues include noncompliance issues and results of trend analysis.
When local resolution of noncompliance issues cannot be obtained, use established escalation mechanisms to ensure that the appropriate level of management can resolve the issue. Track noncompliance issues to
resolution.
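
As an illustration of this tracking-to-resolution flow, here is a minimal sketch (Python; the statuses and field names are assumptions, not part of the model) of a noncompliance record that is either resolved within the project or escalated:

    from dataclasses import dataclass
    from enum import Enum

    class Status(Enum):
        OPEN = "open"
        RESOLVED_IN_PROJECT = "resolved in project"
        ESCALATED = "escalated to management"
        CLOSED = "closed"

    @dataclass
    class NoncomplianceIssue:
        """One noncompliance issue identified during an evaluation."""
        identifier: str
        description: str
        violated_item: str          # the standard, process description, or procedure violated
        status: Status = Status.OPEN
        resolution: str = ""

        def resolve_locally(self, resolution: str) -> None:
            # Preferred path: fix the noncompliance, change the violated item, or obtain a waiver.
            self.resolution = resolution
            self.status = Status.RESOLVED_IN_PROJECT

        def escalate(self) -> None:
            # Used when the issue cannot be resolved within the project.
            self.status = Status.ESCALATED

    def unresolved(issues):
        """Issues still requiring tracking; their status over time feeds the quality-trend analysis."""
        return [i for i in issues if i.status in (Status.OPEN, Status.ESCALATED)]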

Typical Work Products
1. Corrective action reports
2. Evaluation reports
3. Quality trends

Subpractices
1. Resolve each noncompliance with the appropriate members of the staff where possible.

2. Document noncompliance issues when they cannot be resolved within the project.
Examples of ways to resolve noncompliance within the project include the following:
· Fixing the noncompliance
· Changing the process descriptions, standards, or procedures that were violated
· Obtaining a waiver to cover the noncompliance issue


3. Escalate noncompliance issues that cannot be resolved within the project to the appropriate level of management designated to receive and act on noncompliance issues.

4. Analyze the noncompliance issues to see if there are any quality trends that can be identified and addressed.

5. Ensure that relevant stakeholders are aware of the results of evaluations and the quality trends in a timely manner.

6. Periodically review open noncompliance issues and trends with the manager designated to receive and act on noncompliance issues.

7. Track noncompliance issues to resolution.

SP 2.2 Establish Records
Establish and maintain records of the quality assurance activities.

Typical Work Products
1. Evaluation logs
2. Quality assurance reports
3. Status reports of corrective actions
4. Reports of quality trends

Subpractices
1. Record process and product quality assurance activities in sufficient detail such that status and results are known.

2. Revise the status and history of the quality assurance activities as necessary.

Tuesday, June 19, 2007

Process and Product Quality Assurance: Specific Practices by Goal SG1

SG 1 Objectively Evaluate Processes and Work Products
Adherence of the performed process and associated work products and services to applicable process descriptions, standards, and procedures is objectively evaluated.


SP 1.1 Objectively Evaluate Processes
Objectively evaluate the designated performed processes against the applicable process descriptions, standards, and procedures.
Objectivity in quality assurance evaluations is critical to the success of the project. A description of the quality assurance reporting chain and how it ensures objectivity should be defined.


Typical Work Products
1. Evaluation reports
2. Noncompliance reports
3. Corrective actions

Subpractices
1. Promote an environment (created as part of project management) that encourages employee participation in identifying and reporting quality issues.

2. Establish and maintain clearly stated criteria for the evaluations.
The intent of this subpractice is to provide criteria, based on business needs, such as the following:
· What will be evaluated
· When or how often a process will be evaluated
· How the evaluation will be conducted
· Who must be involved in the evaluation


3. Use the stated criteria to evaluate performed processes for adherence to process descriptions, standards, and procedures.

4. Identify each noncompliance found during the evaluation.

5. Identify lessons learned that could improve processes for future products and services.


SP 1.2 Objectively Evaluate Work Products and Services
Objectively evaluate the designated work products and services against the applicable process descriptions, standards, and procedures.


Typical Work Products
1. Evaluation reports
2. Noncompliance reports
3. Corrective actions

Subpractices
1. Select work products to be evaluated, based on documented sampling criteria if sampling is used.
2. Establish and maintain clearly stated criteria for the evaluation of work products.
The intent of this subpractice is to provide criteria, based on business needs, such as the following:
· What will be evaluated during the evaluation of a work product
· When or how often a work product will be evaluated
· How the evaluation will be conducted
· Who must be involved in the evaluation
3. Use the stated criteria during the evaluations of work products.
4. Evaluate work products before they are delivered to the customer.
5. Evaluate work products at selected milestones in their development.
6. Perform in-progress or incremental evaluations of work products and services against process descriptions, standards, and procedures.

7. Identify each case of noncompliance found during the evaluations.
8. Identify lessons learned that could improve processes for future products and services.

Process and Product Quality Assurance: CMMI Maturity Level 2

Purpose
The purpose of Process and Product Quality Assurance is to provide staff and management with objective insight into processes and associated work products.


Introductory Notes
The Process and Product Quality Assurance process area involves the following:
· Objectively evaluating performed processes, work products, and services against the applicable process descriptions, standards, and procedures
· Identifying and documenting noncompliance issues
· Providing feedback to project staff and managers on the results of quality assurance activities
· Ensuring that noncompliance issues are addressed

The Process and Product Quality Assurance process area supports the delivery of high-quality products and services by providing the project staff and managers at all levels with appropriate visibility into, and feedback on, processes and associated work products throughout the life of the project.
The practices in the Process and Product Quality Assurance process area ensure that planned processes are implemented, while the practices in the Verification process area ensure that the specified requirements are satisfied. These two process areas may on occasion address the same work product but from different perspectives. Projects should take care to minimize duplication of effort.

Objectivity in process and product quality assurance evaluations is critical to the success of the project. Objectivity is achieved by both independence and the use of criteria. Traditionally, a quality assurance
group that is independent of the project provides this objectivity. It may be appropriate in some organizations, however, to implement the process and product quality assurance role without that kind of
independence. For example, in an organization with an open, quality-oriented culture, the process and product quality assurance role may be performed, partially or completely, by peers; and the quality assurance function may be embedded in the process.
If quality assurance is embedded in the process, several issues must be addressed to ensure objectivity. Everyone performing quality assurance activities should be trained in quality assurance. Those performing
quality assurance activities for a work product should be separate from those directly involved in developing or maintaining the work product. An independent reporting channel to the appropriate level of
organizational management must be available so that noncompliance issues may be escalated as necessary.
Quality assurance should begin in the early phases of a project to establish plans, processes, standards, and procedures that will add value to the project and satisfy the requirements of the project and the
organizational policies. Those performing quality assurance participate in establishing the plans, processes, standards, and procedures to ensure that they fit the project’s needs and that they will be useable for
performing quality assurance evaluations. In addition, the specific processes and associated work products that will be evaluated during the project are designated. This designation may be based on sampling
or on objective criteria that are consistent with organizational policies and project requirements and needs.
When noncompliance issues are identified, they are first addressed within the project and resolved there if possible. Any noncompliance issues that cannot be resolved within the project are escalated to an appropriate level of management for resolution.
This process area primarily applies to evaluations of products and services, but it also applies to evaluations of nonproject activities and work products such as training activities. For these activities and work products, the term "project" should be appropriately interpreted.


Specific and Generic Goals

SG 1 Objectively Evaluate Processes and Work Products
Adherence of the performed process and associated work products and services to applicable process descriptions, standards, and procedures is objectively evaluated.


SG 2 Provide Objective Insight
Noncompliance issues are objectively tracked and communicated, and resolution is ensured.


GG 2 Institutionalize a Managed Process
The process is institutionalized as a managed process.
(The following goal is not required for maturity level 2, but required for maturity level 3 and
above.)


GG 3 Institutionalize a Defined Process
The process is institutionalized as a defined process.


Practice-to-Goal Relationship Table


SG 1 Objectively Evaluate Processes and Work Products
SP 1.1 Objectively Evaluate Processes
SP 1.2 Objectively Evaluate Work Products and Services


SG 2 Provide Objective Insight
SP 2.1 Communicate and Ensure Resolution of Noncompliance Issues
SP 2.2 Establish Records


GG 2 Institutionalize a Managed Process
GP 2.1 (CO 1) Establish an Organizational Policy
GP 2.2 (AB 1) Plan the Process
GP 2.3 (AB 2) Provide Resources
GP 2.4 (AB 3) Assign Responsibility
GP 2.5 (AB 4) Train People
GP 2.6 (DI 1) Manage Configurations
GP 2.7 (DI 2) Identify and Involve Relevant Stakeholders
GP 2.8 (DI 3) Monitor and Control the Process
GP 2.9 (VE 1) Objectively Evaluate Adherence
GP 2.10 (VE 2) Review Status with Higher Level Management

(The following goal is not required and its practices are not expected for a maturity level 2 rating, but are required and expected for a maturity level 3 rating and above.)

GG 3 Institutionalize a Defined Process

GP 3.1 Establish a Defined Process
GP 3.2 Collect Improvement Information

Sunday, June 17, 2007

Measurement and Analysis: Institutionalize a Managed Process

GG 2 Institutionalize a Managed Process
The process is institutionalized as a managed process.

Commitment to Perform
GP 2.1 (CO 1) Establish an Organizational Policy
Establish and maintain an organizational policy for planning and performing the measurement and analysis process.
Elaboration:
This policy establishes organizational expectations for aligning measurement objectives and activities with identified information needs and objectives and for providing measurement results.

Ability to Perform

GP 2.2 (AB 1) Plan the Process
Establish and maintain the plan for performing the measurement and analysis process.
Elaboration:
Typically, this plan for performing the measurement and analysis process is included in the project plan.


GP 2.3 (AB 2) Provide Resources
Provide adequate resources for performing the measurement and analysis process, developing the work products, and providing the services of the process.
Elaboration:
Measurement personnel may be employed full time or part time. A measurement group may or may not exist to support measurement activities across multiple projects.
Examples of other resources provided include the following tools:
· Statistical packages
· Packages that support data collection over networks


GP 2.4 (AB 3) Assign Responsibility
Assign responsibility and authority for performing the process, developing the work products, and providing the services of the measurement and analysis process.

GP 2.5 (AB 4) Train People
Train the people performing or supporting the measurement and analysis process as needed.
Elaboration:
Examples of training topics include the following:
· Statistical techniques
· Data collection, analysis, and reporting processes
· Development of goal-related measurements (e.g., Goal Question Metric)

Directing Implementation
GP 2.6 (DI 1) Manage Configurations
Place designated work products of the measurement and analysis process under appropriate levels of configuration management.

Elaboration:
Examples of work products placed under configuration management include the following:
· Specifications of base and derived measures
· Data collection and storage procedures
· Base and derived measurement data sets
· Analysis results and draft reports
· Data analysis tools


GP 2.7 (DI 2) Identify and Involve Relevant Stakeholders
Identify and involve the relevant stakeholders of the measurement and analysis process as planned.
Elaboration:
Examples of activities for stakeholder involvement include the following:
· Establishing measurement objectives and procedures
· Assessing measurement data
· Providing meaningful feedback to those responsible for providing the raw data on which the analysis and results depend


GP 2.8 (DI 3) Monitor and Control the Process
Monitor and control the measurement and analysis process against the plan for performing the process and take appropriate corrective action.
Elaboration:
Examples of measures used in monitoring and controlling include the following:
· Percentage of projects using progress and performance measures
· Percentage of measurement objectives addressed

Verifying Implementation
GP 2.9 (VE 1) Objectively Evaluate Adherence
Objectively evaluate adherence of the measurement and analysis process against its process description, standards, and procedures, and address noncompliance.

Elaboration:
Examples of activities reviewed include the following:
· Aligning measurement and analysis activities
· Providing measurement results
Examples of work products reviewed include the following:
· Specifications of base and derived measures
· Data collection and storage procedures
· Analysis results and draft reports


GP 2.10 (VE 2) Review Status with Higher Level Management
Review the activities, status, and results of the measurement and analysis process with higher level management and resolve issues.
(The following goal is not required and its practices are not expected for a maturity level 2 rating,
but are required for a maturity level 3 rating and above.)


GG 3 Institutionalize a Defined Process
The process is institutionalized as a defined process.
GP 3.1 Establish a Defined Process
Establish and maintain the description of a defined measurement and analysis process.


GP 3.2 Collect Improvement Information
Collect work products, measures, measurement results, and improvement information derived from planning and performing the measurement and analysis process to support the future use and improvement of the organization’s processes and process assets.

Monday, June 11, 2007

Measurement and Analysis: Specific Practices by Goal SG2

SG 2 Provide Measurement Results
Measurement results that address identified information needs and objectives are provided.
The primary reason for doing measurement and analysis is to address identified information needs and objectives. Measurement results based on objective evidence can help to monitor performance, fulfill contractual obligations, make informed management and technical decisions, and enable corrective actions to be taken.

SP 2.1 Collect Measurement Data
Obtain specified measurement data.
The data necessary for analysis are obtained and checked for completeness and integrity.
Typical Work Products
1. Base and derived measurement data sets
2. Results of data integrity tests
Subpractices
1. Obtain the data for base measures.
Data are collected as necessary for previously used as well as for newly specified base measures. Existing data are gathered from project records or from elsewhere in the organization. Note that data that were collected earlier may no longer be available for reuse in existing databases, paper records, or formal repositories.
2. Generate the data for derived measures.
Values are newly calculated for all derived measures.

3. Perform data integrity checks as close to the source of the data as possible.
All measurements are subject to error in specifying or recording data. It is always better to identify such errors and to identify sources of missing data early in the measurement and analysis cycle. Checks can include scans for missing data, out-of-bounds data values, and unusual patterns and correlation across measures (a brief sketch of such checks follows this subpractice). It is particularly important to do the following:
· Test and correct for inconsistency of classifications made by human judgment (i.e., to determine how frequently people make differing classification decisions based on the same information, otherwise known as "inter-coder reliability").
· Empirically examine the relationships among the measures that are used to calculate additional derived measures. Doing so can ensure that important distinctions are not overlooked and that the derived measures convey their intended meanings (otherwise known as "criterion validity").
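
The sketch below (Python; the record layout, field names, and sample data are assumptions for illustration) shows the kind of near-the-source checks described above: scans for missing and out-of-bounds values, plus a crude inter-coder agreement figure:

    def integrity_check(records, required_fields, bounds):
        """Flag missing fields and out-of-bounds values in raw measurement records.

        records:         list of dicts, one per measurement record
        required_fields: field names every record must contain
        bounds:          {field: (low, high)} acceptable value ranges
        """
        problems = []
        for i, record in enumerate(records):
            for f in required_fields:
                if record.get(f) is None:
                    problems.append((i, f, "missing"))
            for f, (low, high) in bounds.items():
                value = record.get(f)
                if value is not None and not (low <= value <= high):
                    problems.append((i, f, "out of bounds: %s" % value))
        return problems

    def percent_agreement(coder_a, coder_b):
        """Crude inter-coder reliability: the share of items two people classified identically."""
        matches = sum(1 for a, b in zip(coder_a, coder_b) if a == b)
        return matches / len(coder_a)

    # Example: two reviewers classifying the same four defects by severity.
    print(percent_agreement(["major", "minor", "minor", "major"],
                            ["major", "major", "minor", "major"]))   # 0.75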

SP 2.2 Analyze Measurement Data
Analyze and interpret measurement data.
The measurement data are analyzed as planned, additional analyses are conducted as necessary, results are reviewed with relevant stakeholders, and necessary revisions for future analyses are noted.
Typical Work Products
1. Analysis results and draft reports
Subpractices
1. Conduct initial analyses, interpret the results, and draw preliminary conclusions. The results of data analyses are rarely self evident. Criteria for interpreting the results and drawing conclusions should be stated explicitly.
2. Conduct additional measurement and analysis as necessary, and prepare results for presentation. The results of planned analyses may suggest (or require) additional, unanticipated analyses. In addition, they may identify needs to refine existing measures, to calculate additional derived measures, or even to collect data for additional primitive measures to properly complete the planned analysis. Similarly, preparing the initial results for presentation may identify the need for additional, unanticipated analyses.
3. Review the initial results with relevant stakeholders. It may be appropriate to review initial interpretations of the results and the way in which they are presented before disseminating and communicating them more widely. Reviewing the initial results before their release may prevent needless misunderstandings and lead to improvements in the data analysis and presentation. Relevant stakeholders with whom reviews may be conducted include intended end users and sponsors, as well as data analysts and data providers.

4. Refine criteria for future analyses. Valuable lessons that can improve future efforts are often learned from conducting data analyses and preparing results. Similarly, ways to improve measurement specifications and data collection procedures may become apparent, as may ideas for refining identified information needs and objectives.

SP 2.3 Store Data and Results
Manage and store measurement data, measurement specifications, and analysis results.
Storing measurement-related information enables the timely and cost-effective future use of historical data and results. The information also is needed to provide sufficient context for interpretation of the data, measurement criteria, and analysis results. Information stored typically includes the following:
· Measurement plans
· Specifications of measures
· Sets of data that have been collected
· Analysis reports and presentations
The stored information contains or references the information needed to understand and interpret the measures and assess them for reasonableness and applicability (e.g., measurement specifications used on different projects when comparing across projects).
Data sets for derived measures typically can be recalculated and need not be stored. However, it may be appropriate to store summaries based on derived measures (e.g., charts, tables of results, or report prose). Interim analysis results need not be stored separately if they can be efficiently reconstructed. Projects may choose to store project-specific data and results in a project-specific repository. When data are shared more widely across projects, the data may reside in the organization’s measurement repository.

Typical Work Products
1. Stored data inventory
Subpractices
1. Review the data to ensure their completeness, integrity, accuracy, and currency.
2. Make the stored contents available for use only by appropriate groups and personnel.
3. Prevent the stored information from being used inappropriately.
Examples of ways to prevent inappropriate use of the data and related information include controlling access to data and educating people on the appropriate use of data.
Examples of inappropriate use include the following:

· Disclosure of information that was provided in confidence
· Faulty interpretations based on incomplete, out-of-context, or otherwise misleading information
· Measures used to improperly evaluate the performance of people or to rank projects
· Impugning the integrity of specific individuals

SP 2.4 Communicate Results
Report results of measurement and analysis activities to all relevant stakeholders.
The results of the measurement and analysis process are communicated to relevant stakeholders in a timely and usable fashion to support decision making and assist in taking corrective action. Relevant stakeholders include intended users, sponsors, data analysts, and data providers.
Typical Work Products
1. Delivered reports and related analysis results
2. Contextual information or guidance to aid in the interpretation of analysis results
Subpractices
1. Keep relevant stakeholders apprised of measurement results on a timely basis. Measurement results are communicated in time to be used for their intended purposes. Reports are unlikely to be used if they are distributed with little effort to follow up with those who need to know the results. To the extent possible and as part of the normal way they do business, users of measurement results are kept personally involved in setting objectives and deciding on plans of action for measurement and analysis. The users are regularly kept apprised of progress and interim results.
2. Assist relevant stakeholders in understanding the results. Results are reported in a clear and concise manner appropriate to the methodological sophistication of the relevant stakeholders. They are understandable, easily interpretable, and clearly tied to identified information needs and objectives. The data are often not self evident to practitioners who are not measurement experts. Measurement choices should be explicitly clear about the following:
· How and why the base and derived measures were specified
· How the data were obtained
· How to interpret the results based on the data analysis methods that were used
· How the results address their information needs
Examples of actions to assist in understanding of results include the following:
· Discussing the results with the relevant stakeholders
· Providing a transmittal memo that provides background and explanation
· Briefing users on the results
· Providing training on the appropriate use and understanding of measurement results

Measurement and Analysis: Specific Practices by Goal SG1

SG 1 Align Measurement and Analysis Activities
Measurement objectives and activities are aligned with identified information needs and objectives.
The specific practices covered under this specific goal may be addressed concurrently or in any order:
· When establishing measurement objectives, experts often think ahead about necessary criteria for specifying measures and analysis procedures. They also think concurrently about the constraints imposed by data collection and storage procedures.
· It often is important to specify the essential analyses that will be conducted before attending to details of measurement specification, data collection, or storage.


SP 1.1 Establish Measurement Objectives
Establish and maintain measurement objectives that are derived from identified information needs and objectives.
Measurement objectives document the purposes for which measurement and analysis are done, and specify the kinds of actions that may be taken based on the results of data analyses.
The sources for measurement objectives may be management, technical, project, product, or process implementation needs.
The measurement objectives may be constrained by existing processes, available resources, or other measurement considerations.
Judgments may need to be made about whether the value of the results will be commensurate with the resources devoted to doing the work.
Modifications to identified information needs and objectives may, in turn, be indicated as a consequence of the process and results of measurement and analysis. Sources of information needs and objectives may include the following:
· Project plans
· Monitoring of project performance
· Interviews with managers and others who have information needs

· Established management objectives
· Strategic plans
· Business plans
· Formal requirements or contractual obligations
· Recurring or other troublesome management or technical problems
· Experiences of other projects or organizational entities
· External industry benchmarks
· Process-improvement plans


Typical Work Products
1. Measurement objectives
Subpractices
1. Document information needs and objectives.
Information needs and objectives are documented to allow traceability to subsequent measurement and analysis activities.
2. Prioritize information needs and objectives.
It may be neither possible nor desirable to subject all initially identified information needs to measurement and analysis. Priorities may also need to be set within the limits of available resources.
3. Document, review, and update measurement objectives.
It is important to carefully consider the purposes and intended uses of measurement and analysis.

The measurement objectives are documented, reviewed by management and other relevant stakeholders, and updated as necessary. Doing so enables traceability to subsequent measurement and analysis activities, and helps ensure that the analyses will properly address identified information needs and
objectives.
It is important that users of measurement and analysis results be involved in setting measurement objectives and deciding on plans of action. It may also be appropriate to involve those who provide the measurement data.
4. Provide feedback for refining and clarifying information needs and objectives as necessary.
Identified information needs and objectives may need to be refined and clarified as a result of setting measurement objectives. Initial descriptions of information needs may be unclear or ambiguous. Conflicts may arise between existing needs and objectives. Precise targets on an already existing measure may be
unrealistic.

5. Maintain traceability of the measurement objectives to the identified information needs and objectives.
There must always be a good answer to the question, "Why are we measuring this?"
Of course, the measurement objectives may also change to reflect evolving information needs and objectives.
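A simple way to keep that answer documented is a traceability mapping from each measurement objective back to the information needs it serves. The sketch below (Python; the objective and need names are invented examples, not prescribed by the model) is one minimal form such a record could take:

    # Each measurement objective lists the information needs and objectives it is derived from.
    measurement_objectives = {
        "Improve schedule estimation accuracy": [
            "Monitor project performance against the project plan",
            "Meet contractual delivery commitments",
        ],
        "Track defect trends by severity": [
            "Recurring quality problems reported by the customer",
        ],
    }

    def why_are_we_measuring(objective):
        """Answer 'Why are we measuring this?' for a given measurement objective."""
        return measurement_objectives.get(objective, [])

    print(why_are_we_measuring("Track defect trends by severity"))
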
SP 1.2 Specify Measures
Specify measures to address the measurement objectives.
Measurement objectives are refined into precise, quantifiable measures.
Measures may be either "base" or "derived." Data for base measures are obtained by direct measurement. Data for derived measures come
from other data, typically by combining two or more base measures.
Examples of commonly used base measures include the following:
· Estimates and actual measures of work product size (e.g., number of pages)
· Estimates and actual measures of effort and cost (e.g., number of person hours)
· Quality measures (e.g., number of defects, number of defects by severity)


Examples of commonly used derived measures include the following:
· Earned Value
· Schedule Performance Index
· Defect density
· Peer review coverage
· Test or verification coverage
· Reliability measures (e.g., mean time to failure)
· Quality measures (e.g., number of defects by severity/total number of defects)


Derived measures typically are expressed as ratios, composite indices, or other aggregate summary measures. They are often more quantitatively reliable and meaningfully interpretable than the base measures used to generate them.
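
For example, the derived measures listed above are straightforward combinations of base measures. A small illustrative calculation (Python; all numbers are invented) might look like this:

    # Base measures (obtained by direct measurement); values invented for illustration.
    defects_found = 42          # total defects
    severe_defects = 7          # defects of the highest severity
    size_kloc = 12.5            # work product size, thousands of lines of code
    earned_value = 90000.0      # budgeted cost of work performed
    planned_value = 100000.0    # budgeted cost of work scheduled

    # Derived measures (computed from two or more base measures).
    defect_density = defects_found / size_kloc                  # defects per KLOC
    schedule_performance_index = earned_value / planned_value   # SPI = EV / PV
    severity_ratio = severe_defects / defects_found             # defects by severity / total defects

    print("Defect density: %.2f defects/KLOC" % defect_density)
    print("SPI: %.2f" % schedule_performance_index)
    print("Severe-defect ratio: %.1f%%" % (100 * severity_ratio))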
Typical Work Products
1. Specifications of base and derived measures

Subpractices
1. Identify candidate measures based on documented measurement objectives.
The measurement objectives are refined into specific measures. The identified candidate measures are categorized and specified by name and unit of measure.
2. Identify existing measures that already address the measurement objectives.
Specifications for measures may already exist, perhaps established for other purposes earlier or elsewhere in the organization.
3. Specify operational definitions for the measures.
Operational definitions are stated in precise and unambiguous terms. They address two important criteria as follows:
· Communication: What has been measured, how was it measured, what are the units of measure, and what has been included or excluded?
· Repeatability: Can the measurement be repeated, given the same definition, to get the same results?
4. Prioritize, review, and update measures.
Proposed specifications of the measures are reviewed for their appropriateness with potential end users and other relevant stakeholders. Priorities are set or changed, and specifications of the measures are updated as necessary.


SP 1.3 Specify Data Collection and Storage Procedures
Specify how measurement data will be obtained and stored.
Explicit specification of collection methods helps ensure that the right data are collected properly. It may also aid in further clarifying information needs and measurement objectives.
Proper attention to storage and retrieval procedures helps ensure that data are available and accessible for future use.
Typical Work Products
1. Data collection and storage procedures
2. Data collection tools

Subpractices
1. Identify existing sources of data that are generated from current work products, processes, or transactions.
Existing sources of data may already have been identified when specifying the measures. Appropriate collection mechanisms may exist whether or not pertinent data have already been collected.
2. Identify measures for which data are needed, but are not currently available.
3. Specify how to collect and store the data for each required measure.
Explicit specifications are made of how, where, and when the data will be collected. Procedures for collecting valid data are specified. The data are stored in an accessible manner for analysis, and it is determined whether they will be saved for possible reanalysis or documentation purposes. Questions to be considered typically include the following:
· Have the frequency of collection and the points in the process where measurements will be made been determined?
· Has the time line that is required to move measurement results from the points of collection to repositories, other databases, or end users been established?
· Who is responsible for obtaining the data?
· Who is responsible for data storage, retrieval, and security?
· Have necessary supporting tools been developed or acquired?
4. Create data collection mechanisms and process guidance.

Data collection and storage mechanisms are well integrated with other normal work processes. Data collection mechanisms may include manual or automated forms and templates. Clear, concise guidance on correct procedures is available to those responsible for doing the work. Training is provided as necessary to
clarify the processes necessary for collection of complete and accurate data and to minimize the burden on those who must provide and record the data.
5. Support automatic collection of the data where appropriate and feasible.
Automated support can aid in collecting more complete and accurate data.

Examples of such automated support include the following:
· Timestamped activity logs
· Static or dynamic analyses of artifacts
However, some data cannot be collected without human intervention (e.g., customer satisfaction or other human judgments), and setting up the necessary infrastructure for other automation may be costly. (A brief sketch of one such automated collection mechanism appears after these subpractices.)
6. Prioritize, review, and update data collection and storage procedures.
Proposed procedures are reviewed for their appropriateness and feasibility with those who are responsible for providing, collecting, and storing the data. They also may have useful insights about how to improve existing processes, or be able to suggest other useful measures or analyses.
7. Update measures and measurement objectives as necessary.

Priorities may need to be reset based on the following:

· The importance of the measures
· The amount of effort required to obtain the data
Considerations include whether new forms, tools, or training would be required to obtain the data.
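
To illustrate the "timestamped activity logs" example from subpractice 5, here is a minimal sketch (Python; the file name and field names are assumptions) of an automated collection mechanism that appends timestamped effort records to a simple CSV store:

    import csv
    from datetime import datetime, timezone
    from pathlib import Path

    LOG_FILE = Path("activity_log.csv")   # hypothetical location of the collected data

    def record_activity(activity, effort_hours):
        """Append one timestamped activity record; the CSV file is the collection mechanism."""
        is_new = not LOG_FILE.exists()
        with LOG_FILE.open("a", newline="") as f:
            writer = csv.writer(f)
            if is_new:
                writer.writerow(["timestamp_utc", "activity", "effort_hours"])
            writer.writerow([datetime.now(timezone.utc).isoformat(), activity, effort_hours])

    record_activity("peer review", 1.5)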


SP 1.4 Specify Analysis Procedures
Specify how measurement data will be analyzed and reported.

Specifying the analysis procedures in advance ensures that appropriate analyses will be conducted and reported to address the documented measurement objectives (and thereby the information needs and
objectives on which they are based). This approach also provides a check that the necessary data will in fact be collected.
Typical Work Products
1. Analysis specification and procedures
2. Data analysis tools
Subpractices
1. Specify and prioritize the analyses that will be conducted and the reports that will be prepared.
Early attention should be paid to the analyses that will be conducted and to the manner in which the results will be reported. These should meet the following criteria:
· The analyses explicitly address the documented measurement objectives
· Presentation of the results is clearly understandable by the audiences to whom the results are addressed
Priorities may have to be set within available resources.
2. Select appropriate data analysis methods and tools.

Issues to be considered typically include the following:
· Choice of visual display and other presentation techniques (e.g., pie charts, bar charts, histograms, radar charts, line graphs, scatter plots, or tables)
· Choice of appropriate descriptive statistics (e.g., arithmetic mean, median, or mode)
· Decisions about statistical sampling criteria when it is impossible or unnecessary
to examine every data element
· Decisions about how to handle analysis in the presence of missing data elements

· Selection of appropriate analysis tools
Criteria for evaluating the utility of the analysis results might include the extent to which the following apply:
· The work does not cost more to perform than is justified by the benefits that it provides.

Criteria for evaluating the conduct of the measurement and analysis might include the extent to which the following apply (a brief sketch of a basic check of this kind follows the list):
· The amount of missing data or the number of flagged inconsistencies is beyond specified thresholds.
· There is selection bias in sampling (e.g., only satisfied end users are surveyed to evaluate end-user satisfaction, or only unsuccessful projects are evaluated to determine overall productivity).
· The measurement data are repeatable (e.g., statistically reliable).
· Statistical assumptions have been satisfied (e.g., about the distribution of data or about appropriate measurement scales).
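
As a minimal illustration of these checks (Python; the threshold and sample data are invented), the sketch below computes basic descriptive statistics and flags when the amount of missing data exceeds a specified threshold:

    from statistics import mean, median

    def summarize(values, missing_threshold=0.05):
        """Descriptive statistics plus a missing-data check against a stated threshold.

        values: raw data points, with None marking a missing element.
        """
        present = [v for v in values if v is not None]
        missing_fraction = 1 - len(present) / len(values)
        return {
            "n": len(present),
            "mean": mean(present),
            "median": median(present),
            "missing_fraction": missing_fraction,
            "missing_exceeds_threshold": missing_fraction > missing_threshold,
        }

    # Example: effort data (hours) with one missing element out of five (20% missing).
    print(summarize([4.0, 5.5, None, 6.0, 4.5], missing_threshold=0.1))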

Sunday, June 3, 2007

Measurement And Analysis: CMMI Maturity Level 2

Purpose
The purpose of Measurement and Analysis is to develop and sustain a measurement capability that is used to support management information needs.


Introductory Notes
The Measurement and Analysis process area involves the following:
· Specifying the objectives of measurement and analysis such that they are aligned with identified information needs and objectives
· Specifying the measures, data collection and storage mechanisms, analysis techniques, and reporting and feedback mechanisms
· Implementing the collection, storage, analysis, and reporting of the data
· Providing objective results that can be used in making informed decisions, and taking appropriate corrective actions
The integration of measurement and analysis activities into the processes of the project supports the following:
· Objective planning and estimating
· Tracking actual performance against established plans and objectives
· Identifying and resolving process-related issues
· Providing a basis for incorporating measurement into additional processes in the future


The staff required to implement a measurement capability may or may not be employed in a separate organization-wide program. Measurement capability may be integrated into individual projects or other organizational functions (e.g., Quality Assurance).
The initial focus for measurement activities is at the project level. However, a measurement capability may prove useful for addressing organization- and/or enterprise-wide information needs.


Projects may choose to store project-specific data and results in a project-specific repository. When data are shared more widely across projects, the data may reside in the organization’s measurement repository.

For Supplier Sourcing
Measurement and analysis of the product components provided by suppliers is essential for effective management of the quality and costs of the project. It may be possible, with careful management of supplier agreements, to provide insight into the data that support supplier-performance analysis.


Specific and Generic Goals
SG 1 Align Measurement and Analysis Activities

Measurement objectives and activities are aligned with identified information needs and objectives.

SG 2 Provide Measurement Results
Measurement results that address identified information needs and objectives are provided.

GG 2 Institutionalize a Managed Process
The process is institutionalized as a managed process.
(The following goal is not required for maturity level 2, but required for maturity level 3 and
above.)
GG 3 Institutionalize a Defined Process
The process is institutionalized as a defined process.


Practice-to-Goal Relationship Table
SG 1 Align Measurement and Analysis Activities

SP 1.1 Establish Measurement Objectives
SP 1.2 Specify Measures
SP 1.3 Specify Data Collection and Storage Procedures
SP 1.4 Specify Analysis Procedures


SG 2 Provide Measurement Results
SP 2.1 Collect Measurement Data
SP 2.2 Analyze Measurement Data
SP 2.3 Store Data and Results
SP 2.4 Communicate Results


GG 2 Institutionalize a Managed Process
GP 2.1 (CO 1) Establish an Organizational Policy
GP 2.2 (AB 1) Plan the Process
GP 2.3 (AB 2) Provide Resources
GP 2.4 (AB 3) Assign Responsibility
GP 2.5 (AB 4) Train People
GP 2.6 (DI 1) Manage Configurations
GP 2.7 (DI 2) Identify and Involve Relevant Stakeholders
GP 2.8 (DI 3) Monitor and Control the Process
GP 2.9 (VE 1) Objectively Evaluate Adherence
GP 2.10 (VE 2) Review Status with Higher Level Management

(The following goal is not required and its practices are not expected for a maturity level 2 rating,
but are required and expected for a maturity level 3 rating and above.)
GG 3 Institutionalize a Defined Process
GP 3.1 Establish a Defined Process
GP 3.2 Collect Improvement Information