
IMPLEMENTING MEASUREMENT AND ANALYSIS




2007, Vol. 14, No. 1
By Neil Potter and Mary Sakry

Introduction

The Measurement and Analysis (MA) Process Area in the Capability Maturity Model Integration (CMMI)* provides a solid infrastructure for implementing a measurement program. In this article we describe the practices of MA and give some examples of practice implementation.

Purpose

From the CMMI text, the purpose of Measurement and Analysis is to develop and sustain a measurement capability that is used to support management information needs. The Specific Goals (SGs) and Specific Practices (SPs) in Figure 1 describe these practices.

SG 1: Measurement objectives and activities are aligned with identified information needs and objectives.
  SP 1.1: Establish and maintain measurement objectives that are derived from identified information needs and objectives.
  SP 1.2: Specify measures to address the measurement objectives.
  SP 1.3: Specify how measurement data will be obtained and stored.
  SP 1.4: Specify how measurement data will be analyzed and reported.
SG 2: Measurement results, which address identified information needs and objectives, are provided.
  SP 2.1: Obtain specified measurement data.
  SP 2.2: Analyze and interpret measurement data.
  SP 2.3: Manage and store measurement data, measurement specifications, and analysis results.
  SP 2.4: Report results of measurement and analysis activities to all relevant stakeholders.

Figure 1: MA Specific Goals and Practices

When MA is implemented, objectives and measures are established.

Measurement results are then used to determine progress towards these objectives. Typical symptoms that occur when MA is not performed well include: few clear objectives; numerous measures defined but not used; or objectives defined by rumor. Next is a brief explanation of the Specific Practices.

SP 1.1: Establish and maintain measurement objectives that are derived from identified information needs and objectives.

Objectives may cover a broad range of issues such as budget, deadline, quality and product performance. A few items that are considered important for the business should be selected. Objectives can be either qualitative or numeric. Usually, objectives start out as ambiguous or qualitative phrases (e.g., "Improve quality") and over time are refined into numeric targets (e.g., "Improve quality from 10 major defects per release to no more than 5 major defects per release").

The CMMI text provides examples such as:
- Reduce time to delivery
- Reduce total lifecycle cost
- Deliver specified functionality completely
- Improve prior levels of quality
- Improve prior customer satisfaction ratings
- Maintain and improve the acquirer/supplier relationships

SP 1.2: Specify measures to address the measurement objectives.

Specifying measures means that the measures are written down and made crystal clear.
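As an illustration (ours, not from the article), a written measure definition can be captured as a small structured record. Every field name and value in the Python sketch below is hypothetical; the point is only the level of precision a written definition should reach.

# Hypothetical example of a written, unambiguous measure definition.
# None of these values come from the article; they are illustrative only.
quality_measure = {
    "name": "Major defects per release",
    "definition": ("Count of defects classified severity 1 or 2 that are "
                   "still open at the release milestone"),
    "unit": "defects per release",
    "collected_by": "project manager",
    "collection_frequency": "at each release milestone",
    "target": "no more than 5 major defects per release",
}
print(quality_measure["definition"])

With one shared written definition of this kind, the simple test described next has a single answer for everyone to point to.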

You can determine how well this practice is implemented by performing a simple test. Ask five colleagues to define quality. If they provide multiple, conflicting or ambiguous responses, you know that this measure needs to be clarified. When the practice is performed correctly, all five people will refer to the same written definition.

SP 1.3: Specify how measurement data will be obtained and stored.

This practice simply asks for a definition of how and where data are collected. An example would be: "Every Monday, each project manager collects actual effort hours expended on project tasks and stores them in ..." The frequency, method of collection and storage location do not necessarily have to be the same for each measure.

SP 1.4: Specify how measurement data will be analyzed and reported.

This practice helps clarify what to look for in the data. For example, should the measurement results lie within a range?

Should the trend be up, down or flat? Is there a threshold that, when exceeded, triggers further investigation or corrective actions? Figure 2 provides some examples of objectives and measures. They are defined in table form to keep the measurement documentation concise. This table could be embedded in a project plan for project-level measures or created as a separate document for organizational measures. The analysis in the example is rudimentary and will likely become more advanced over time.

Process for MA

The process to implement MA will vary. If there are many organizations or projects participating in the measurement program and numerous objectives to clarify, the process might be larger and more complicated than if you are applying MA to one small group of 15 people. In the beginning, the process might be as simple as completing the table in Figure 2 with the appropriate stakeholders. For more complex organizations, there might be alignment steps with other groups, more advanced analysis and more comprehensive reporting.
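Before the integration example, here is a minimal sketch (ours, not the article's) of the kind of range, threshold and trend checks that SP 1.4 asks teams to write down. The measure, limits and sample values are illustrative assumptions.

# Hypothetical analysis rules for one measure: a range check, a threshold
# trigger and a trend direction, as SP 1.4 suggests documenting.
def analyze(weekly_values, low, high, trigger):
    """Return findings for one measure's weekly data points."""
    findings = []
    latest = weekly_values[-1]
    if not (low <= latest <= high):
        findings.append(f"latest value {latest} is outside the expected range [{low}, {high}]")
    if latest > trigger:
        findings.append(f"latest value {latest} exceeds the threshold of {trigger}; investigate")
    if len(weekly_values) >= 2:
        if weekly_values[-1] > weekly_values[0]:
            direction = "up"
        elif weekly_values[-1] < weekly_values[0]:
            direction = "down"
        else:
            direction = "flat"
        findings.append(f"trend over the period is {direction}")
    return findings

# Example: days late against key commitments, collected weekly (invented data).
print(analyze([0, 2, 4, 7], low=0, high=5, trigger=5))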

Example metric for product integration

The last entry in Figure 2 shows an example from one of our clients that measures the number of issues found when developers release components to the integration team. In this example, the organization instituted a simple measure into the day-to-day flow of the project to detect problems being passed from the developers to the integrators. The measure allows problems to be detected early and acts as an ongoing feedback mechanism for the developers.

Figure 2: Examples of objectives and measures

Objective: Meet all commitments
  Stakeholder(s): Jennifer, Jim, Mary
  Metric(s): # Days early or late
  Data Storage and Collection: Collected every ... in ...
  Analysis: If > 5 days late: report to program manager; examine critical path; adjust resources? delegate work? simplify solution? slip deadline?
  Reporting: Weekly staff meeting; monthly program review

Objective: 100% known defects resolved
  Stakeholder(s): Jennifer, Jim, Mary, Kurt
  Metric(s): # Major defects with open status
  Data Storage and Collection: Collected weekly via script run on bugtrack system database. Defect counts are recorded in ...
  Analysis: 1. Open rate (new defects opened per week) <= close rate (defects closed per week); 2. # Major defects open = 0
  Reporting: Weekly staff meeting; monthly program review

Objective: Reduce problems entering integration by 10%
  Stakeholder(s): Jennifer, Jim, Mary, Kurt
  Metric(s): Product-component handoff scorecard rating (defined in scorecard spreadsheet)
  Data Storage and Collection: For each product version, store in: <product name> ...
  Analysis: Score should be > 80/100 for all handoffs
  Reporting: Monthly program review

Figure 3: Scorecard Criteria (criterion, rule used to generate score, coefficient)

- Component must build (Comp-Build): if the build breaks: 0, otherwise 100. Coefficient: 2
- Component must not break the release (Rel-Build): if one component breaks the release: 0 for all related components, otherwise 100. Coefficient: 2
- Component handoff must meet deadline (On-Time): if a handoff arrives after the deadline: 0, otherwise 100. Coefficient: 3
- Component must not introduce failures in sanity checks (Sanity Tests): if one sanity check fails, as described in the Test Plan: 0, otherwise 100. Coefficient: 4
- Component handoff document content must be complete, as per current template (Handoff Doc): if some information is missing or inaccurate: 0, otherwise 100. Coefficient: 1
- Component label is correct - correct versions, correct elements (Label): if the label is applied to the wrong elements, or is missing elements, or is the wrong version: 0, otherwise 100. Coefficient: 2
- Component changes correspond to the release plan (In Plan): if the list of functions identified in the change request does not match the list of functions planned for this release in the release plan: 0; if the list matches: 100. Coefficient: 2

During development, software components and files are completed and handed to the integrators. The integrators check each component against the criteria listed in Figure 3 and derive a score. The scores are then charted and communicated back to the developers (see Figure 4 and the corresponding graph in Figure 5).
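The article does not spell out how the per-criterion scores and coefficients are combined into the 0-100 handoff rating; one plausible reading is a coefficient-weighted average of the pass/fail scores. The sketch below implements that assumption. The criteria names and weights mirror Figure 3, but the combining rule and the sample handoff data are our assumptions, not the client's actual spreadsheet.

# Hypothetical sketch of the handoff scorecard in Figure 3.
# Assumption: each criterion scores 0 (fail) or 100 (pass), and the overall
# rating is the coefficient-weighted average, compared against the 80/100
# target from Figure 2. The combining rule is not stated in the article.
CRITERIA = {          # criterion name: coefficient (weights from Figure 3)
    "Comp-Build": 2,
    "Rel-Build": 2,
    "On-Time": 3,
    "Sanity Tests": 4,
    "Handoff Doc": 1,
    "Label": 2,
    "In Plan": 2,
}

def handoff_score(results):
    """results maps each criterion to True (pass) or False (fail)."""
    total_weight = sum(CRITERIA.values())
    weighted = sum(coeff * (100 if results[name] else 0)
                   for name, coeff in CRITERIA.items())
    return weighted / total_weight

# Illustrative component handoff: everything passes except one sanity check.
example = {name: True for name in CRITERIA}
example["Sanity Tests"] = False
score = handoff_score(example)
print(f"score = {score:.0f}/100, meets the 80/100 target: {score > 80}")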

The well-defined criteria help the developers avoid repeating many of the previously experienced problems. The developers also know that the average scores over time are reported to the senior managers. This provides an extra incentive to deliver quality components. The criteria are well maintained to avoid the integration check becoming academic. If the criteria are incorrect or add no value to the project, a lessons-learned session is conducted and the criteria are refined.

Example Scorecard

Examples of three components being scored in Release R1 are shown in Figure 4. The objective "Reduce problems entering integration by 10%" was made very visible by the Scorecard measure. The measure provided an effective way to monitor the quality of components entering the integration phase of a project. The criteria and corresponding scores helped pinpoint corrective actions.

The Measurement and Analysis PA provides guidance on establishing a working measurement program.

It emphasizes the need for clear objectives and defined measures. Measurement results are then collected and used to assess progress towards objectives. The best implementations of MA focus on what is most important to a business. Time is spent refining meaningful objectives and measures that provide timely insight so that corrective actions can be taken.

*The full CMMI source is at: ...

Figure 4: Score examples for 3 components
Figure 5: The scores of 6 components over 16 releases

SCAMPI HIGH MATURITY LEAD APPRAISER CERTIFICATION

The Software Engineering Institute has awarded the SCAMPI High Maturity Lead Appraiser certification to Neil Potter. This certification recognizes Neil's expertise for determining that an organization has demonstrated the capability to quantitatively manage its projects to produce high-quality, predictable results at CMMI Maturity Levels 4 and 5. Benefits of this certification include recognition from the SEI as a member of the select group of individuals who are setting the standards for the lead appraiser community.

Understand customer needs. Clarify product requirements. Our two-day workshop, IN SEARCH OF EXCELLENT REQUIREMENTS, teaches software engineers, managers, requirements analysts and user representatives how to gather, document, analyze and manage customer requirements for software applications and projects effectively.

Meet project deadlines and reduce surprises. Our three-day workshop, PROJECT PLANNING AND MANAGEMENT, teaches project managers and their teams how to meet deadlines through better estimation, reduce surprises using risk management, schedule work for better optimization, understand and negotiate project trade-offs, and track project deadlines.

Scope and estimate the project. Our one-day workshop, PROJECT ESTIMATION (a subset of Project Planning and Management), helps teams develop more accurate schedules.

Reduce delays caused by needless product rework. Find defects early. Our two-day workshop, INSPECTION (PEER REVIEWS), teaches teams to efficiently find defects in code and documentation.

