Introducing and Sustaining Change

3 CMMI® Views
Southern California SPIN
4 June 2010
Rick Hefner
Director, Process Assurance
Northrop Grumman Corporation
Background
• Many published results show improved cost and schedule
performance from adopting CMMI®
• Despite these results, there is still community debate over the
value of CMMI®, and whether CMMI® ratings provide sufficient
guarantees of program performance.
• This program will explore three factors contributing to the
confusion:
– Inaccurate CMMI® ratings
– Over-estimating the benefits that CMMI® provides a customer
– Contractors not living up to their CMMI® rating
Agenda
Underlying CMMI® Principles
• CMMI® relationship to productivity, predictability and speed
Does CMMI Benefit the Customer?
How Projects Fail
How to Get Contractors to Live Up to Their CMMI Ratings
SM SCAMPI, SCAMPI Lead Appraiser, and SEI are service marks of Carnegie Mellon University.
® Capability Maturity Model Integration and CMMI® are registered in the U.S. Patent & Trademark Office.
Projects Have Historically Suffered from
Mistakes
People-Related Mistakes
1. Undermined motivation
2. Weak personnel
3. Uncontrolled problem employees
4. Heroics
5. Adding people to a late project
6. Noisy, crowded offices
7. Friction between developers and customers
8. Unrealistic expectations
9. Lack of effective project sponsorship
10. Lack of stakeholder buy-in
11. Lack of user input
12. Politics placed over substance
13. Wishful thinking
Process-Related Mistakes
14. Overly optimistic schedules
15. Insufficient risk management
16. Contractor failure
17. Insufficient planning
18. Abandonment of planning under pressure
19. Wasted time during the fuzzy front end
20. Shortchanged upstream activities
21. Inadequate design
22. Shortchanged quality assurance
23. Insufficient management controls
24. Premature or too frequent convergence
25. Omitting necessary tasks from estimates
26. Planning to catch up later
27. Code-like-hell programming
Reference: Steve McConnell, Rapid Development
Product-Related Mistakes
28. Requirements gold-plating
29. Feature creep
30. Developer gold-plating
31. Push-me, pull-me negotiation
32. Research-oriented development
Technology-Related Mistakes
33. Silver-bullet syndrome
34. Overestimated savings from new tools or methods
35. Switching tools in the middle of a project
36. Lack of automated source-code control
Standish Group survey of 13,000 projects (2003)
• 34% successes
• 15% failures
• 51% overruns
Many Approaches to Solving the Problem
• Which weaknesses are causing my problems?
• Which strengths may mitigate my problems?
• Which improvement investments offer the best return?
(Diagram: candidate improvement areas (People, Business Environment, Tools, Process, Management Structure, Product, Methods, Technology), with Process as only one of many possible solutions.)
Approaches to Process Improvement
Data-Driven (e.g., Six Sigma, Lean)
• Clarify what your customer wants (Voice of Customer)
– Critical to Quality (CTQs)
• Determine what your processes can do (Voice of Process)
– Statistical Process Control
• Identify and prioritize improvement opportunities
– Causal analysis of data
• Determine where your customers/competitors are going (Voice of Business)
– Design for Six Sigma
Model-Driven (e.g., CMM®, CMMI®)
• Determine the industry best practice
– Benchmarking, models
• Compare your current practices to the model
– Appraisal, education
• Identify and prioritize improvement opportunities
– Implementation
– Institutionalization
• Look for ways to optimize the processes
What Is the CMMI® Trying to Achieve?
A model is a simplified representation of the world. Capability Maturity
Models (CMM®s) contain the essential elements of effective processes for
one or more bodies of knowledge. These elements are based on the
concepts developed by Crosby, Deming, Juran, and Humphrey.
– Introduction, CMMI®
• CMMI® provides a model of industry best practices
• Following these practices has been shown to produce software and
systems faster, better, and cheaper, when properly applied
• The main benefits cited by CMMI® users are:
– More predictable adherence to budgets and schedules
– Reduced re-work (which can reduce cost and schedule)
– Reduced risk
How Do Mature Processes Help?
• Process maturity gets at one
source of the problem, e.g.,
– Are we using proven industry
practices?
– Does the staff have the
resources needed to execute
the process?
– Is the organization providing
effective project support?
• The main benefits typically
seen are:
– Improved predictability of
project budgets and schedules
– Improved management
awareness of problems
– Reduced re-work, which
improves predictability, cost,
and schedule
J. Herbsleb and D. Zubrow,
“Software Process Improvement:
An Analysis of Assessment Data
and Outcomes”
– 13 organizations
– ROI of 4:1 to 9:1
– Improved quality, error rates,
time to market, productivity
R. Dion, “Process Improvement
and the Corporate Balance
Sheet”
– ROI of 7.7:1: reduced rework, improved quality
– Two-fold increase in productivity
Agenda
Underlying CMMI® Principles
Does CMMI Benefit the Customer?
• Cost of implementing CMMI-compliant processes
• Timelines for impacting program performance
• Practical tips and techniques for realizing the benefits
How Projects Fail
How to Get Contractors to Live Up to Their CMMI Ratings
CMMI® Provides Several Related Benefits
Project Performance
Organizational Performance
Quality/Rework
Institutionalization
Rick Hefner, “Achieving the Promised Benefits of CMMI,” CMMI Technology Conference &
User Group, Denver, CO, 14-17 Nov 2005
Project Performance
CMMI
• Project performance problems
often arise because of
incomplete or unrealistic
planning
– Forgotten activities
– Unconscious decisions
– Overly-optimistic estimates
• When cost/schedule pressure
arises, people abandon the
plans, leading to more
problems
– Individual judgment versus
best use of resources
• Identifies the elements of good
planning
– Proven engineering processes
– Estimates based on historical
data, using these processes
• When cost/schedule pressure
arises, CMMI® practices track and
correct
– Reactive (L2)
– Proactive, risk management
(L3)
– Quantitative management (L4)
• QA, management ensures
processes/plans are followed
✓ Train project managers on how to use the tools (estimation, earned value, risk management)
✓ Project managers (not organizational staff) must be responsible for implementing the improved processes
✓ Demand realistic, data-driven estimates (see the sketch below)
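To make the last tip concrete, here is a minimal sketch of a data-driven estimate drawn from a measurement repository; the project sizes, hours, and productivity measure are hypothetical, not taken from this presentation.

```python
# Hypothetical sketch: a data-driven effort estimate built from an
# organization's measurement repository. All figures are invented.
past_projects = [
    {"size_ksloc": 40, "effort_hours": 52_000},
    {"size_ksloc": 75, "effort_hours": 99_000},
    {"size_ksloc": 22, "effort_hours": 30_500},
]

# Historical productivity (hours per KSLOC) from completed projects
rates = [p["effort_hours"] / p["size_ksloc"] for p in past_projects]
nominal_rate = sum(rates) / len(rates)
pessimistic_rate = max(rates)  # a bound useful for risk discussions

new_size_ksloc = 60            # estimated scope of the new project
print(f"Nominal estimate:      {new_size_ksloc * nominal_rate:,.0f} hours")
print(f"Conservative estimate: {new_size_ksloc * pessimistic_rate:,.0f} hours")
```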
Organizational Performance
CMMI
• Each project’s processes are
unique
– Personnel must re-learn
with each project
– Difficulty moving people
from project to project
– Historical data of little use
in estimation
• No way to compare project to project
– Which process was best?
– What did we learn?
• Standard organizational process,
tailored to fit each project
– Can be documented, trained,
supported by templates
– Over time, people learn the
process
• Common processes/measures
allow better use of historical data
– Calibrate cost estimation
models
– Project to project comparisons
– Over time, the organization can
optimize the process
✓ Develop organizational processes that fit the full range of your projects (small/large, all life cycles and project types)
✓ Capture and use historical data (measurement repository)
✓ Capture and share project documents (process asset library)
Rework/Quality
CMMI
• Focus on “faster and cheaper”
leads to skipping of essential
steps
– Key steps are not obvious and are often counterintuitive
• A disciplined engineering and
management process
– Do it right the first time
– CMMI identifies the essential
steps
• Fixing latent defects often
accounts for 30-40% of project
cost
– The cost of defects
(rework) is seldom
measured
• Peer reviews find defects early,
where it is cost effective to fix them
– Requirements, designs, code,
plans, etc.
– Often more efficient and
effective than testing
– Many types (Fagan inspections,
walkthroughs, desk checks, etc.)
✓ Focus on eliminating defects, not on faster and cheaper
✓ Measure the cost of finding and fixing defects (see the sketch below)
✓ Invest time in learning different methods of peer review and when each is effective
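As an aside on the second tip, a minimal sketch of how the cost of finding and fixing defects might be tallied; all hour figures below are invented for illustration, not data from the slides.

```python
# Illustrative sketch of measuring rework cost, so the "30-40% of project
# cost" claim can be checked against local data. Numbers are invented.
total_project_hours = 100_000

# Hours spent finding and fixing defects, by the activity that found them
rework_hours = {
    "peer reviews": 4_000,
    "unit test": 6_000,
    "integration": 9_000,
    "system test": 12_000,
    "post-delivery": 7_000,
}

total_rework = sum(rework_hours.values())
print(f"Rework: {total_rework:,} hours "
      f"({100 * total_rework / total_project_hours:.0f}% of project cost)")
for activity, hours in rework_hours.items():
    print(f"  {activity:>13}: {100 * hours / total_project_hours:.1f}%")
```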
Institutionalization
CMMI
• Some improvement efforts
focus on quick fixes
– Driven by yearly budget
cycles
– Expectation that results will
be immediate
• It is tempting to reduce
overhead to reduce cost
– Training
– Staff support to projects
– Use of outside process
experts
• Short-term investment for long-term gain
– Initial investment in the cost of
change, learning curve, new
overhead structures
– Long-term benefits in increased
productivity
• Organizational infrastructure exists
to support the policies and process
– Measurement repositories
✓ Expect 18-24 months before benefits begin to be realized
✓ Senior management must demand that everyone follow the new processes
✓ QA can be the organization's strongest tool – if they are focused!
Benefits
• The typical benefits are:
– Reduced cost
– Faster schedules
– Greater productivity
– Higher quality
– Increased customer satisfaction
• Over 40 published studies on the benefits of SW-CMM®
– DoD DACS website: http://www.thedacs.com/databases/roi/
• Similar results starting to be seen for CMMI®
– “Demonstrating the Impact and Benefits of CMMI: An Update and
Preliminary Results,” Software Engineering Institute, CMU/SEI-2003-SR-009, Oct 2003
– http://www.sei.cmu.edu/cmmi/results/results-by-category.html
Typical CMMI Benefits Cited in Literature
• Reduced Costs
– 33% decrease in the average cost
to fix a defect (Boeing)
– 20% reduction in unit software
costs (Lockheed Martin)
– Reduced cost of poor quality from
over 45 percent to under 30
percent over a three year period
(Siemens)
– 10% decrease in overall cost per
maturity level (Northrop Grumman)
• Faster Schedules
– 50% reduction in release
turnaround time (Boeing)
– 60% reduction in re-work following
test (Boeing)
– Increase from 50% to 95% the
number of milestones met (General
Motors)
• Greater Productivity
– 25-30% increase in productivity
within 3 years (Lockheed Martin,
Harris, Siemens)
• Higher Quality
– 50% reduction of software defects
(Lockheed Martin)
• Customer Satisfaction
– 55% increase in award fees
(Lockheed Martin)
Cost vs. Benefit
• Both theoretical models and industry data suggest that CMMI-compliant
projects achieve a cost reduction of 10% per level,
i.e., Level 3 is 20% cheaper than Level 1
– The key is reducing rework
• Knox Model – Theoretical Benefits
(Chart: Knox model of the total cost of software quality (prevention, appraisal, internal failure, external failure, TCoSQ) as a percent of development cost, declining across SEI CMM Levels 1 through 5. COCOMO predicts similar benefits based on current industry data.)
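A small worked example of the 10%-per-level rule of thumb quoted above; the Level 1 baseline cost is hypothetical, and both the simple reading (as phrased on the slide) and the compounded reading are shown.

```python
# Worked example of "10% cost reduction per maturity level".
# The baseline cost is invented for illustration.
baseline_cost = 10_000_000      # hypothetical Level 1 project cost ($)
reduction_per_level = 0.10

for level in range(1, 6):
    steps = level - 1
    simple = baseline_cost * (1 - reduction_per_level * steps)
    compounded = baseline_cost * (1 - reduction_per_level) ** steps
    print(f"Level {level}: ${simple:>12,.0f} (simple)   "
          f"${compounded:>12,.0f} (compounded)")
```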
When Good Organizations Go Bad
• Some organizations are driven to achieve a maturity
level only for its marketing value
– Improvement goals are not set realistically (“Level 5 in ’05”)
– Focus on passing the appraisal, not understanding and deciding among possible interpretations
– Only some of the projects participate in the improvement effort
– Practitioners/customers perceive CMMI as more expensive
– Only some of the projects get appraised; the remaining projects don’t implement
– Insufficient resources (e.g., training, QA, metrics, consultants)
– People don’t learn or become proficient in the new behaviors
– Management doesn’t enforce using processes on new programs
– Benefits are not realized because projects do not start up effectively
Rick Hefner, “CMMI Horror Stories: When Good Projects Go Bad,” Software Engineering Process Group Conference, 6-9 March 2006
What Does a CMMI Level Guarantee?
Decisions made on the basis of maturity level ratings are only valid if the
ratings are based on known criteria.
- SCAMPI A Method Description Document
• A CMMI appraisal indicates the organization’s capacity to
perform the next project, but cannot guarantee that each new
project will perform in that way
• The CMMI methodology assumes the organization will
propagate its processes to every new project
– An organization that gets appraised solely to demonstrate a maturity
level might not have that intent
– Organizations may not have developed the skills to roll out their
processes effectively
• A CMMI appraisal judges the maturity of the organization’s
processes – based upon the projects sampled
– New projects must embrace the new processes
How Does Level 4 & 5 Benefit the
Customer?
✓ Organizational process performance – More accurate estimates
✓ Quantitative project management – Problem behaviors are recognized faster, enabling quicker resolution
✓ Organizational innovation and deployment – The project benefits from improvements found and proven on other projects
✓ Causal analysis – The project fixes the source of defects to prevent future defects
Better Products and Services Produced Faster And Cheaper
Rick Hefner, “How Does High Maturity Benefit the Customer?,” Systems & Software Technology
Conference, 18-22 April 2005
The Project Manager’s Dilemma at Level 3
I want to use the organization’s
standard process, but…
… Does its performance and
quality meet my customer’s
expectations?
… If not, how should I tailor the
process?
Understanding the Process
Managing by Variation
• How many errors are typically found in reviewing an interface specification?
(Control chart: I chart for errors per review, showing the expected variation around the average (Mean = 4.799, UCL = 12.30, LCL = -2.705), plotted by observation number; points outside the limits trigger corrective and preventive actions.)
• Useful in evaluating future reviews
– Was the review effective?
– Was the process different?
– Is the product different?
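For readers who want to reproduce a chart like this, here is a minimal sketch of how individuals-chart (I chart) limits are conventionally computed, using the mean plus or minus 2.66 times the average moving range; the review error counts below are invented, not the data behind the slide.

```python
# Minimal I-chart sketch: centerline = mean, limits = mean +/- 2.66 * MR-bar.
# The error counts are invented for illustration.
errors_per_review = [3, 7, 2, 5, 9, 4, 1, 6, 5, 6]

mean = sum(errors_per_review) / len(errors_per_review)
moving_ranges = [abs(a - b)
                 for a, b in zip(errors_per_review, errors_per_review[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

ucl = mean + 2.66 * mr_bar
lcl = mean - 2.66 * mr_bar
print(f"Mean = {mean:.3f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")

# A future review outside these limits signals a special cause worth asking:
# was the review effective, was the process different, is the product different?
for i, errors in enumerate(errors_per_review, start=1):
    if not lcl <= errors <= ucl:
        print(f"Observation {i} ({errors} errors) is outside the expected variation")
```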
Typical Choices in Industry
• Most customers care about:
– Delivered defects
– Cost and schedule
• So organizations try to predict:
– Defects found throughout the lifecycle
– Effectiveness of peer reviews, testing
– Cost achieved/actual (Cost Performance Index – CPI)
– Schedule achieved/actual (Schedule Performance Index – SPI) (sketched below)
(Chart: Defect Detection Profile, defects/KSLOC by phase (Req'mts, Design, Code, Unit Test, Integrate, Sys Test, Del 90 Days) for All Projects versus the New Process.)
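The two indices named above follow the standard earned value definitions (CPI = EV/AC, SPI = EV/PV); the sketch below uses invented dollar figures purely for illustration.

```python
# Standard earned value indices; all figures are hypothetical.
planned_value = 1_200_000   # PV: budgeted cost of work scheduled to date
earned_value = 1_050_000    # EV: budgeted cost of work actually performed
actual_cost = 1_300_000     # AC: actual cost of the work performed

cpi = earned_value / actual_cost     # < 1.0 means over cost
spi = earned_value / planned_value   # < 1.0 means behind schedule

print(f"CPI = {cpi:.2f} (cost efficiency), SPI = {spi:.2f} (schedule efficiency)")
```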
What Can a Level 4 Organization Do?
• Determine whether processes are behaving consistently or
have stable trends (i.e., are predictable)
• Identify processes where the performance is within natural
bounds that are consistent across process implementation
teams
• Establish criteria for identifying whether a process or process
element should be statistically managed, and determine
pertinent measures and analytic techniques to be used in such
management
• Identify processes that show unusual (e.g., sporadic or
unpredictable) behavior
• Identify any aspects of the processes that can be improved in
the organization's set of standard processes
• Identify the implementation of a process which performs best
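As an illustration (not from the slides), a Level 4 organization might screen a subprocess for statistical manageability roughly as follows; the team names, yields, and thresholds are hypothetical.

```python
# Illustrative screen before placing a subprocess under statistical
# management: enough observations, and none outside the natural limits.
def is_stable(samples, min_points=20):
    """Return (stable, mean, ucl, lcl) using individuals-chart limits."""
    if len(samples) < min_points:
        return False, None, None, None
    mean = sum(samples) / len(samples)
    moving_ranges = [abs(a - b) for a, b in zip(samples, samples[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    ucl, lcl = mean + 2.66 * mr_bar, mean - 2.66 * mr_bar
    return all(lcl <= x <= ucl for x in samples), mean, ucl, lcl

# Hypothetical peer-review yields (defects found per hour) for two teams
team_a = [1.8, 2.1, 1.9, 2.4, 2.0, 1.7, 2.2, 2.3, 1.9, 2.0,
          2.1, 1.8, 2.2, 2.0, 1.9, 2.3, 2.1, 2.0, 1.8, 2.2]
team_b = [1.9, 2.0, 5.6, 2.1, 1.8, 2.2, 2.0, 1.9, 2.1, 2.3,
          2.0, 1.7, 2.2, 6.1, 1.9, 2.0, 2.1, 1.8, 2.2, 2.0]

for name, data in (("Team A", team_a), ("Team B", team_b)):
    stable, mean, ucl, lcl = is_stable(data)
    verdict = "behaves predictably" if stable else "shows unusual behavior"
    print(f"{name}: {verdict} (mean={mean:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f})")
```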
Lessons Learned
Based on over 20 Northrop Grumman CMMI Level 5 organizations
• Six Sigma is an enabler for higher maturity
– Focus on data, measurement systems, process improvement
– Tying improvements to business goals
– Tools and methods support the Level 4/5 analysis tasks
• Level 3 metrics, measurement processes, and goal setting are
generally inadequate for Levels 4 and 5
– Better definitions of the measures
– Lower-level metrics of lower-level subprocesses
• Having all the tools at Level 5 gives you the insight to manage
each project the way the customer needs it to be managed
Agenda
Underlying CMMI® Principles
Does CMMI Benefit the Customer?
How Projects Fail
• Start up problems
• Appraisal inaccuracies
How to Get Contractors to Live Up to Their CMMI Ratings
Where Could Problems Arise?
• The projects within the organization may not live up to the
capability
– Start-up problems with planning, subcontractors, and infrastructure
– Problems with staffing, either as the prime or with subcontractors
– Differences in domain experience
– Back-sliding
• The appraisal results may not be an accurate reflection of the
organization’s capability
– Sampling bias
– Appraisal inaccuracies
– Organization’s inability to immediately apply their appraised processes
• Note that all of these problems are equally possible with both
the staged and continuous representations
The First Three Months:
Essential Project Start-Up Activities
• Many process-related problems arise
in the first few months of a project
– New relationships are established
– Personnel changes and shortfalls
– Pressure to produce quickly
– Gaps between the planned processes and what was bid
• If a project is going to live up to the organization’s process
capability, it is essential to fully implement the processes from
the beginning
– Processes should be defined during the proposal, by tailoring the
organization’s standard process
– Estimates should be based on historical data from the organization’s
measurement repository
– Process assets (e.g., templates) should support detailed planning to
ensure consistency with the organization’s best practices
– Evidence reviews should be used to ensure CMMI compliance
Preventing Back-Sliding
• The CMMI generic practices
ensure that processes are
institutionalized – sustained
over time
• The approach for
implementing the generic
practices must reflect:
– Efficiency
– Effectiveness
– Applicability to ALL projects
• Frequent appraisals can be
used to assess the
effectiveness of the
institutionalization
Geoff Draper and Rick Hefner, “Applying CMMI
Generic Practices with Good Judgment,” SEPG
Conference, 2004.
Commitment to Perform (policies and sponsorship)
– GP 2.1 Establish Organizational Policy
Ability to Perform (project and/or organizational resources)
– GP 2.2 Plan the Process
– GP 2.3 Provide Resources
– GP 2.4 Assign Responsibility
– GP 2.5 Train People
– GP 3.1 Establish a Defined Process
Directing Implementation (managing performance of the process)
– GP 2.6 Manage Configurations
– GP 2.7 Identify/Involve Relevant Stakeholders
– GP 2.8 Monitor and Control the Process
– GP 3.2 Collect Improvement Info.
Verifying Implementation (management review, process conformance)
– GP 2.9 Objectively Evaluate Adherence
– GP 2.10 Review Status with Higher Level Management
Sampling Bias
The size and number of instantiations investigated should be selected to
form a valid sample of the organizational unit to which the results will be
attributed.
- SCAMPI A Method Description Document
• The Lead Appraiser is permitted to select sample projects as
“representative” of the organization as a whole
– Little guidance in the MDD
– Wide variation among Lead Appraisers
– Only some of the projects get appraised
– The remaining projects don’t implement
• If an organization is only interested in a good appraisal result, it will
appraise a large organization with a handful of samples, and/or exclude or hide
inferior projects
• This potential abuse exists with both staged and continuous representations
Organizational Sampling
• An organization with 50 projects at multiple sites may select 4-5 sample projects
• Are the appraisal results representative of the organization?
(Diagram: an organizational unit (a sector with Divisions A, B, and C across multiple sites) containing many projects, only a few of which are highlighted as sampled projects.)
Appraisal Inaccuracies
• Methodology
– SCAMPI A appraisals are the only approach that provides benchmark
quality appraisal results
– SCAMPI B, C, and other appraisal methods may be useful, but they are
not designed to provide the same accuracy
• Appraiser Skill
– There is wide variation in appraiser skill, experience and insight
– Although appraisal experience is a crucial contributor to accuracy, the
appraisal methods do little to ensure sufficient experience
– There is also wide variation in how the model is interpreted
• Appraiser Independence
– Appraiser independence is needed to ensure unbiased results
– It is difficult to establish a completely independent situation
Fiction or Non-Fiction: How to Read
Appraisal Results for Fun and Profit
The ADS is a summary statement describing the appraisal results that
includes the conditions and constraints under which the appraisal was
performed. It contains information considered essential to adequately
interpret the meaning of assigned maturity level or capability level ratings.
- SCAMPI A Method Description Document
• The Appraisal Disclosure Statement (ADS) provides keys to
assessing an appraisal’s accuracy
– Organizational unit appraised (the unit to which the ratings are
applicable and the domains examined)
– Appraisal team leader and appraisal team members and their
organizational affiliations
– Process areas rated and process areas not rated
– Dates of on-site activity
• Not included - sampling approach or percentage of projects
sampled
How to Write a Better RFP
Acquirers seeking to ensure that the proposed project will
implement mature practices should request the following:
• SCAMPI A Appraisal Disclosure Statement
– Organizational unit appraised
– Appraisal team leader affiliation
– Process areas rated and not rated
– Dates of on-site activity
• Explanation of sampling approach used in appraisal
• Approach to be used to ensure proper project start-up
• Data to demonstrate the speed with which new projects adopt
and execute the organization’s processes
• Approach to be used to prevent back-sliding
Agenda
Underlying CMMI® Principles
Does CMMI Benefit the Customer?
How Projects Fail
How to Get Contractors to Live Up to Their CMMI Ratings
• Contenders and Pretenders
Background
• There is a marked difference between organizations that truly
want to implement CMMI®, and those who simply want a
“certificate”
• Contenders invest time and energy on understanding the
industry best practices in the model, fitting them to their
projects and organization, and improving their effectiveness
and efficiency
• Pretenders simply do enough to convince an appraiser to give
them the maturity level -- along the way, they de-motivate
their staff with bureaucratic processes, disappoint their
customers with inconsistent performance, and generally give
the model a bad name
Where Could Problems Arise?
Assuming the contractor’s CMMI® rating is accurate,
and applicable to the team doing the work,
where could problems arise?
• Areas outside of the CMMI®
• Start-up problems
• Back-sliding
Areas Outside of the CMMI®
Process
People
– Domain knowledge
– Sufficient quantity
– Motivation
Technology
– Domain-specific
– Maturity
– Tools
Top Five System Engineering Issues
1. Lack of awareness of the importance, value, timing, accountability, and organizational structure of SE on programs
2. Adequate, qualified resources are generally not available within Government and industry for allocation on major programs
3. Insufficient SE tools and environments to effectively execute SE on programs
4. Requirements definition, development and management is not applied consistently and effectively
5. Poor initial program formulation
“Top Five Systems Engineering Issues In Defense Industry”, NDIA Systems Engineering Division Task Group Report, Jan 2003
Top Software Engineering Issues
1. The impact of requirements upon software is not consistently quantified
and managed in development or sustainment
2. Fundamental system engineering decisions are made without full
participation of software engineering.
3. Software life-cycle planning and management by acquirers and suppliers is
ineffective.
4. The quantity and quality of software engineering expertise is insufficient
to meet the demands of government and the defense industry.
5. Traditional software verification techniques are costly and ineffective for
dealing with the scale and complexity of modern systems.
6. There is a failure to assure correct, predictable, safe, secure execution of
complex software in distributed environments.
7. Inadequate attention is given to total lifecycle issues for COTS/NDI
impacts on lifecycle cost and risk.
“Top Software Engineering Issues In Defense Industry”,
NDIA Systems Engineering Division and Software
Committee, Sep 2006
Start-Up Issues
• Project Planning starts in the proposal phase, is refreshed at
contract start, and re-occurs throughout the project lifecycle
• Contenders extend their CMMI practices to proposal teams and
re-planning efforts
• Pretenders focus on contract start
– Costs and schedules defined at proposals may be immature and overly aggressive
– Re-planning may be ad hoc
• Mature estimates may also be overruled by business interests
CMMI® Project Planning - Goal 1
SG 1 Establish Estimates
Estimates of project planning parameters are established and maintained.
SP 1.1 Estimate the Scope of the Project
Establish a top-level work breakdown structure (WBS) to estimate the scope of the
project.
SP 1.2 Establish Estimates of Work Product and Task Attributes
Establish and maintain estimates of the attributes of the work products and tasks.
SP 1.3 Define Project Lifecycle
Define the project life-cycle phases upon which to scope the planning effort.
SP 1.4 Determine Estimates of Effort and Cost
Estimate the project effort and cost for the work products and tasks based on estimation
rationale.
CMMI® Project Planning - Goal 2
SG 2 Develop a Project Plan
A project plan is established and maintained as the basis for managing the project.
SP 2.1 Establish the Budget and Schedule
Establish and maintain the project’s budget and schedule.
SP 2.2 Identify Project Risks
Identify and analyze project risks.
SP 2.3 Plan for Data Management
Plan for the management of project data.
SP 2.4 Plan for Project Resources
Plan for necessary resources to perform the project.
SP 2.5 Plan for Needed Knowledge and Skills
Plan for knowledge and skills needed to perform the project.
SP 2.6 Plan Stakeholder Involvement
Plan the involvement of identified stakeholders.
SP 2.7 Establish the Project Plan
Establish and maintain the overall project plan content.
CMMI® Project Planning – Goal 3
SG 3 Obtain Commitment to the Plan
Commitments to the project plan are established and maintained.
SP 3.1 Review Plans that Affect the Project
Review all plans that affect the project to understand project commitments.
SP 3.2 Reconcile Work and Resource Levels
Reconcile the project plan to reflect available and estimated resources.
SP 3.3 Obtain Plan Commitment
Obtain commitment from relevant stakeholders responsible for performing and
supporting plan execution.
Keys to Success
• Ask suppliers to show how they extend the CMMI practices to
proposal activities
• Request planning documents with the proposal
• During re-planning, ask suppliers to show how they performed
the CMMI practices
Back-Sliding:
A Failure of Institutionalization
Institutionalization: The ingrained way of doing business that an
organization follows routinely as part of its corporate culture.
- CMMI-DEV v1.2
When mentioned in the generic
goal and generic practice
descriptions, institutionalization
implies that the process is
ingrained in the way the work is
performed and there is
commitment and consistency to
performing the process.
An institutionalized process is
more likely to be retained during
times of stress.
GG 2 Institutionalize a Managed Process
GP 2.1 Establish an Organizational Policy
GP 2.2 Plan the Process
GP 2.3 Provide Resources
GP 2.4 Assign Responsibility
GP 2.5 Train People
GP 2.6 Manage Configurations
GP 2.7 Identify and Involve Relevant Stakeholders
GP 2.8 Monitor and Control the Process
GP 2.9 Objectively Evaluate Adherence
GP 2.10 Review Status with Higher Level
Management
GG 3 Institutionalize a Defined Process
GP 3.1 Establish a Defined Process
GP 3.2 Collect Improvement Information
Common Features –
A Lost Perspective in CMMI ® v1.2!
Commitment to Perform
GP 2.1 Establish an Organizational Policy
Directing Implementation
GP 2.6 Manage Configurations
GP 2.7 Identify and Involve Relevant Stakeholders
GP 2.8 Monitor and Control the Process
GP 3.2 Collect Improvement Information
Ability to Perform
GP 2.2 Plan the Process
GP 2.3 Provide Resources
GP 2.4 Assign Responsibility
GP 2.5 Train People
GP 3.1 Establish a Defined Process
Verifying Implementation
GP 2.9 Objectively Evaluate Adherence
GP 2.10 Review Status with Higher Level Management
Organizational Support
Contenders
• Fully support the CMMI®-based improvement program by providing training, templates, tools, process asset libraries, measurement repositories and other work aids focused on improving the ability of practitioners to competently adopt the model
Pretenders
• Largely ignore organizational support, often to save money
• Where required by the model, they establish process asset libraries and measurement repositories, but they are largely shelfware
Organizational Infrastructure Required for
CMMI® Level 3
• Process Group
• Training Program
• Measurement Repositories
• Predictive Modeling
• Best-Practice Libraries
• Audits & Appraisals
• Policies, Processes, Templates & Tools
• Process Improvement Communications
(Chart: example predictive model, a control chart of defects per component with mean (X-bar) and UCL plotted by component number.)
Developing and maintaining mature processes requires
significant time and investment in infrastructure
Organizational Culture
A pattern of shared basic assumptions that the group learned as it solved its
problems of external adaptation and internal integration, that has worked well
enough to be considered valid and, therefore, to be taught to new members
as the correct way to perceive, think, and feel in relation to those problems.
• Artifacts
– The practices that can be observed in such areas as dress code,
leadership style, communication processes
• Espoused values
– The elements the organization says it believes in, the factors that it says
influence the practices in which it engages
• Basic underlying assumptions
– Unstated beliefs the organization has come to accept and abide by
Organizational Culture & Leadership,
Edgar H Schein, used with permission
Management Commitment and Support
• Understands the key messages
• Is willing to take actions to reinforce them
• Provides resources to support/sustain process improvement
efforts
• Sets expectations that essential project functions will be funded
and processes will be followed
– Project planning, estimation, tailoring, CM, QA, etc.
• Supports process improvement and sustainment, rather than
passing appraisals
• Rewards mature process development and sustainment
rather than individual heroics
Rick Hefner, “Sustaining CMMI Compliance,” 2006 CMMI
Technology Conference and User Group
Keys to Success
• Ask suppliers to show how they perform the CMMI generic
practices
• When problems occur, ask why the CMMI practices were not
effective in sustaining the desired behavior, and what will be
done to prevent future problems
Summary
• There is a marked difference between organizations that truly
want to implement CMMI®, and those who simply try to get
a “certificate”
• By discussing the differences, we hope to help the CMMI®
community realize the true value of CMMI®