
Report
University of Southern California
Center for Systems and Software Engineering
COCOMO III Workshop
USC CSSE Annual Research Review 2014
Brad Clark, Moderator
5/1/2014
© USC-CSSE
1
Observations
• COCOMO II challenged by different development
strategies
• 2000 calibration dataset is over 14 years old
• Productivity appears to be increasing over time
• Levels of reported process maturity increasing in
Software Engineering data
• Productivity appears to decline over successive development increments
COCOMO II Challenges
1995: one-size-fits-all model for 21st century software
1999: poor fit for schedule-optimized projects;
CORADMO
2000: poor fit for COTS-intensive projects: COCOTS
2003: need model for product line investment: COPLIMO
2003: poor fit for agile projects: Agile COCOMO II
(partial)
2012: poor fit for incremental development: COINCOMO
COCOMO II Data by 5-Year Periods
COCOMO II Data: Productivity Trends
COCOMO II Data: Process Maturity Trends
Trends Confounded by Missing Variables
Incremental Development Productivity Decline (QMP data)
[Chart: productivity (0-12) vs. increment number (1-7), with two fitted trendlines]
  Linear fit: y = -0.7989x + 8.5493 (R² = 0.3693)
  Logarithmic fit: y = -2.708·ln(x) + 8.7233 (R² = 0.5326)
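The fitted trendlines reported for the QMP productivity data can be evaluated directly. A minimal sketch, assuming the x-axis is the increment number (1-7); only the coefficients come from the slide:

```python
import math

# Trendline fits reported for the QMP productivity data
# (linear R^2 = 0.3693, logarithmic R^2 = 0.5326).
def linear_fit(increment):
    return -0.7989 * increment + 8.5493

def log_fit(increment):
    return -2.708 * math.log(increment) + 8.7233

# Both fits predict declining productivity over successive increments.
for inc in range(1, 8):
    print(f"increment {inc}: linear={linear_fit(inc):5.2f}  log={log_fit(inc):5.2f}")
```

The logarithmic fit's higher R² suggests the decline is steepest over the first few increments and then flattens.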
Workshop Objectives
• Discuss how we get from COCOMO II to COCOMO III
– Explore models for unexplained variation in existing data sources, or drop them
– Try added variables for a mostly-general fit to existing data
– Obtain more data to validate results
Workshop Topics
1. Consider incorporating Software Application Domains
2. Discuss additional model forms
3. Review current set of cost drivers
Software Application Domains
• Account for different productivities in data
• Pre-set cost drivers
• Select model form
• Select effort / schedule distributions
• CAUTION:
– More domains require more data
– Calibration takes place within the domain
SEER-SEM Application Domains
Artificial Intelligence; Business Analysis Tool; CAD (Computer-Aided Design); Command/Control; Communications; Data Mining; Data Warehousing; Database; Device Driver; Diagnostics; Electronics/Appliance; Embedded; Expert System; Financial Transactions; Flight Systems; Graphical User Interface; Graphics; Internet Server Applet; Mathematical & Complex Algorithms; Message Switching; MIS; Mission Planning & Analysis; Multimedia; Object Oriented; Office Automation; OS/Executive; Process Control; Radar; Relational Database; Report Generation; Robotics; Signal Processing; Simulation; Software Development Tools; System & Device Utilities; Testing Software; Training / CBT / CAI; Transaction Processing
• Pros
• Most popular taxonomy within DoD (developers, cost agencies, support contractors)
• Common titles and comprehensive definitions
• Compatible with NRO/STSC/SMC/AFCAA Application Domains/Types
• Cons
• List is considered too large by many practitioners (37 domains)
• 12 of 37 considered Business type
• Missing 2 popular DoD domains: Payload; Weapons Delivery and Control
SLIM Application Types
Microcode & Firmware
Command & Control
Real Time
Telecommunications
Avionic
Scientific
System Software
Process Control
Business
• Pros
• Very popular within the Army (ODASA-CE, Army-wide license) and commercially
• Definitions provided
• Cons
• Missing 4 popular DoD domains: Training, Test, Tools, and Simulation
• Some terms are unique (e.g., "Microcode" in place of "Signal Processing")
• Some terms overlap (e.g., "Real Time" with "Avionic")
ISBSG Application Types
Catalogue or register of things or events; Customer billing; Customer relationship management; Document management; Job, case, incident, project management; Logistic or supply planning & control; Management Information System; Management or performance reporting; Stock control & order processing; Trading; Transaction/production system; Workflow support & management; Electronic Data Interchange; Office Information System; Automatic Data Logging; Executive Information System; Online analysis and reporting; Financial transaction process & accounting; Command & control system (e.g. military, air traffic, police); Reservation system (e.g. airline, hotel); Embedded software for simple device control; Complex process control (e.g. oil refinery, steel manufacture); Artificial Intelligence; Geographic or spatial information system; Fault Tolerance; Image, video or sound processing; Robot control; Telecom & network management; Mathematical modelling; Transportation control (includes avionics, signalling); Scientific/engineering application; Software development tool; Statistical analysis; 3D modelling or animation; Data or database management; Device or interface driver; Graphics & publishing tools or system; Operating system or software utility; Personal productivity (e.g. word processor, spreadsheet)
• Pros
• Most popular taxonomy worldwide; used on more than 6,000 projects
• Cons
• List is considered too large (41) and most domains are business (21 of 41)
• Missing definitions for 33 of 42 domains
• Missing 5 popular DoD domains: Weapons Delivery and Control, Payload, Training, OS/Executive, Radar
AFCAA Productivity Types (PT)
(used in the online Software Cost Metrics Manual)
Real-Time Embedded
Telecommunications
Test Software
Sensor Control and Signal Processing
Mission Processing
System Software
Mission Planning
Training
Vehicle Payload
Scientific Software
Process Control
Vehicle Control
Software Tools
Intelligence and Information Systems
• Pros
• 24 AFCAA Application Domains grouped into 14 Complexity Zones (Productivity Types)
• Productivity Types determined via a Delphi method:
– 2-day workshop hosted at the Massachusetts Institute of Technology, 24th International Forum on COCOMO and Systems/Software Cost Modeling, 4-5 Nov 2009
– 21 participants from academia, SEI-CMU, Lockheed, Northrop, Rockwell, and others
• Same AFCAA Application Domain definitions
• PTs map to SEER-SEM, SLIM, COCOMO II, and ISBSG Application Domains
• Cons
• 3 DoD domains are hidden:
1. OS/Executive captured within Vehicle Control
2. Command and Control captured within Mission Processing
3. Simulation captured within Scientific Software
Super-Domains and AFCAA Productivity Types (March 2014)
Real-Time (RT):
  1. Sensor Control and Signal Processing
  2. Vehicle Control
  3. Vehicle Payload
  4. Real-Time Embedded-Other
Engineering (ENG):
  5. Mission Processing
  6. Executive
  7. Automation and Process Control
  8. Scientific Systems
  9. Telecommunications
Mission Support (MS):
  10. Planning Systems
  11. Training
  12. Software Tools
  13. Test Software
Automated Information System / Software Services (AIS):
  14. Intelligence and Information Systems
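The super-domain grouping of the 14 Productivity Types is a simple lookup. A minimal sketch; the dict and function names are illustrative, not from the AFCAA manual:

```python
# Super-domain groupings of the 14 AFCAA Productivity Types, as on the slide.
SUPER_DOMAINS = {
    "Real-Time (RT)": [
        "Sensor Control and Signal Processing", "Vehicle Control",
        "Vehicle Payload", "Real-Time Embedded-Other",
    ],
    "Engineering (ENG)": [
        "Mission Processing", "Executive", "Automation and Process Control",
        "Scientific Systems", "Telecommunications",
    ],
    "Mission Support (MS)": [
        "Planning Systems", "Training", "Software Tools", "Test Software",
    ],
    "Automated Information System (AIS)": [
        "Intelligence and Information Systems",
    ],
}

def super_domain_of(productivity_type):
    """Return the super-domain containing a given Productivity Type, or None."""
    for super_domain, pts in SUPER_DOMAINS.items():
        if productivity_type in pts:
            return super_domain
    return None
```

A flat lookup like this is enough because each Productivity Type belongs to exactly one super-domain.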
Domain Comparisons
[Original slide: a table mapping domain names across six taxonomies: SEER (37), AFCAA Domains (24), AFCAA PT (14), COCOMO II (12), SLIM (9), and ISBSG (42). Column alignment was lost in extraction. One clearly recoverable row: the PT "Sensor Control and Signal Processing" collects AFCAA's Signal Processing and Sonar, SEER's Signal Processing, COCOMO II's Signal Processing, and SLIM's Microcode & Firmware. The remaining rows cover Vehicle Payload, Vehicle Control, Real-Time Embedded, Process Control, Command and Control, Mission Planning, Communications/Telecommunications, Training, Software Tools, and Test Software.]
Workshop Topics
1. Consider incorporating Software Application Domains
2. Discuss additional model forms
3. Review current set of cost drivers
Additional Model Forms
• Keep the COCOMO II models?
– Application Composition
– Early Design
– Post-Architecture
• Should COCOMO III be backward compatible with COCOMO 81 & COCOMO II?
• New parameters, e.g., to indicate the type of process planned for the development: plan-driven, rapid development, architected agile, formal methods, COTS integration.
Three COCOMO II Models
• Application Composition Model
– Involves prototyping to resolve risks
– Uses application points to bound size of the job
• Early Design Model
– Exploration of alternative architectures
– Uses function points and set of 7 cost drivers
• Post-Architecture Model
– Actual development & maintenance of software
– Uses variety of size measures, 17 cost drivers and 5 scale
factors to estimate resources
2/28/00
Copyright 2000, Reifer Consultants
Effort Model Forms

Log-Linear Model:    PM = A × Size^B × ∏EM
Non-Linear Model 1:  PM = C + A × Size^B × ∏EM
Non-Linear Model 2:  PM = C + Size^B × ∏EM

Where:
  PM   = Software development effort (in Person-Months)
  Size = Size in Thousand Equivalent Source Lines of Code (KESLOC)
  A    = Calibrated productivity constant (ESLOC/PM)
  B    = B-exponent (scale exponent on size)
  C    = Fixed level of effort for support activities (in Person-Months)
  EM   = Effort Multipliers
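The three effort forms can be sketched directly in code. A minimal sketch; the coefficient values in the example are illustrative placeholders, not calibrated constants:

```python
import math

def effort_log_linear(size_kesloc, A, B, effort_multipliers):
    """Log-linear form: PM = A * Size^B * prod(EM)."""
    return A * size_kesloc ** B * math.prod(effort_multipliers)

def effort_nonlinear_1(size_kesloc, A, B, C, effort_multipliers):
    """Non-linear form 1: PM = C + A * Size^B * prod(EM).
    C adds a fixed level of support-activity effort."""
    return C + A * size_kesloc ** B * math.prod(effort_multipliers)

def effort_nonlinear_2(size_kesloc, B, C, effort_multipliers):
    """Non-linear form 2: PM = C + Size^B * prod(EM) (no A constant)."""
    return C + size_kesloc ** B * math.prod(effort_multipliers)

# Example: 10 KESLOC with illustrative A=2.94, B=1.10, C=4.0, nominal multipliers.
pm = effort_log_linear(10, 2.94, 1.10, [1.0, 1.0])
```

The additive C term is what distinguishes the non-linear forms: it keeps small projects from estimating near-zero effort, since support activities do not shrink to nothing with size.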
Schedule Model Forms

COCOMO II Model:   TDEV = A × PM^F × (%Comp)
Non-Linear Model:  TDEV = A × Size^B × FTE^C

Where:
  TDEV = Time (in months) to develop the software product
  Size = Software size in Equivalent Source Lines of Code (ESLOC)
  FTE  = Full-Time Equivalent (FTE) staffing level
  PM   = Total estimated effort in Person-Months
  A    = Duration constant
  B    = Scaling factor to account for changing productivity as size increases
  C    = Scaling factor to account for the non-linear relationship between increasing staffing levels and shortening development time (TDEV)
  F    = Scaling factor for effort changes
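Both schedule forms can be sketched the same way. A minimal sketch; the constants used in the assertions are illustrative placeholders, not calibrated values:

```python
def tdev_cocomo2(pm, A, F, pct_complete=1.0):
    """COCOMO II form: TDEV = A * PM^F * (%Comp).
    %Comp scales the schedule to the fraction of work covered."""
    return A * pm ** F * pct_complete

def tdev_nonlinear(size_esloc, fte, A, B, C):
    """Non-linear form: TDEV = A * Size^B * FTE^C.
    With C < 0, adding staff shortens the schedule (sub-linearly)."""
    return A * size_esloc ** B * fte ** C
```

The key difference between the two forms: the COCOMO II form derives duration from estimated effort, while the non-linear form treats size and staffing level as independent inputs.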
Workshop Topics
1. Consider incorporating Software Application Domains
2. Discuss additional model forms
3. Review current set of cost drivers
COCOMO II Cost Driver Review
• New cost driver values based on post-2000 data points
• Review cost drivers for
– Relevance?
– Additions / deletions?
• Which cost drivers need a better rating-selection system that reduces rating subjectivity?
• Vu’s cost driver analysis
