
Parametric Modeling to Support System Acquisition

Published in Software Tech News
Volume: 15 Number: 1 - Cost Estimation & Systems Acquisition

Author: William Roetzheim
Posted: 03/14/2016 | Leave a Comment

Parametric Modeling Using High-Level Objects (HLOs)

At the core, there are three fundamental approaches to estimating:

  1. Bottom-up [3], in which the project is decomposed into individual components that can be estimated by some other means, often expert judgment for labor.
  2. Analogy, in which historic budgets or actual expenditures are used as a basis for current estimates. These numbers are normally adjusted to account for known differences between the historic project and the current project, including, at a minimum, an allowance for inflation based on the relative timeframes.
  3. Parametric, in which models are used to forecast cost based on some representation of project size combined with adjusting factors.

Traditional project management guidance holds that bottom-up is the most accurate, followed by analogy, then parametric. Whether this is true depends heavily on the maturity of the parametric models in the domain you are trying to estimate. For example, in the software domain we have had the opportunity to track budgets versus actuals for more than 12,000 projects that were estimated using the above three approaches. What we have found is quite the opposite.

In the software domain, parametric is the most accurate, followed by analogy, followed by bottom-up. The standard deviation of parametric estimates is 55% smaller than that of estimates by analogy and 64% smaller than that of bottom-up estimates. These ratios hold more-or-less true for estimates prepared at any stage of the project lifecycle (budgeting, feasibility, planning). A value that is often more critical, at least when dealing with project portfolios or clusters of estimates for a given scope change, is estimation bias. If you look at a statistically significant sample of estimates (e.g., 20 projects in a portfolio) and total up both the estimates and the actuals for that collection, the bias is the difference between those two numbers. With parametric estimates and a properly calibrated model, this bias approaches zero (we consistently see it under 5% for large organizations, with a random direction). With estimates by analogy, it is typically closer to 10%, also in a random direction. But with bottom-up estimates, it is typically between 15% and 20%, with a bias toward under-estimating. In the remainder of this section we'll discuss parametric estimation in more detail.
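To make the bias calculation concrete, here is a minimal Python sketch; the portfolio figures are entirely hypothetical:

    # Hypothetical portfolio of (estimate, actual) project costs, in $K.
    portfolio = [(520, 540), (310, 295), (880, 930), (150, 160)]

    total_estimated = sum(e for e, _ in portfolio)  # 1,860
    total_actual = sum(a for _, a in portfolio)     # 1,925

    # Bias: the relative difference between the totals. It approaches
    # zero for a well-calibrated parametric model; a negative value
    # means the portfolio was under-estimated.
    bias = (total_estimated - total_actual) / total_actual
    print(f"Portfolio bias: {bias:+.1%}")           # -3.4%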

Figure 1: Core Estimating Concept

As shown in Figure 1, the core requirements for effective parametric estimation in any domain are relatively simple [4]. Step one in the process is to identify one or more high-level objects (HLOs) that have a direct correlation with effort. The appropriate HLOs are domain specific, although there is sometimes overlap. Examples of HLOs include yards of carpet to lay, reports to create, help desk calls to field, or claims to process. In activity-based costing (ABC), these would be the cost drivers. HLOs are often assigned a value based on their relative implementation difficulty, thereby allowing them to be totaled into a single numeric value. An example is function points, which are a total of the values for the function point HLOs (EQ, EI, EO, ILF, and EIF). Don't worry if you're not familiar with those terms; what matters here is the idea that they represent.
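To illustrate totaling weighted HLOs into a single size value, here is a short sketch; the weights are loosely patterned on average function point weights and the counts are hypothetical:

    # HLO weights by relative implementation difficulty (loosely patterned
    # on average function point weights; treat them as illustrative).
    hlo_weights = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

    # Hypothetical counts for a project being sized.
    hlo_counts = {"EI": 12, "EO": 8, "EQ": 5, "ILF": 4, "EIF": 2}

    # Total the weighted HLOs into a single numeric size value.
    size = sum(hlo_weights[k] * n for k, n in hlo_counts.items())
    print(f"Unadjusted size: {size} weighted HLO units")  # 162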

HLOs may have an assigned complexity or other defining characteristics that cause an adjustment in effort (e.g., simple report versus average report). It's also typically necessary to have a technique for managing work that involves new development, modifications or extensions of existing components, or testing/validation only of existing components. Various formulas or simplifying assumptions may be used for this purpose. For example, in the case of reuse, the original COCOMO I model [5] reduced the HLO size to:

HLO = HLO × (0.4 DM + 0.3 CM + 0.3 IT)

where DM is the percent design modification (1% to 100%); CM is the percent code modification (1% to 100%); and IT is the percent integration and test effort (1% to 100%).
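As a worked example (the component and percentages are hypothetical), a reused component of 100 HLO units needing 20% design modification, 30% code modification, and full integration and test would count at less than half its original size:

    def reuse_adjusted_size(hlo, dm, cm, it):
        """Scale an HLO count for a reused component using the COCOMO I
        reuse formula above; dm, cm, and it are fractions in (0, 1]."""
        return hlo * (0.4 * dm + 0.3 * cm + 0.3 * it)

    # 100 HLO units, 20% design modification, 30% code modification,
    # 100% integration and test effort.
    print(reuse_adjusted_size(100, dm=0.20, cm=0.30, it=1.00))  # 47.0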

Step two is to define adjusting variables that impact either productivity or economies (or diseconomies) of scale. The productivity variables tend to be things like the characteristics of the labor performing the work or the tools they will be working with; characteristics of the products to be created (e.g., quality tolerance) or the process used to create them; and characteristics of the environment in which the work will be performed. The variables that impact economies or diseconomies of scale are typically things that drive the need for communication and coordination, and the efficiency of those activities. These adjusting variables are important both to improve the accuracy of any given estimate and to normalize data to support benchmarking across companies or between application areas.
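One common way to apply such variables, sketched below with hypothetical factor names and values, is as multiplicative adjustments to a nominal effort figure (this mirrors the effort-multiplier approach of models like COCOMO, but the factors here are not drawn from any published model):

    import math

    # Illustrative adjusting variables expressed as effort multipliers:
    # values above 1.0 increase effort, values below 1.0 decrease it.
    multipliers = {
        "team_experience": 0.90,    # seasoned staff
        "tool_support": 0.95,       # mature toolchain
        "quality_tolerance": 1.15,  # high-reliability product
    }

    # Compose the individual factors into a single composite adjustment.
    adjustment = math.prod(multipliers.values())
    print(f"Composite effort adjustment: {adjustment:.3f}")  # 0.983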

Step three involves defining productivity curves. These are curves that allow conversion between adjusted HLO sizing counts and resultant effort. They are typically curves (rather than lines) because of the economies or diseconomies of scale that are present. Curves may be determined empirically or approximated using industry-standard data for similar domains. They may also be adjusted based on the degree to which the project is rushed. In any event, procedures are put in place to collect the data needed to periodically adjust the curves to match observed results, a process called calibration.
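A productivity curve of this kind is often approximated as a power function of size, with the exponent capturing economies (below 1) or diseconomies (above 1) of scale. The following sketch uses made-up coefficients and shows one simple calibration approach, a regression in log-log space against observed actuals:

    import numpy as np

    def effort_hours(size, a=2.5, b=1.08):
        """Power-law productivity curve: effort = a * size**b. The
        coefficients are illustrative; in practice they are calibrated
        against observed project actuals."""
        return a * size ** b

    def calibrate(sizes, actual_efforts):
        """Fit a and b by linear regression in log-log space:
        log(effort) = log(a) + b * log(size)."""
        b, log_a = np.polyfit(np.log(sizes), np.log(actual_efforts), 1)
        return np.exp(log_a), b

    # With b > 1, doubling size more than doubles effort (a diseconomy).
    print(effort_hours(100), effort_hours(200))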

The outputs of the process are driven by the needs of the organization.  These outputs can be broken down into three major categories:

  1. Cost (or effort, which is equivalent for this purpose): In addition to the obvious total value, most organizations are interested in some form of breakdown. Typical examples include breakdowns by organizational unit for budgetary or resource planning purposes, by type of money from a GAAP perspective (e.g., opex versus capex), or by WBS elements in a project plan. These outputs will also typically include labor needed over time, broken down by labor category. They are generated using a top-down allocation (sketched in the example following this list).
  2. Non-Cost Outputs:  Non-cost outputs are quantitative predictions of either intermediate work product size, or non-cost deliverable components.  Examples include the number of test cases (perhaps broken down by type), the engineering documents created with page counts, the number of use-case scenarios to be created, or the estimated help desk calls broken down by category.  These outputs are typically created using curves similar to the productivity curves, operating either on the HLOs or on the total project effort.
  3. Lifecycle Costs: If the estimate is for a product to be created, delivered, and accepted then the cost and non-cost items above would typically cover the period through acceptance. In most cases there would then be an on-going cost to support and maintain the delivered product throughout its lifecycle. These support costs are relatively predictable both in terms of the support activities that are required and the curves that define the effort involved. For many of them, the effort will be high immediately following acceptance, drop off over the course of one to three years to a low plateau, then climb again as the product nears the end of its design life.
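The top-down allocation mentioned in item 1 can be sketched as distributing the total estimate across WBS elements by fixed shares; the phase names and percentages below are purely illustrative:

    # Hypothetical WBS phase shares used to allocate a total estimate.
    phase_shares = {
        "requirements": 0.10,
        "design": 0.20,
        "implementation": 0.40,
        "test": 0.20,
        "deployment": 0.10,
    }

    def allocate(total_effort_hours, shares):
        """Distribute a total effort figure top-down across WBS elements."""
        assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
        return {phase: total_effort_hours * s for phase, s in shares.items()}

    for phase, hours in allocate(5000, phase_shares).items():
        print(f"{phase:>14}: {hours:,.0f} hours")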

In the next few sections of this article we’ll focus on the application of parametric modeling techniques to IGCEs in support of the three procurement phases.


Author

William Roetzheim
Mr. Roetzheim (PMP, RMP, MBA, CRISC, CISA, IFPUG) is founder and CEO of Level 4 Ventures, Inc., an SDVOSB that offers independent cost analysis, risk management, and project oversight services for large IT projects. He has written 27 published books, over 100 articles, and three columns. He is a frequent lecturer and instructor at multiple technology conferences and two California universities. He can be reached at William@level4ventures.com.
