
Approved for Public Release; Distribution Unlimited. 14-2256. © 2014 The MITRE Corporation. All rights reserved.

The Art of Evaluation Criteria Breakout Session #E02 Erin M. Schultz, The MITRE Corporation Kelly Horinek, The MITRE Corporation Ginny Wydler, The MITRE Corporation Date: July 29, 2014 Time: 2:30-3:45 p.m.

Agenda Introduction Presentation Goals The Problem Contracting Officer Role Source Selection Evaluation Quick Tips Lessons Learned How to Develop Well-Crafted Criteria Benefits Key Take-Away Ideas 2

Introduction GAO Bid Protest Annual Report to Congress for Fiscal Year 2013, January 2, 2014 (GAO-14-276SP): the number one reason for sustaining protests was the Government's failure to follow stated evaluation criteria. If evaluation criteria do not adequately reflect business needs or the program mission, an evaluation team may stray from the confines of those criteria in search of the best value. GAO: "Our records show the most prevalent reasons for sustaining protests during the 2013 fiscal year were: (1) failure to follow solicitation evaluation criteria, (2) inadequate documentation of the record, (3) unequal treatment of Offerors, and (4) unreasonable price or cost evaluation." 3

Presentation Goals Presentation Focus: Developing best value source selection criteria using Trade-off Analysis (not LPTA); leveraging the CO's role as business partner to facilitate development of source selection criteria; providing the CO with quick tips and take-aways for a concrete place to start when drafting criteria; providing examples and proven techniques for leading a technical evaluation team to formulate meaningful criteria 4

The Problem Poor criteria are not just born, they are made. Schedule Problems: Building an acquisition package can be a months-long effort; as pressure builds to issue the final RFP and schedule milestones loom, criteria may be hastily drafted. Process Problems: Relying on previous acquisition packages not tailored to suit the current acquisition (a cut-and-paste approach); focusing on requirements and acquisition strategy to the detriment of source selection mechanics; the final RFP may not align with requirements due to oversight/review/changes or feedback from industry 5

The Problem (cont.) People Problems: Rotation of government personnel further increases the risk of fundamental disconnects between the original intent of the criteria and their ultimate application in the evaluation process. Staff members responsible for drafting the criteria may not be part of the technical evaluation panel. New evaluation panel members may see the criteria for the first time on the day proposals are received, with no history or buy-in from the new team members 6

Contracting Officer's Role Business advisor to support acquisition strategy, market research, and industry engagement. Business partner to facilitate the development of the source selection plan and criteria (FAR Part 101). RFP developer and manager of Sections L and M to ensure RFP and source selection plan consistency. Navigator of TMI to synthesize information into meaningful acquisition artifacts. A CO can most effectively influence quality criteria when engaged early in the acquisition process as a collaborative business partner 7

Source Selection Evaluation Process: SOW Review → Risk Assessment & Program Documents → Develop Eval Criteria Factors → Review/Weight Specific Criteria → Develop Standards → Sections L & M. The process is rigorous and complex. The process is linear; each step is important. Each acquisition is unique, not a re-do of last time. Each Federal agency and organization has its own culture and style 8

Evaluation Criteria (FAR 15.304) Broad discretion is provided in the FAR for selecting evaluation criteria. The Government is required to evaluate: Quality, which can be done through the evaluation of one or more non-cost evaluation factors (technical excellence, management capability, or personnel qualifications); Past performance; and Price or cost. Other non-cost factors typically include Security, OCI, Compliance, and Certs and Reps 9

Quality Quality of the product or service shall be addressed in every source selection (FAR 15.304(c)(2)). High quality evaluation criteria categories: are linked to critical aspects of the program (values and risks); are limited to those that will yield meaningful discrimination between Offerors; are independent of one another; are consistent with the SOW and specs. Quick Tip: Recommend inclusion of unpriced Basis of Estimates (BOEs) in the Tech/Mgmt volume(s) 10

Quality Evaluation Criteria - Examples: Technical Approach; Program Management; Subcontract Management; Risk Mitigation Approach; Staffing and Key Personnel Resumes; Systems Engineering Process or Key Processes; Transition Plan. Quick Tip: Too many criteria dilute the proposal evaluation, making it difficult to select a clear winner 11

Specific Criteria Weights and Relative Importance Weights establish the relative importance of criteria. All factors and significant sub-factors that will affect contract award, and their relative importance, shall be stated clearly in the solicitation. The solicitation shall also state, at a minimum, whether all evaluation factors other than cost or price, when combined, are: 1. Significantly more important than cost or price; 2. Approximately equal to cost or price; or 3. Significantly less important than cost or price. Quick Tip: Do not provide exact percentages or formulas, but use qualitative descriptions; stay away from numbers 12

Warning Label Developing Criteria: SOW Review → Risk Assessment & Program Documents → Develop Eval Criteria Factors → Review/Weight Specific Criteria → Develop Standards → Sections L & M. Bottoms-Up technique: start with a clean slate; large team, stakeholders; define requirements; industry input. Cut-to-the-Chase technique: apply lessons learned; small team, specialists; update or refresh requirements; leverage successful criteria. Quick Tip: Be Aware - One Size Does Not Fit All! 13

Benefits of Bottoms Up Evaluation panel team members develop the criteria. Team consensus is achieved on true discriminators. All stakeholders are included early in the process. Brainstorming is very effective for buy-in; all are heard. Software tools can be used for team facilitation. Quick Tips: Revisit criteria prior to evaluating the first proposal to level set the team; conduct a mock consensus on the first factor to ensure clear understanding by the evaluators 14

Benefits of Cut to the Chase Key stakeholders determine true discriminators. Small groups can be more effective. Shorten the acquisition schedule. Reuse valuable documents. Capitalize on what worked successfully in the past. Quick Tips: Get buy-in from your evaluators on this approach; identify a strong technical writer to participate 15

Past Performance Considerations How many references do you allow? How do you allocate Prime and Sub references? Questionnaire elements? Recency is a judgment call; relevance is binary (GAO upheld a protest). Quick Tips: Tailor past performance questionnaires; Offerors should coordinate receipt of questionnaires; interview the references for deeper insight 16

Price or Cost Per the FAR, each solicitation shall state whether all evaluation factors other than cost or price, when combined, are: 1. Significantly more important than cost or price; 2. Approximately equal to cost or price; or 3. Significantly less important than cost or price. Quick Tips: Section L Cost Volume Instructions: Always require page numbers; require priced BOEs mirroring the unpriced BOEs; provide Excel spreadsheet template(s); responses must include unlocked formulas 17
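
As a practical illustration of the "unlocked formulas" instruction, the following is a minimal sketch (not from the briefing) that uses the openpyxl library to screen a submitted cost workbook for protected sheets and for hard-coded numbers where traceable formulas are expected; the file name and sheet layout are hypothetical.

```python
# Minimal sketch, not from the presentation: screen a submitted cost-volume
# workbook for protected (locked) sheets and for numeric constants that should
# probably be formulas. File name and layout are hypothetical.
from openpyxl import load_workbook

wb = load_workbook("offeror_cost_volume.xlsx")  # hypothetical submission file

for ws in wb.worksheets:
    if ws.protection.sheet:
        print(f"WARNING: sheet '{ws.title}' is protected; formulas may be locked")
    formula_cells = constant_cells = 0
    for row in ws.iter_rows():
        for cell in row:
            if cell.value is None:
                continue
            if cell.data_type == "f":            # cell holds a formula (traceable)
                formula_cells += 1
            elif isinstance(cell.value, (int, float)):
                constant_cells += 1              # hard-coded number, not traceable
    print(f"{ws.title}: {formula_cells} formula cells, {constant_cells} numeric constants")
```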

Overall Proposal Risk Evaluation Criteria Tie-breaker for the Source Selection Decision Document. Can address proposal aspects not covered by the criteria. Mitigates the individual panels' stovepipe evaluations. Addresses all aspects of the proposal: Technical, Management, OCI, Security, Past Performance, and Cost. Risk Rating: High = high probability of schedule disruption, cost growth, or performance impact; Moderate = moderate probability, close contract monitoring; Low = low probability, normal contract monitoring. Quick Tip: Be sure to apply this to the overall proposal and not just in Quality (many panels use this differently) 18

Overall Proposal Risk Evaluation Criteria SAMPLE LANGUAGE Overall proposal risk reflects the degree of confidence that the proposed approach will achieve the goals or objectives of the acquisition. Overall proposal risk reflects the combined risk of schedule disruption, cost growth, and performance impact. If an identified risk has a direct cost, schedule, or performance impact, the evaluation shall reflect that assessment. In general, risk shall be evaluated across all items (quality, past performance, OCI, security, and cost) and characterized as high, moderate, or low. The standard is met when the Offeror's proposal demonstrates an acceptable level of risk. 19

Overall Proposal Risk Evaluation Criteria SAMPLE
Decision Criteria | Rating Scale | Offeror A | Offeror B
Technical Area (significantly more important than cost) | Adjective | Good | Satisfactory
Past Performance | Confidence | High | High
OCI | Acceptability | High | Moderate
Security | Pass/Fail | Pass | Pass
Compliance | Pass/Fail | Pass | Pass
Cost | | $500,000 | $450,000
Most Probable Cost | Cost Estimate | $500,000 | $525,000
Overall Risk | Risk | Low | Moderate
Award | | |
20
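
As a hypothetical illustration (not part of the original slides), the short sketch below assembles the same row-and-column summary from per-factor results so every offeror is presented to the source selection authority in an identical format; the names, ratings, and dollar figures simply mirror the sample above.

```python
# Illustrative only: print a decision summary like the sample above.
# Offerors, ratings, and dollar figures are notional.
header = ("Decision Criteria", "Rating Scale", "Offeror A", "Offeror B")
rows = [
    ("Technical Area",     "Adjective",     "Good",     "Satisfactory"),
    ("Past Performance",   "Confidence",    "High",     "High"),
    ("OCI",                "Acceptability", "High",     "Moderate"),
    ("Security",           "Pass/Fail",     "Pass",     "Pass"),
    ("Compliance",         "Pass/Fail",     "Pass",     "Pass"),
    ("Cost (proposed)",    "",              "$500,000", "$450,000"),
    ("Most Probable Cost", "Cost Estimate", "$500,000", "$525,000"),
    ("Overall Risk",       "Risk",          "Low",      "Moderate"),
]

widths = [max(len(r[i]) for r in [header, *rows]) for i in range(len(header))]
for r in [header, *rows]:
    print("  ".join(cell.ljust(widths[i]) for i, cell in enumerate(r)))
```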

How to Develop Well-Crafted Evaluation Criteria Factors → Sub-factors → Standards. Quick Tip: Write your evaluation criteria first (Section M) and then write your associated RFP instructions (Section L) 21

Evaluation Factors and Standards FAR 15.304 - Evaluation factors and significant subfactors must: represent the key areas of importance and emphasis to be considered in the source selection decision; and support meaningful comparison and discrimination between and among competing proposals. Quick Tips: Limit the number of Factors to 3-5; limit the number of Sub-factors to 1-2; avoid too many sub-items (commas, bullet lists, etc.); review factors, sub-factors, and weights to ensure a balanced evaluation 22

Evaluation Factors and Standards Factors represent the key areas against which proposals are evaluated (e.g., Technical, Management, Past Performance). Sub-factors are specific areas of factors to be assessed (Technical Solution, Key Personnel, Transition Plan). Standards serve as measurements for the evaluators to determine how well a proposal meets, fails to meet, or exceeds the requirements. [Diagram: a Technical factor decomposed into Technical Solution, Key Personnel, and Transition Plan sub-factors, each measured against its own standards] 23
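
One illustrative way to keep this hierarchy visible to the whole evaluation team is to record it in a simple data structure; the sketch below is not from the presentation, and the factor names and standard wording are examples drawn loosely from this briefing.

```python
# Illustrative only: record the factor -> sub-factor -> standard hierarchy
# so the team can review it in one place. Names and wording are examples.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SubFactor:
    name: str
    standards: List[str] = field(default_factory=list)  # "the standard is met when..." statements

@dataclass
class Factor:
    name: str
    sub_factors: List[SubFactor] = field(default_factory=list)

technical = Factor("Technical", [
    SubFactor("Technical Solution",
              ["The standard is met when the Offeror's solution satisfies the SOW requirements."]),
    SubFactor("Key Personnel",
              ["The standard is met when proposed key personnel meet the stated qualifications."]),
    SubFactor("Transition Plan",
              ["The standard is met when the Offeror presents an acceptable approach for managing transition risk."]),
])

for factor in [technical]:
    print(factor.name)
    for sf in factor.sub_factors:
        print(f"  Sub-factor: {sf.name}")
        for std in sf.standards:
            print(f"    Standard: {std}")
```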

Evaluation Standards and Sub-factors Standards are the baseline against which the proposals are evaluated to determine their acceptability and value. Standards minimize the bias that can result from direct comparison of proposals. Standards establish levels of expectation (a single metric). Write standards so that evaluators will have a common grasp of what constitutes "meets the standard." Quick Tips: Prior to RFP release, evaluators need to discuss each standard and agree on what "meets the standard"; after receipt of proposals, revisit Factors/Standards at Source Selection Training 24

Quick Tips for Standards/Sub-factors Do not try to quantify the unquantifiable ("innovative"). Write in a manner that facilitates the ability to rate proposals above or below the standard (the ability to distinguish between strengths and weaknesses). Maintain flexibility; don't assume that the Government knows everything that should be proposed. If possible, have evaluators help write standards. Do not try to evaluate everything in the requirements documents; only select key discriminators. Never use a standard to create a new or unstated requirement not in the SOW 25
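
As a rough, hypothetical illustration of these tips, the sketch below "lints" draft standards for the kind of vague wording warned about above; the word list and the sample standards are made up for the example.

```python
# Illustrative only: flag draft standards that lean on vague, unquantifiable
# terms. The term list and the draft standards are hypothetical.
vague_terms = {"innovative", "forward leaning", "world class", "state of the art"}

draft_standards = [
    "The standard is met when the Offeror presents a forward leaning approach",
    "The standard is met when the Offeror presents an acceptable approach for managing transition risk",
]

for std in draft_standards:
    hits = [t for t in vague_terms if t in std.lower()]
    if hits:
        print(f"Review wording ({', '.join(hits)}): {std}")
```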

Evaluation Standards Poor Examples Personnel: The standard is met when the Offeror provides a staffing plan. The standard is met when the program manager has seven years of experience. The standard is met when the Offeror presents a "forward leaning" approach. The standard is met when the Technical Lead has a master's degree 26

Evaluation Standards Good Example #1 Personnel This factor addresses the suitability of the personnel that the Offeror has proposed, as well as their plans for continuing to provide and manage a qualified workforce. The standard is met when the Offeror proposes a cadre of personnel whose skills and expertise are appropriate for meeting the requirements of the SOW. The standard is met when the Offeror has an acceptable approach for providing qualified (cleared) personnel that have the qualifications required to meet the SOW and other program requirements 27

Evaluation Standards Good Example #2 Transition Plan This factor addresses the Offeror's plan to effectively and efficiently initiate support immediately at contract award. The standard is met when the Offeror presents an acceptable process for transitioning program management to minimize impact to customers. The standard is met when the Offeror presents an appropriate skill mix for transitioning staffing to minimize impact to customers. The standard is met when the Offeror presents an acceptable approach for managing transition risk 28

Lesson Learned: Internal Compliance Matrix Sections L and M must map to your SOW/requirements and must be consistent with the source selection plan. A compliance matrix will ensure you ask for all the information necessary to conduct the evaluation and do not ask for extra information (which is hard on evaluators). Matrix mapping is easier when the factors are developed and written first, then the standards, then Section L; an Excel spreadsheet will reveal gaps and overlap. Quick Tip: Recommended order of matrix mapping: 1) SOW, 2) Criteria/Section M, 3) Section L Instructions 29
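
A minimal sketch of the matrix idea (not from the briefing; the requirement IDs and section labels are notional) shows how simple set checks can surface the gaps the matrix is meant to catch, including the "Government's problem" cases described on the next slide.

```python
# Illustrative only: compliance-matrix checks in the recommended mapping order
# (1) SOW, (2) Section M criteria, (3) Section L instructions.
# Requirement IDs and section labels are made up for the example.
sow_requirements = {"SOW-1", "SOW-2", "SOW-3", "SOW-4"}

# Which SOW requirements each Section M criterion evaluates
section_m = {
    "M.1 Technical Approach": {"SOW-1", "SOW-2"},
    "M.2 Key Personnel":      {"SOW-3"},
}

# Which Section M criterion each Section L instruction supports
section_l = {
    "L.1 Technical Volume": {"M.1 Technical Approach"},
    "L.2 Resumes":          {"M.2 Key Personnel"},
    "L.3 Quality Plan":     set(),   # asks for data no criterion evaluates
}

covered = set().union(*section_m.values())
print("SOW requirements with no Section M criterion:", sow_requirements - covered)

requested = set().union(*section_l.values())
print("Section M criteria never requested in Section L:", set(section_m) - requested)

for instruction, criteria in section_l.items():
    if not criteria:
        print(f"Section L instruction '{instruction}' asks for data that no Section M criterion evaluates")
```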

Why Map? If Section L asks for data, but there is no associated criterion in Section M to evaluate it, it is the Government's problem. If Section M states an evaluation criterion, and there is no associated request in Section L, it is the Government's problem. The quality of the Offeror's proposal can be directly proportional to the quality and coordination of the Government's solicitation sections 30

Lesson Learned: Developing Quality Criteria Criteria should be tailored to specific program characteristics. Criteria should provide a reasonable expectation of discrimination among Offerors. Criteria should allow the evaluators to clearly document how or why an Offeror will be able to successfully perform the established requirements under the contract. Relative importance must be established. Criteria and factors should be independent of each other. Quick Tip: If your criteria are vague or too general, a best value award will be difficult - destination LPTA 31

Benefits Influencing the quality of source selections with focused criteria that make the source selection process more manageable - clearly defined requirements. Facilitating best value decisions with criteria that reflect the critical aspects of the requirements and business and mission needs - real discriminators. Conducting improved source selections that provide the foundation for improved contract outcomes - performance. Reducing the risk of protest by providing an effective source selection road map for evaluators to follow - process. Ultimate Quick Tip: Save time and money by doing it right the first time and eliminating re-work 32

Key Take-Aways Takeaway 1: Source selection and evaluation quick tips and techniques that COs can put into practice immediately for any source selection Takeaway 2: How to develop well-crafted criteria that result in a selection of the best value proposal Takeaway 3: Lessons learned from previous source selections 33

Contact Information Erin M. Schultz, The MITRE Corporation eschultz@mitre.org Tel: 703-983-3005 Kelly Horinek, The MITRE Corporation khorinek@mitre.org Tel: 703-983-9227 Ginny Wydler, The MITRE Corporation vwydler@mitre.org Tel: 703-324-0752 34

QT# Quick Tip
1 Quality: Recommend inclusion of unpriced Basis of Estimates (BOEs) in the Technical/Management volume(s)
2 Quality: Too many criteria dilute proposal evaluation, making it difficult to select a clear winner
3 Quality: Do not provide exact percentages or formulas, but use qualitative descriptions; stay away from numbers
4 Quality: Be aware: one size does not fit all; use either the Bottoms-Up or Cut-to-the-Chase technique to develop criteria and know the pros and cons of each
5 Quality: Revisit criteria prior to evaluating the first proposal to level set the team
6 Quality: Conduct a mock consensus on the first sub-factor prior to RFP release to ensure clear understanding by the evaluators
7 Quality: Get buy-in from evaluators to use the Cut-to-the-Chase approach
8 Quality: Identify a strong technical writer to participate in writing Sections L and M
9 Past Performance: Tailor questionnaires; Offerors should coordinate receipt of questionnaires; interview the references for deeper insight
10 Section L Cost Volume Instructions: Always require page numbers; require priced BOEs mirroring the unpriced BOEs; provide Excel spreadsheet template(s); responses must include unlocked formulas
35

QT# Quick Tip
11 Overall Proposal Risk: Be sure to apply this to the overall proposal and not just in the Quality area (many panels use this differently)
12 Criteria: Write your evaluation criteria first (Section M) and then write your associated RFP instructions (Section L)
13 Factors: Limit the number of Factors to 3-5; limit the number of Sub-factors to 1-2; avoid too many sub-items (commas, bullet lists, etc.); review factors, sub-factors, and weights to ensure a balanced evaluation
14 Factors/Standards: Prior to RFP release, evaluators need to discuss each standard and agree on what "meets the standard"; after receipt of proposals, revisit Factors/Standards at Source Selection Training
15 Standards: Do not try to quantify the unquantifiable ("innovative"); write in a manner that facilitates the ability to rate proposals above or below the standard (the ability to distinguish between strengths and weaknesses); maintain flexibility and don't assume that the Government knows everything that should be proposed; if possible, have evaluators help write standards; do not try to evaluate everything in the requirements documents, only select key discriminators; never use a standard to create a new or unstated requirement not in the SOW
36

QT# Quick Tip
16 Criteria: Recommended order of compliance matrix mapping: 1) SOW, 2) Section M Criteria, 3) Section L Instructions
17 Criteria: If your criteria are vague or too general, a best value award will be difficult - destination LPTA
18 Ultimate Quick Tip: Save time and money by doing it right the first time and eliminating re-work
37