Marking Regimes in Local Government bids. A snapshot of price vs quality ratios, and how price and quality are marked

Snapshot of marking regimes in Local Government bids

This snapshot covers over 50 Local Government procurements. The information was primarily sourced through Freedom of Information requests asking for details of each Authority's last 5 service procurements, supplemented by some Local Government procurements we have been involved in ourselves. We were primarily interested in three things:

- What is the range of Quality/Technical vs Price splits used in evaluating bids?
- How are marks awarded for price: do higher prices suffer disproportionately lower marks in evaluation schemes?
- What do Authorities award top marks for in Quality/Technical sections?

The snapshot covered County Councils, Unitary Councils and District/Borough Councils, and a range of services and contract sizes dictated only by the answers we received to our FOI request for the last 5 procurements of service-type contracts. We received responses from across England and Wales.

Quality vs Price

Since the start of the financial crisis, and particularly the squeeze on public sector budgets, there have been complaints that customers are basing their contract awards much more on price than on quality. Whilst this debate has crystallised in the USA with the expanded use of Lowest Price Technically Acceptable (LPTA) procurement formats, the UK has not been immune to the apparent drive to low prices at the expense of quality.

Clearly, procurements for some types of products and services are likely to be more price oriented than others. Logic would suggest that commodities will be procured more on price. However, one of the complaints about LPTA has been its use not just in commodity-type situations, but for bids where there are potentially complex and significant technical and quality elements to a product or service.

We work with a number of incumbents, and we have been seeing mixed use of the % of marks awarded for price vs quality in evaluations of tenders (sometimes wildly different proportions of marks awarded for quality vs price for the same service by different Authorities), so we wanted to get a snapshot of a wider set of procurements. The results we received are summarised below:

[Chart: Evaluation marking, % score awarded for Technical/Quality, across the procurements in the snapshot]

- The highest weighting for quality/technical elements in the bids we received was 80% (leaving 20% for price/commercial elements)
- The mode was 60% quality : 40% price
- Only 2 procurements were based entirely on the final price once basic quality hurdles were passed (one of which was an LPTA bid; the other required a basic pass on commercial elements followed by an electronic auction)
- Over two thirds of procurements rated quality/technical elements as important as, or more important than, price

Whilst the number of responses means this is not a statistically significant result, we were only looking for a snapshot, and that snapshot suggests that quality/technical elements still outweigh price in the evaluation of the majority of service procurements.

How are price scores calculated?

Even with quality accounting for a higher apparent % of marks than price, there are methods we have seen used that disproportionately reward lower priced bids. For example, we have seen price marking where the lowest priced bidder is awarded 100% of the price marks, the next lowest 80%, the next lowest 60% and so on, even if the difference in actual price between each bid is only £1. Clearly, when you see this type of marking regime in a tender, even if it is ostensibly a 50% quality : 50% price mix for award, it doesn't take long to work out that being second cheapest, even by a marginal amount, will immediately cost you 10% of the total evaluation score available, and being third cheapest will cost you 20%: very difficult to make up with a superior technical solution, especially if the customer doesn't award very different marks between bidders for their technical/quality answers.

So, even with quality seeming to be the dominant factor in top-level evaluation % mixes, are customers using other means within their scoring regimes to increase the importance of price in their awards?

What we found

Whilst some of our respondents didn't answer our FOI requests in a format from which we could calculate the precise methodology, the majority of answers we received indicated that this is not the case in most procurements. Three quarters of the calculable responses awarded marks in proportion to the actual differences in price: the lowest price gets 100% of the price score available, and more expensive bids lose a % of the available marks in line with the % by which they are more expensive, so, for example, a bid 10% more expensive than the cheapest receives 10% less of the price marks available. Some others used the mean of all prices as a benchmark and awarded it a nominal score, with bids below the mean gaining marks proportionately and those more expensive than the mean losing marks proportionately.
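To make the arithmetic concrete, here is a short, purely illustrative Python sketch of two of the regimes described above: the rank-based steps (100%, 80%, 60% and so on) and the proportional-to-lowest method found in three quarters of the calculable responses (the mean-benchmark variant follows the same pattern). The function names, the example prices and the 50:50 weighting used to show the effect on the total score are our own assumptions, not taken from any specific tender.

```python
# Illustrative sketch only: function names, example prices and the 50:50
# weighting are assumptions for the example, not from any specific tender.

def rank_step_price_scores(prices, step=20):
    """Rank-based regime: cheapest bid gets 100% of the price marks,
    the next cheapest 80%, the next 60%, and so on, however small
    the actual price differences are."""
    ranked = sorted(prices)
    return {p: max(100 - step * ranked.index(p), 0) for p in prices}

def proportional_price_scores(prices):
    """Proportional regime: cheapest bid gets 100%; others lose marks
    in line with the % by which they exceed the cheapest price."""
    cheapest = min(prices)
    return {p: max(100 * (1 - (p - cheapest) / cheapest), 0) for p in prices}

def total_score(quality_pct, price_pct, quality_weight=0.5):
    """Weighted total under an ostensible quality:price split."""
    return quality_weight * quality_pct + (1 - quality_weight) * price_pct

bids = [100_000, 100_001, 120_000]  # second bid is only 1 unit dearer than the first

rank = rank_step_price_scores(bids)
prop = proportional_price_scores(bids)

# Under the rank-based regime, the marginally dearer bid loses 20 of the 100
# price marks, which at a 50:50 split is 10% of the total evaluation score:
print(total_score(90, rank[100_000]) - total_score(90, rank[100_001]))  # 10.0

# Under the proportional regime the same bid loses a negligible amount:
print(total_score(90, prop[100_000]) - total_score(90, prop[100_001]))  # ~0.0005
```

The gap between 10 points of the total score and a fraction of a point, on the same marginal price difference, is why it is worth working through the published formula with realistic numbers before settling your pricing strategy.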

As we have said, there were two procurements where the lowest price won irrespective of quality scores (although bidders did have to exceed a basic hurdle score on quality or commercial factors to get to this stage). In essence these mean that the lowest priced competent (Technically Acceptable) bidder wins. One was based simply on the price submitted with the tender; the other put the competent bidders into an e-auction (without weightings for their technical scores) to battle it out to the lowest price.

One other procurement set a budget and then essentially asked bidders to tell the customer how much of the required service they could deliver for that price. Whilst on the face of it this might also seem a very price-oriented procurement, there was evidence that the customer was also looking for proof of the relative quality of the service and would award marks for quality as well as quantity, so we don't include this as a variation of LPTA (quite).

Only one of the responses gave stepped scores for different prices. In this case the customer set a baseline figure (of their choosing) and bids were then scored as below:

Mark given: Variation from baseline figure
5: Bidder's price is more than 20% less than the baseline
4: Bidder's price is between 10% and 20% less than the baseline
3: Bidder's price is between the baseline and 10% less than the baseline
2: Bidder's price is between the baseline and 10% higher than the baseline
1: Bidder's price is between 10% and 20% higher than the baseline
0: Bidder's price is more than 20% higher than the baseline

Clearly we would expect bidders to have asked a number of clarification questions about this scoring regime: firstly, what the baseline figure was; and secondly, exactly where the bands started and ended. For example, would a bid exactly 10% lower than the baseline score 3 or 4?

So, generally, our finding for this question was that most procurements score price in proportion to the actual differences in the prices bid. However, it is always essential to check, and we think to work through actual scenarios of what can be complex formulae, asking Clarification Questions of the customer if you aren't completely clear about the implications.
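To show what working through an actual scenario might look like for the banded scheme above, here is a minimal Python sketch of it. The baseline value and the decision to treat each band as inclusive at its cheaper edge are our own assumptions for illustration; they are exactly the details a Clarification Question should pin down.

```python
# Minimal sketch of the banded baseline scoring set out above. The baseline
# figure and the treatment of the band edges are assumptions for illustration.

def banded_price_score(price: float, baseline: float) -> int:
    """Return the 0-5 mark for a price, based on its % variation from the
    customer's baseline figure."""
    variation = (price - baseline) / baseline  # negative means cheaper than baseline
    if variation <= -0.20:
        return 5
    if variation <= -0.10:
        return 4   # note: a bid exactly 10% below the baseline scores 4 here
    if variation <= 0.0:
        return 3
    if variation <= 0.10:
        return 2
    if variation <= 0.20:
        return 1
    return 0

baseline = 500_000  # assumed baseline, chosen only for the example
for price in (390_000, 450_000, 500_000, 545_000, 620_000):
    print(price, banded_price_score(price, baseline))
```

Changing the single "<=" at the 10%-below edge to "<" turns the 450,000 bid from a 4 into a 3, which is precisely why we suggest working through real scenarios and asking the customer to confirm where each band starts and ends.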

What drives getting top marks in technical / quality questions?

We always encourage clients to look carefully at exactly what the customer says they will award top marks for in their ITT instructions. This can sometimes be ignored by bidders. We find this especially with some incumbents we work with, who sometimes just assume that telling the customer in detail how well they have run the existing contract is sufficient to get top marks. But what are customers looking for in an answer that will give it the highest possible marks?

In looking at this we've relied on the scoring regimes published by customers, on the assumption that these are also the instructions given to evaluators. You can be the judge of whether that is a sound assumption to make. We found a range of scoring matrices in the responses to our snapshot; the number of possible scores an answer could receive ranged from 4 to 11.

Some scoring matrices gave little or no indication of what the customer is actually looking for in an answer:

Score: Term
0: Unacceptable
1: Poor
2: Fair
3: Satisfactory
4: Good
5: Very Good

These were in the minority. Most gave at least a reasonable indication of what an answer needed to convey in order to get a particular score, such as this extract:

Score: 5
Definition: Demonstrates overall ability to deliver in full plus offers potential added value (excellent)
Approach to scoring: The score will be awarded where a Bidder's response is of exceptional quality, demonstrating a very good understanding of the requirements and identifying factors that will offer potential added value, with evidence to support the response. Response is supported by comprehensive and robust evidence.

We were interested in the common factors that lead to different scores, particularly the highest scores, and in how these differed from acceptable or middle-range scores. Our aim was to help bidders focus on these areas, not just in their writing and reviewing, but in their pre-bid thinking and work, so they could be properly prepared to put in the highest scoring solution.

Clearly, different customers will focus to an extent on different things, and different procurements, even from the same customer, may vary in what they are looking for (although some Local Authorities did have standard templates for all their procurements). Bidders need to look carefully at the instructions the customer sends out and react to these. But there was a surprising commonality across many of the procurements, which enabled us to form the following summary of what most customers are looking for:

We found four common areas that customers look for most often in their scoring:

- Confidence that the bidder can and will deliver
- Demonstration that the bidder understands the contract, the customer and the task at hand
- Clear evidence provided by the bidder for all their claims
- Added value over and above just meeting the specification

Typical extracts from scoring tables covering each area (for gaining the highest marks) included:

Confidence in ability to deliver
- Exceptional capability to deliver
- High confidence in ability to deliver
- Provides confidence in certainty of delivery
- Relevant examples fully explained so the marker has high confidence the tenderer will be able to deliver the solution
- Offers full confidence that the Tenderer will deliver the service in full
- Gives the Council a very high level of confidence
- Exceptional response that inspires confidence

Understanding requirements
- Very good understanding of requirements
- Addresses requirements with fine tuning to match customer expectations
- Evidences significant levels of understanding
- Indicates a full understanding of the requirement
- Meets the requirement in an exceptional manner
- Excellent demonstration by the Tenderer of the relevant ability, understanding, experience, skills, resources and quality measures required
- Demonstrates a thorough understanding of what is being asked for
- Demonstrating a very good understanding of the requirements
- Evaluator has been able to identify that the potential provider has an excellent understanding of the requirements

Evidence
- Supported by comprehensive and robust evidence
- Excellent evidence that demonstrates how targets will be achieved
- Excellent level of evidence provided

- Evidence provided is of excellent quality
- Thorough and detailed evidence provided which fully satisfies the requirements
- Provides evidence of how that understanding can be applied in practice
- Supported by very strong evidence
- With evidence to support the response
- Has provided information with strong supporting evidence to demonstrate an ability to deliver the requirements to an excellent standard
- Full evidence is provided

Added Value
- Includes details of ways that could assist the council in providing a better or more efficient solution beyond what was documented
- Offers an innovative solution that includes desirable value add
- Exceeds the Authority's requirements
- Fully satisfies requirements, with innovation/added value
- Has exceeded all requirements and gives confidence the provider will add real value
- Indicates the ability to exceed the required standards of the contract
- Exceptional response / exceptional value added
- Provides evidence of factor(s) that will add exceptional value above the requirements
- Evidence identifies factors that will offer significant added value
- Exceeds requirements in most areas
- Response identifies factors that will offer potential added value
- Demonstrates overall ability to deliver in full plus offers potential added value
- Response may also identify factors that will offer potential added value, and with evidence to support this

Taking one of these four areas to give the equivalent wordings used for middle and lowest scores:

Evidence (wordings for adequate/middle scores)
- Adequate evidence that demonstrates how targets will be achieved
- Moderate level of evidence provided / acceptable match
- Evidence shown of provision/experience sought
- Supported by a reasonable level of evidence
- Good standard of evidence to support the response
- Some, but patchy or brief, evidence is given to support the answers

Evidence (wordings for the lowest scores)
- Response is not supported by credible evidence
- Fails to evidence how targets will be achieved
- No or unsatisfactory evidence provided / no match
- Evidence provided fails to satisfy the stated requirement
- Contains insufficient information to evidence overall meeting the requirement
- Little or no evidence to support the response. The response is not supported by credible evidence
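Pulling the four common areas together, the sketch below shows one simple way a bid team might turn them into a pre-submission review prompt, contrasting wording associated with top marks against wording associated with merely adequate scores. It is a hypothetical aid of our own, built from the extracts above, not something taken from any Authority's evaluation scheme.

```python
# Hypothetical pre-submission review checklist built from the four common
# scoring areas identified in the snapshot; the structure and wording targets
# are ours, not any Authority's.

REVIEW_CHECKLIST = {
    "Confidence in ability to deliver": {
        "target": "gives the evaluator high confidence you can and will deliver",
        "adequate_only": "gives some confidence, with gaps or caveats",
    },
    "Understanding of requirements": {
        "target": "demonstrates a thorough understanding of the contract, customer and task",
        "adequate_only": "shows partial or patchy understanding",
    },
    "Evidence": {
        "target": "comprehensive, robust evidence for every claim made",
        "adequate_only": "some, but patchy or brief, evidence",
    },
    "Added value": {
        "target": "identifies factors offering significant value beyond the specification",
        "adequate_only": "meets the specification with little beyond it",
    },
}

def print_review_prompts():
    """Print a simple reviewer prompt for each area of a draft answer."""
    for area, wording in REVIEW_CHECKLIST.items():
        print(f"{area}: aim for '{wording['target']}', "
              f"not just '{wording['adequate_only']}'")

print_review_prompts()
```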

Implications for Bidders

We found the most interesting result from our snapshot to be the commonality in what procurers award marks for. Whilst our snapshot indicates that most bids still weight technical/quality more heavily than price, and that price scoring will in most cases be proportionate to the price submitted versus competitors, the actual scoring mix could vary in any one bid, so bidders must take this on a case-by-case basis.

However, there is greater commonality in the areas customers look for when scoring answers to technical questions. Ensuring that your bid clearly gives the customer confidence in your ability to deliver, demonstrates your understanding, provides evidence for your claims, and adds value above the specification are all things that can be prepared for in any bid, and reviewed against to improve your submission. And ensuring that you give high confidence (rather than some), show clear understanding (rather than patchy), provide comprehensive evidence (rather than adequate) and significant added value (rather than merely meeting the specification) should be the benchmark you set the team if you want to be confident of getting the highest scores available (rather than just being seen as acceptable).

This paper is one of many you will find in the Rebid Centre. All the ideas we refer to in the paper are more fully described in the Rebid Guide and the Rebid Centre, together with step-by-step guidance on how to put them into action on your contract and rebid. You will also find papers and advice on all aspects of how to prepare for and run your rebid. To join the Rebid Centre, or to buy the Guide, go to www.rebidsolutions.co

Rebidding Solutions helps incumbents win their rebids. As well as providing articles, advice and processes for incumbents in the Rebid Centre, we have also published the Rebid Guide, which contains 60 ideas for incumbents to put into practice from day one of their contract to improve their chances of winning their rebid. We also provide consultancy and bespoke training for incumbent companies and contracts, helping them put together the processes and actions that lead to rebid success. For an overview of all our services, visit our website at www.rebidding.co.uk and sign up to our free newsletter giving hints and tips on what to do to win your rebid.

Rebidding Solutions Ltd. Registration: 7959219. Registered address: Venture House, Calne Road, Chippenham, SN15 4PP, UK