The Contracting Education Academy


November 15, 2018 By AMK

GAO reiterates that agencies must meaningfully consider price in ‘best value’ tradeoffs

In three related bid protest decisions made public in the last few weeks, the Government Accountability Office (GAO) reaffirmed the principle that agencies must meaningfully consider price when making best value tradeoff decisions. 

GAO sustained the protests, stressing that merely paying lip service to price while selecting a more expensive, higher-rated offeror is not sufficient — agencies must provide a rational explanation for why they have decided to pay a premium for the awardee’s technical superiority.

In Solers, Inc., B-414672.3 et al.; Technatomy Corporation, B-414672.5; and OGSystems, LLC, B-414672.6 et al., three disappointed offerors challenged the Defense Information Systems Agency’s (DISA) award of Multiple Award Task Order contracts to 14 contractors as part of the Systems Engineering, Technology, and Innovation program.

The solicitation provided that DISA would make award on a best-value tradeoff basis considering price and four technical factors that, when combined, were significantly more important than price.  The agency made award to the 14 highest rated proposals in the non-price factors, opining — without elaboration — that “the technical merit of those proposals justifies paying a price premium over lower-rated, lower-priced proposals.”  Indeed, throughout the evaluation process, the agency repeatedly noted — again without elaboration — that the awardees’ proposals were worth a premium.

Keep reading this article at: https://www.insidegovernmentcontracts.com/2018/11/hey-big-spender-gao-reiterates-that-agencies-must-meaningfully-consider-price-in-best-value-tradeoffs/

Filed Under: Government Contracting News Tagged With: best value, DISA, evaluation criteria, evaluation factor, GAO, multiple award contract, price, selection criteria, trade off

May 3, 2018 By AMK

Bridge contracts need better definition, more disclosure, says Senate report

A Senate report calls for establishing a common definition in the Federal Acquisition Regulation for bridge contracts and for a series of reports on how agencies use such contracts.

“When used too frequently, bridge contracts reduce competition and can result in the government paying more than it should for needed services and supplies,” the report says. “When a contract is awarded outside of the competitive process, such as when an incumbent contractor is granted a sole-source contract, heightened oversight is necessary to ensure the government is getting the best value.”

The report was filed in support of S-2413, which cleared the committee level in February; with the filing of the report, the measure could reach a floor vote at any time.

Keep reading this article at: http://www.fedweek.com/federal-managers-daily-report/bridge-contracts-need-better-definition-more-disclosure-says-report/

Read Senate report 115-232 at: https://www.congress.gov/115/crpt/srpt232/CRPT-115srpt232.pdf

Filed Under: Government Contracting News Tagged With: best value, bridge contract, competition, GAO, Senate, sole source

October 11, 2017 By AMK

Procurement may be trending toward value over price

The number of times federal agencies have requested lowest-price, technically acceptable (LPTA) bids in contract solicitations has shot up over the past decade, an examination of Bloomberg Government data shows.

Federal contract solicitations stating that awards will be made on the basis of LPTA source-selection procedures have steadily grown, from 920 in fiscal year 2008 to more than 12,000 in each of the past two fiscal years, according to Bloomberg Government data.

But contracting industry groups and, increasingly, members of Congress have been agitating for a best-value purchasing approach in more cases, taking into account other factors, including whether the benefits of higher-priced proposals are worth the extra cost.

This renewed priority for best-value procurements has been reflected in the fiscal 2017 and 2018 defense authorization bills in Congress, which significantly narrow the range of types of procurements in which the Defense Department can use LPTA as a guiding philosophy.

Keep reading this article at: https://about.bgov.com/blog/procurement-may-trending-toward-value-price/

Filed Under: Government Contracting News Tagged With: best value, cost benefit, lowest price technically acceptable, LPTA, NDAA, OFPP, quality, source selection, trade off

December 13, 2016 By AMK

2017 NDAA restricts DoD’s use of LPTA procedures

The 2017 National Defense Authorization Act (NDAA) is full of important changes that will affect federal contracting going forward.

One of these important changes severely limits the use of lowest-price, technically acceptable (LPTA) evaluations in Department of Defense (DoD) procurements. Following the change, “best value” tradeoffs will be prioritized for DoD acquisitions. This post briefly examines when LPTA procurements will and won’t be allowed under the 2017 NDAA.

The 2017 NDAA sets a new DoD policy: avoid using LPTA evaluations when doing so would deny DoD the benefits of cost and technical tradeoffs.

Keep reading this article at: http://smallgovcon.com/statutes-and-regulations/2017-ndaa-restricts-dods-use-of-lpta-procedures/

Filed Under: Government Contracting News Tagged With: acquisition workforce, best value, cost, DoD, LPTA, NDAA, trade off

May 5, 2016 By AMK

Common feedback to unsuccessful bidders

The Federal Acquisition Regulation (FAR) allows several opportunities for the government to provide feedback to bidders during or after competitions. The post-award debriefing of offerors is one of those opportunities, and can be a very valuable tool for companies seeking feedback on their proposals.

The government is required by the FAR to provide a post-award debriefing to any offeror who requests one in writing within 3 days of notification of contract award.

In the dozens of competition debriefs I’ve conducted or attended in more than a decade, I’m consistently surprised by how often we repeat the same information. The following reviews the format we use for debriefings, questions we’re frequently asked during the discussions, and some of the common feedback we seem to repeat regularly.

The Department of Defense (DoD) guidance on debriefings states the objective as: “The crux of any post award debriefing is the SSA [Source Selection Authority] award decision and whether that decision is well supported and resulted from a source selection conducted in a thorough, fair and sound manner consistent with the requirements and source-selection methodology established in the RFP [request for proposal].” The preceding quote (under section B.8.3.1) and other information about DoD source selections can be found on DAU’s Acquisition Community Connection at: https://acc.dau.mil/dodssp.

First, our standard debriefing format: The objective of this post-award debriefing is to highlight the significant elements in your proposal and to summarize the rationale for award. The ground rules are open and honest discussions within the limits of FAR 15.506.

The focus is on your proposal submission, but the successful bidder’s overall evaluated cost and its task order management and technical proposal rankings will be provided, along with a summary of the rationale for award.

Reasonable responses will be given to relevant questions about whether the source-selection procedures, applicable regulations and other applicable authorities were followed in eliminating your proposal from the competition.

You are encouraged to ask questions. Any answers we cannot give today will be provided in writing as soon as possible. In accordance with FAR 15.506(e), the government will not disclose:

• Trade secrets

• Privileged or confidential processes and techniques

• Commercial and financial information that is privileged or confidential

• Names of individuals providing reference information on past performance

Source Selection Process/Evaluation Factors

In this section, we read a summary of the source-selection process outlined in Sections L and M of the RFP, including the rating scheme and prioritization of factors evaluated. An example is shown below:

A color-code rating technique was used to evaluate the Management and Technical proposals. Past Performance was evaluated for an overall confidence rating, and cost proposals were not given a rating. Each proposal was evaluated against the following four factors: (1) Management, (2) Technical Proposal, (3) Past Performance, and (4) Cost. Evaluation of Factors 1 and 2 focused on the strengths, weaknesses, significant weaknesses and deficiencies of the proposals. Evaluation of the risk associated with the proposals for these factors is inherent in the evaluation.

As outlined within the RFP, Management and Technical are equal in importance and more important than Past Performance. When combined, these three are significantly more important than Cost.
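For readers who want to see the tradeoff mechanics laid out concretely, the following is a minimal sketch of how the ordinal scheme above could be tabulated. The RFP language is ordinal, not numeric, so the color-to-score mapping and the weights below are purely hypothetical assumptions for illustration; the actual award decision rests on the source selection authority's documented tradeoff rationale, not on a formula.

```python
# Illustrative sketch only. The RFP states ordinal priorities
# (Management = Technical > Past Performance; combined, these three are
# significantly more important than Cost). The numeric weights and the
# color-to-score mapping below are hypothetical assumptions, not DoD policy.

COLOR_SCORES = {"blue": 4, "purple": 3, "green": 2, "yellow": 1, "red": 0}

WEIGHTS = {  # hypothetical weights reflecting the stated ordinal priorities
    "management": 0.35,
    "technical": 0.35,
    "past_performance": 0.20,
}

def non_cost_merit(ratings: dict) -> float:
    """Weighted merit across the three non-cost factors (illustration only)."""
    return sum(WEIGHTS[factor] * COLOR_SCORES[ratings[factor]] for factor in WEIGHTS)

def tradeoff_line(offeror: str, ratings: dict, evaluated_cost: float) -> str:
    """Tabulate one offeror's non-cost merit next to its evaluated cost.

    A real best-value tradeoff still requires a documented narrative explaining
    why any price premium is, or is not, worth the technical superiority.
    """
    return f"{offeror}: merit={non_cost_merit(ratings):.2f}, cost=${evaluated_cost:,.0f}"

if __name__ == "__main__":
    print(tradeoff_line("Offeror A",
                        {"management": "blue", "technical": "green", "past_performance": "green"},
                        12_500_000))
    print(tradeoff_line("Offeror B",
                        {"management": "green", "technical": "green", "past_performance": "yellow"},
                        10_900_000))
```

The only point of the sketch is that the non-cost factors dominate the comparison; cost is never scored, it is traded off against the overall merit picture.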

Following the reading of our standard debriefing, we review the ratings the company in question received. In particular, we focus on the “strengths, weaknesses, significant weaknesses, and deficiencies of the proposal” that resulted in the final overall rating.

Some Common Questions and Answers

Q: Can you tell us how we might compete more favorably next time?

A: Our response to this is generally fairly standard and tracks directly back to what we tell you in Sections L (Instructions, conditions, and notices to offerors or respondents) and M (Evaluation factors for award). First, your proposal should show that you understand the requirement, preferably without regurgitating it. Second, your proposal should demonstrate how you are going to meet the requirement. Last, but certainly not least, the higher color ratings are awarded when the proposal (1) meets requirements; (2) shows a thorough (or exceptional) approach and understanding of the requirements; (3) contains strengths that outweigh (or far outweigh) any weaknesses; and (4) carries a low or very low risk of unsuccessful performance (risk is not evaluated separately).
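As a rough illustration of how those four criteria interact, here is a hypothetical sketch; the thresholds and the color mapping are assumptions made for this example and do not reproduce the actual DoD rating definitions.

```python
# Hypothetical sketch of the four criteria listed above; the exact DoD
# color-rating definitions differ, and every threshold here is an assumption.

def notional_color_rating(meets_requirements: bool,
                          approach: str,     # "exceptional", "thorough", or "adequate"
                          strengths: str,    # "far outweigh", "outweigh", or "balanced"
                          risk: str) -> str: # "very low", "low", "moderate", or "high"
    """Map the four stated criteria to a notional color rating (sketch only)."""
    if not meets_requirements:
        return "red"      # fails to meet requirements
    if approach == "exceptional" and strengths == "far outweigh" and risk == "very low":
        return "blue"     # exceptional understanding, strengths far outweigh weaknesses
    if (approach in ("exceptional", "thorough")
            and strengths in ("far outweigh", "outweigh")
            and risk in ("very low", "low")):
        return "purple"   # thorough approach, strengths outweigh weaknesses, low risk
    return "green"        # meets requirements with an adequate approach
```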

Q: Why wasn’t our “concept X” evaluated as a strength?

A: The DoD source-selection procedures (https://acc.dau.mil/dodssp) define a strength as “an aspect of an offeror’s proposal that has merit or exceeds specified performance or capability requirements in a way that will be advantageous to the government during contract performance.” It is incumbent on the vendors to demonstrate their understanding of the requirement and to explain how their approaches will provide value to the government. In many cases, good ideas do not rise to the level of a strength in evaluation because: (1) the concept expressed in the proposal does provide value to the government but is part of what was asked for in the RFP (i.e., is part of how you will meet our requirements, not a way to meet them better, smarter, faster, etc.); or (2) the concept isn’t supported by or integrated with the rest of the proposal (does not track to pricing, is not supported by staffing, is not integrated with the service-delivery model, etc.). For example, nearly all proposals we review include ideas such as reach-back support, a council of graybeards to provide strategic consultation, or something else intended to differentiate the proposal from others. But, without providing details on the specific, tangible outcomes (in terms of hours, work products or deliverables) that meet the definition of a strength, the government will not evaluate them as strengths during a source selection.

Q: Why were we evaluated with a weakness for “Y?”

A: In general, we would prefer that it never come to this. Our intent is to have significant and substantive discussions throughout our acquisitions to the broadest extent authorized. As a result of those discussions, we should at the very least have communicated to the vendors any significant deficiencies or weaknesses in their proposals and given them time to correct those deficiencies. The presence of a weakness in the final evaluation generally means (1) we don’t believe the vendor understands or recognizes the weakness we’ve pointed out and hasn’t changed its proposal to respond to it; or (2) despite the vendor’s attempt(s) to respond to the weakness, we still don’t understand how the vendor plans to address it or don’t see the staffing or other resources to resolve the matter.

Q: Wasn’t this just a Lowest Priced, Technically Acceptable (LPTA) source selection?

A: There is a time and place for LPTA, but the RFP will always state specifically where the evaluation falls on the best value continuum. The vast majority of our source selections are conducted as best value trade-offs. From the top down in Special Operations Research, Development and Acquisition, we’re strong believers in best value source selections and actively strive to be the best in DoD at conducting them. We devote a great deal of time and effort to ensuring we have a well-trained and prepared acquisition workforce with the experience and tools to properly execute, document and communicate the source selections we make and to defend the selections in the event of any protests.

Q: Can you tell us how our cost or proposal compared with the other offerors?

A: Unfortunately, no. In most cases, we will provide the winning offeror’s total cost, and the winner’s evaluation results in terms of colors. We are prohibited by the FAR from disclosing any proprietary information (including other offerors’ costs), directly comparing vendors or providing point-by-point comparisons.

Some Common Feedback

The evaluation team felt you spent too much of your proposal regurgitating the requirement to us. It’s sometimes a fine balance, but you need to convey to us that you understand the requirement without just reading it back to us. In addition, including examples of work on past efforts does not demonstrate your understanding of the requirement. That experience is evaluated as part of past performance.

Your pricing, staffing model or overall approach (or portions of them) did not make sense to us, were not well supported or didn’t track back clearly to your understanding of the requirement. When evaluating your proposal, we take a very structured approach. We read to understand your overall approach and understanding of the requirement, evaluate whether your proposal meets our requirements, and then identify any strengths or weaknesses of your approach. Well-written proposals lead us clearly and unambiguously through that process and are consistent throughout. An example of this is dividing a large proposal into sections by different vendor offices or organizations. This can save time by having the subject-matter expert write each proposal area, but frequently results in a disjointed proposal when the different sections are not well integrated. We recommend a detailed final review by the offeror of the entire proposal to ensure it is clear and consistent and that the data are not repeated in multiple sections.

Evaluation of past performance is based on the offeror’s recent/relevant performance record from a variety of sources. This may include information provided by the offeror, information obtained from questionnaires (internally or externally), or information obtained from any other source available to the government (Past Performance Information Retrieval System, electronic Subcontract Reporting System, etc.).

So, that’s a quick down and dirty overview of the format we use for debriefings of unsuccessful offerors, questions we’re frequently asked during the discussions, and some of the common feedback we seem to repeat regularly. Hopefully, it provides some insight into the thought patterns and work processes of the evaluation team and background for your next source selection.

The author can be contacted at anthony.davis@socom.mil.

See the full Mar.-Apr. 2016 issue of Defense AT&L magazine at: http://dau.dodlive.mil/files/2016/02/DATL-Mar_Apr_2016.pdf 

 

Filed Under: Government Contracting News Tagged With: AT&L, best value, DAU, debriefing, DoD, evaluation criteria, FAR, LPTA, offer, offeror, past performance, post-award, proposal, proposal evaluation, RFP, source selection, technical evaluation
