The Federal Acquisition Regulation (FAR) provides several opportunities for the government to give feedback to bidders during or after competitions. The post-award debriefing of offerors is one of those opportunities and can be a very valuable tool for companies seeking feedback on their proposals.
The FAR requires the government to provide a post-award debriefing to any offeror who requests one in writing within 3 days after receiving notification of contract award.
In the dozens of competition debriefings I’ve conducted or attended over more than a decade, I’m consistently surprised by how often we repeat the same information. The following reviews the format we use for debriefings, questions we’re frequently asked during the discussions, and some of the common feedback we seem to repeat regularly.
The Department of Defense (DoD) guidance on debriefings states the objective as: “The crux of any post award debriefing is the SSA [Source Selection Authority] award decision and whether that decision is well supported and resulted from a source selection conducted in a thorough, fair and sound manner consistent with the requirements and source-selection methodology established in the RFP [request for proposal].” The preceding quote (under section B.8.3.1) and other information about DoD source selections can be found on DAU’s Acquisition Community Connection at: https://acc.dau.mil/dodssp.
First, our standard debriefing format: The objective of this post-award debriefing is to highlight the significant elements in your proposal and to summarize the rationale for award. The ground rules are open and honest discussions within the limits of FAR 15.506.
The focus is on your proposal submission; however, the successful offeror’s overall evaluated cost and its task order Management and Technical proposal rankings will be provided, along with a summary of the rationale for award.
Reasonable responses will be given to relevant questions about whether the source-selection procedures, applicable regulations and other applicable authorities were followed in eliminating your proposal from the competition.
You are encouraged to ask questions. Answers not provided today will be provided in writing as soon as possible. In accordance with FAR 15.506(e), the government will not disclose:
• Trade secrets
• Privileged or confidential processes and techniques
• Commercial and financial information that is privileged or confidential
• Names of individuals providing reference information on past performance
Source Selection Process/Evaluation Factors
In this section, we read a summary of the source-selection process outlined in Sections L and M of the RFP, including the rating scheme and prioritization of factors evaluated. An example is shown below:
A color-code rating technique was used to evaluate the Management and Technical proposals. Past Performance was evaluated for an overall confidence rating, and cost proposals were not given a rating. Each proposal was evaluated against the following four factors: (1) Management, (2) Technical Proposal, (3) Past Performance, and (4) Cost. Evaluation of Factors 1 and 2 focused on the strengths, weaknesses, significant weaknesses and deficiencies of the proposals. Evaluation of risk associated with the proposals for these factors is inherent in the evaluation.
As outlined within the RFP, Management and Technical are equal in importance and more important than Past Performance. When combined, these three are significantly more important than Cost.
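To make that hierarchy concrete, here is a minimal sketch of how the factor prioritization might be modeled, assuming the Blue/Purple/Green/Yellow/Red color scale from the DoD source-selection procedures; the tier numbers, names and function below are illustrative shorthand, not part of any actual RFP or government evaluation tool.

```python
# Illustrative sketch only -- not an actual evaluation tool or RFP content.
# Models the hierarchy described above: Management and Technical are equal
# in importance and more important than Past Performance; combined, those
# three are significantly more important than Cost.

# Assumed color scale from the DoD source-selection procedures, best to worst.
COLOR_RATINGS = ["Blue", "Purple", "Green", "Yellow", "Red"]

# Lower tier number = more important; Cost is evaluated but not color-rated.
FACTOR_TIERS = {
    "Management": 1,
    "Technical": 1,
    "Past Performance": 2,   # rated for overall confidence, not a color
    "Cost": 3,               # no rating assigned
}

def more_important(factor_a: str, factor_b: str) -> bool:
    """True if factor_a outranks factor_b in the stated hierarchy."""
    return FACTOR_TIERS[factor_a] < FACTOR_TIERS[factor_b]

assert more_important("Management", "Past Performance")
assert not more_important("Technical", "Management")   # equal in importance
```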
Following the reading of our standard debriefing, we review the ratings the company in question received. In particular, we focus on the “strengths, weaknesses, significant weaknesses, and deficiencies of the proposal” that resulted in the final overall rating.
Some Common Questions and Answers
Q: Can you tell us how we might compete more favorably next time?
A: Our response to this generally is fairly standard, and it tracks directly back to what we tell you in Sections L (Instructions, conditions, and notices to offerors or respondents) and M (Evaluation factors for award). First, your proposal should show that you understand the requirement, preferably without regurgitating it. Second, your proposal should demonstrate how you are going to meet the requirement. Last, but certainly not least, the higher color ratings are awarded when the proposal (1) meets requirements; (2) shows a thorough (or exceptional) approach and understanding of the requirements; (3) contains strengths that outweigh (or far outweigh) any weaknesses; and (4) presents a low or very low risk of unsuccessful performance (risk is not evaluated separately).
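As a rough illustration of how those four conditions combine, consider the sketch below; the predicate names are hypothetical shorthand for the criteria above, not actual source-selection language or software.

```python
# Hypothetical shorthand for the four conditions above -- an illustration,
# not actual source-selection criteria or government software.
from dataclasses import dataclass

@dataclass
class ProposalAssessment:
    meets_requirements: bool                 # condition (1)
    thorough_or_exceptional_approach: bool   # condition (2)
    strengths_outweigh_weaknesses: bool      # condition (3)
    low_performance_risk: bool               # condition (4)

def merits_higher_color_rating(p: ProposalAssessment) -> bool:
    """All four conditions must hold for the higher color ratings."""
    return (p.meets_requirements
            and p.thorough_or_exceptional_approach
            and p.strengths_outweigh_weaknesses
            and p.low_performance_risk)

# Example: a proposal that meets requirements but carries high risk.
print(merits_higher_color_rating(ProposalAssessment(True, True, True, False)))  # False
```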
Q: Why wasn’t our “concept X” evaluated as a strength?
A: The DoD source-selection procedures (https://acc.dau.mil/dodssp) define a strength as “an aspect of an offeror’s proposal that has merit or exceeds specified performance or capability requirements in a way that will be advantageous to the government during contract performance.” It is incumbent on vendors to demonstrate their understanding of the requirement and to explain how their approaches will provide value to the government. In many cases, good ideas do not rise to the level of a strength in evaluation because: (1) the concept expressed in the proposal does provide value to the government but is part of what was asked for in the RFP (i.e., it is part of how you will meet our requirements, not a way to meet them better, smarter, faster, etc.); or (2) the concept isn’t supported by or integrated with the rest of the proposal (does not track to pricing, is not supported by staffing, is not integrated with the service-delivery model, etc.). For example, nearly all proposals we review include ideas such as reach-back support, a council of graybeards to provide strategic consultation, or something else intended to differentiate the proposal from others. But without details on the specific, tangible outcomes (in terms of hours, work products or deliverables) that meet the definition of a strength, the government will not evaluate these ideas as strengths during a source selection.
Q: Why were we evaluated with a weakness for “Y”?
A: In general, we would prefer that it never come to this. Our intent is to have significant and substantive discussions throughout our acquisitions to the broadest extent authorized. As a result of those discussions, we should at the very least have communicated to the vendors any significant deficiencies or weaknesses in their proposals and given them time to correct them. The presence of a weakness in the final evaluation generally means (1) the vendor does not appear to understand or recognize the weakness we’ve pointed out and has not changed its proposal to respond to it; or (2) despite the vendor’s attempt(s) to respond to the weakness, we still don’t understand how the vendor plans to address it or don’t see the staffing or other resources needed to resolve the matter.
Q: Wasn’t this just a Lowest Priced, Technically Acceptable (LPTA) source selection?
A: There is a time and place for LPTA, but the RFP will always state specifically where the evaluation falls on the best-value continuum. The vast majority of our source selections are conducted as best-value trade-offs. From the top down in Special Operations Research, Development and Acquisition, we’re strong believers in best-value source selections and actively strive to be the best in DoD at conducting them. We devote a great deal of time and effort to ensuring we have a well-trained and prepared acquisition workforce with the experience and tools to properly execute, document and communicate the source selections we make and to defend those selections in the event of any protests.
Q: Can you tell us how our cost or proposal compared with those of the other offerors?
A: Unfortunately, no. In most cases, we will provide the winning offeror’s total cost and the winner’s evaluation results in terms of colors. We are prohibited by the FAR from disclosing any proprietary information (including other offerors’ costs), directly comparing vendors or providing point-by-point comparisons.
Some Common Feedback
The evaluation team felt you devoted too much of your proposal to regurgitating the requirement back to us. It’s sometimes a fine balance, but you need to convey that you understand the requirement without simply reading it back to us. In addition, including examples of work on past efforts does not demonstrate your understanding of the requirement. That experience is evaluated as part of past performance.
Your pricing, staffing model or overall approach (or portions of them) did not make sense to us, were not well supported, or did not track back clearly to your understanding of the requirement. When evaluating your proposal, we take a very structured approach. We read to understand your overall approach and understanding of the requirement, evaluate whether your proposal meets our requirements, and then identify any strengths or weaknesses of your approach. Well-written proposals lead us clearly and unambiguously through that process and are consistent throughout. An example of a common pitfall is dividing a large proposal into sections written by different vendor offices or organizations. This can save time by having a subject-matter expert write each proposal area, but it frequently results in a disjointed proposal when the different sections are not well integrated. We recommend a detailed final review by the offeror of the entire proposal to ensure it is clear and consistent and that data are not repeated in multiple sections.
Evaluation of past performance is based on the offeror’s recent/relevant performance record from a variety of sources. This may include information provided by the offeror, information obtained from questionnaires (internally or externally), or information obtained from any other source available to the government (Past Performance Information Retrieval System, electronic Subcontract Reporting System, etc.).
So, that’s a quick, down-and-dirty overview of the format we use for debriefings of unsuccessful offerors, questions we’re frequently asked during the discussions, and some of the common feedback we seem to repeat regularly. Hopefully, it provides some insight into the thought patterns and work processes of the evaluation team and background for your next source selection.
The author can be contacted at anthony.davis@socom.mil.
See the full Mar.-Apr. 2016 issue of Defense AT&L magazine at: http://dau.dodlive.mil/files/2016/02/DATL-Mar_Apr_2016.pdf