The Contracting Education Academy


February 23, 2017 By AMK

Federal IT acquisition worth $50 billion cleared for takeoff

A major federal acquisition opportunity with a potential contract value of $50 billion for information technology vendors is back on track.

The General Services Administration (GSA) recently resumed processing vendor applications after a legal challenge to the contract was resolved in its favor.

As a result, the GSA this fall will reveal the names of approximately 60 vendors who will be eligible to participate in the Alliant 2 IT contract vehicle. The program is notable not only for the money involved, but also for a major contracting change designed to facilitate business opportunities for IT vendors.

Under the contract, vendors will be able to provide a broad range of IT capabilities to multiple federal agencies.

The scope of work is designed to provide agencies with maximum flexibility in acquiring an IT services-based solution, according to GSA, encompassing existing technologies as well as leading-edge capabilities and virtually any future developments in IT. Components will include email, cloud, cybersecurity, networks, Internet of Things, and big data.

Keep reading this article at: http://www.ecommercetimes.com/story/Federal-IT-Acquisition-Worth-50B-Cleared-for-Takeoff-84320.html

Filed Under: Government Contracting News Tagged With: Alliant, cloud, cybersecurity, GSA, IoT, IT, lowest price technically acceptable, LPTA, technical evaluation, technology

January 23, 2017 By AMK

A practical program manager’s guide to requests for equitable adjustment

My first experience with a request for equitable adjustment (REA) was brief and decisive. The O-6 program director didn’t literally drop it in the trash bin, but he clearly wanted to.

His message to the development contractor was to not expect any action by the government, despite the contractor fastidiously mentioning it month after month on a chart listing unresolved contracts business. The REA resulted from a technical disagreement between the contractor and the government regarding how much in-scope testing was required to properly resolve a spacecraft test fault.

According to Federal Acquisition Regulation (FAR) subpart 43.2, a contractor submits a request for equitable adjustment (essentially a type of proposal) in response to a unilateral contract change order. But other unplanned changes to contract terms, such as a late delivery of government-furnished property (GFP) or a dispute over scope, can also prompt the contractor to send an unexpected REA.

In the daily life of a program office, REAs are rare because planned contract changes are accompanied by requests for proposal. Likewise, when the contractor and government agree about an unplanned change, the program manager (PM) would treat the REA similarly to any other proposal. However, when the REA results from disagreement on contract terms, delay of work, or scope (either in type or magnitude), the working relationship may become tense if it isn’t tense already. Both the government and contractors must weigh issues of fairness and duty to stakeholders when deciding how to proceed.

Decision making may become emotionally charged, to the detriment of the relationship and program progress.

In the situation described above, the REA was a small blip that did not threaten the program’s overall success — we had an enormous cost-plus satellite contract and recognized the need for all parties to work together to get the spacecraft to the launch pad. The issue slowly died and eventually went away. Letting it languish wasn’t a bad strategy for the government in that instance, but it was hardly an ideal lesson for a young field-grade officer in how to handle such a situation in the future.

Years later I joined an Acquisition Category (ACAT) I equivalent, open-architecture development program using multiple fixed-price contracts with interdependent (but competing) developers. Team members knew going in that we had the perfect environment for spawning REAs. Not only does the government have a duty to respond to contractor requests for adjustment, but unlike my previous experience on the satellite development program, here even a modest REA had the potential to derail the program. The willingness of the associate contractors to work with each other would quickly degrade if they distrusted the government to enforce the assumptions and terms of each contract.

The actions a contracting officer takes to respond to an REA are clearly outlined by the FAR and Defense Federal Acquisition Regulation Supplement (DFARS), but nothing similar exists for technical evaluators. The standard process for evaluating reasonableness of proposed costs is meaningless if there is no way to analyze whether claimed impacts were in scope in the first place.

When the first REA arrived from my open architecture integrator for “low-quality GFP,” we looked for standard guidance on how to handle REAs. Finding none, our team developed a methodology to determine whether REA claims had merit. Taking the contractor’s claim seriously and conducting a dispassionate analysis keeps the interactions professional and de-escalates emotions. Defining an objective process upfront increases acceptance of the result and, perhaps more important, shows that the government is exercising due diligence.

The process we developed includes a flow chart (Figure 1) and a six-step evaluation methodology. It is intended for PMs and action officers conducting a technical evaluation of the merits and quanta of the claims, and it complements the contracting officer’s evaluation.

The six steps in the REA evaluation process are:

Step 1: Establishment of facts.

List all of the claims made by the contractor and sort them into facts the government agrees with upfront and those that require further substantiation. Statements about which the government has no direct knowledge, or holds a conflicting opinion, should not be agreed to upfront. Usually the chain of events can be agreed upon by all parties, but a claim that GFP was inadequate (for example) will require supporting evidence. It is the contractor’s responsibility to provide that evidence.

This step forces the government to articulate and understand what exactly the contractor thinks happened, what it wants, and on what grounds. It establishes the major issues of the REA. It defines the points the government must address in the analysis and for which the contractor must provide support.
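To make the bookkeeping in Step 1 concrete, here is a minimal sketch in Python. It is an illustration only, not part of the original methodology’s materials; the claim texts and field names are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Claim:
    text: str                # the contractor's assertion, restated from the REA
    agreed: bool = False     # True only where the government has direct knowledge
    evidence: list = field(default_factory=list)  # contractor-supplied support

def sort_claims(claims):
    """Step 1: split claims into agreed facts and claims needing evidence."""
    agreed = [c for c in claims if c.agreed]
    needs_support = [c for c in claims if not c.agreed]
    return agreed, needs_support

# Hypothetical example: the chain of events is agreed; GFP quality is not.
claims = [
    Claim("GFP software drop 3 was delivered on 12 June", agreed=True),
    Claim("The delivered GFP was inadequate for integration"),  # needs evidence
]
agreed, needs_support = sort_claims(claims)
```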

Step 2: Examination of scope.

The contract statement of work (SOW) may or may not be very specific. However, in a scope dispute, all relevant paragraphs must be brought forward and considered against the claims. It’s helpful to quote all relevant SOW language and contractual clauses directly in the writeup to facilitate the work of other reviewers.

This is where program management needs to confront the truth of how a contractor could have ended up performing out-of-scope work. Going for a quick and easy kill on scope by broad-brushing the topic will not satisfy anybody, and it probably won’t stand up to legal scrutiny, if it comes to that.

The evaluator should use the relevant contractual language to conclude whether the work was out of scope. If all parties agree on this point, say so. If not, the reviewer needs to present a more detailed argument as to why the work was in scope or not.

Sometimes comparison with the text isn’t enough. The quality or condition of GFP may not be explicitly defined in the contract, but that is no excuse for sticking the contractor with the added cost of dealing with unreasonably low-quality GFP. Contextual factors such as proposal assumptions, reasonable-person tests, and possible interpretations should be discussed.

Step 3: Review contractual direction.

A contractor cannot self-generate out-of-scope work. After the contracting officer gives authority to proceed, there is a presumption that all tasks started are in scope. It is critical to examine all relevant formal and informal communication between parties. For the benefit of reviewers, list communications such as letters and emails and summarize what was said. Conclude whether the contractor requested direction and if direction was provided by the contracting officer.

Step 4: Substantiate all claims.

If it is the program’s first REA, the contractor may not recognize the need to provide any evidence in support of the REA’s claims. My contractors built REAs just like any other proposal: they were written predominantly by the business team, which focused on cost data and pricing labor hours, so the basis of estimate for the impact was well supported. Justifying the claim itself, however, received cursory treatment from contracting staff, when it wasn’t ignored completely. Resist the temptation to handle this in negotiations; making the contractor write down its justification will force it to think matters through.

By this point, looking at scope and contractual direction should give the action officer an idea about where the evaluation will end up, but it is still necessary to analyze any evidence provided by the contractor. Analyze the logic and applicability of arguments and contract interpretations. If the REA justification is weak or nonexistent, be clear in the writeup about what is missing.

Step 5: Minimization.

The contractor has a duty to minimize out-of-scope work and to perform in-scope work first. When operating in an environment where REAs are being generated, the government PM needs to embrace this principle; it provides the only downward cost pressure on an REA. Contracting normally relies on competition, or on negotiation backed by engineering expertise, to secure fair prices for the government, but REAs have no such protection. If the contractor allows out-of-scope, unnegotiated work to occur in place of the negotiated work defined by the SOW, expecting reimbursement through an REA, the government has lost control of the program.

Another consequence of minimization is that negotiating an REA is not simply a matter of negotiating actuals. For example, if the contractor decided to perform the work with Level 5 engineers but could have used Level 3s, the government is fully justified in taking exception. If this were in-scope work, cost would have been controlled first by negotiation and then by cost incentives. With an REA, the minimization principle is the primary lever.
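A trivial worked example of that adjustment, with invented hours and labor rates (a sketch, not actual rates):

```python
claimed_hours = 400           # hours the contractor actually expended (assumed)
rate_level5 = 185.0           # $/hr billed for Level 5 engineers (assumed)
rate_level3 = 120.0           # $/hr a Level 3 would have cost (assumed)

claimed_cost = claimed_hours * rate_level5    # $74,000 claimed in the REA
minimized_cost = claimed_hours * rate_level3  # $48,000 the work reasonably required
exception = claimed_cost - minimized_cost     # $26,000 the government may exclude
```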

From a practical standpoint, this step is an extension of the previous one. However, there is value in keeping this step separate so that a reviewer can easily see which costs were substantiated in Step 4 and what incremental adjustments were made in Step 5.

Step 6: Reciprocal consideration.

If the government has any reciprocal or offsetting equitable adjustments against the contractor, this is where positive and negative dollar amounts cancel each other out to produce a lower or zero net payment. Theoretically, the government could press that other claim against the contractor separately and receive funding back (similar to a descope proposal), but this is so rare I have never seen it prove worth the effort. Despite this, the government should never give up leverage on contractor performance—it is still valuable.

Where the contractor refuses to drop the REA, a trade gives the contractor PM something to sell to his or her corporate management. The trade doesn’t need to be exact to the dollar; the flexibility afforded by negotiations could allow the contractor PM to make the trade fit even when the amount supported by the government’s analysis is lower than the contractor’s original request.
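Pulling the six steps together, the following sketch (again Python, with invented dollar figures) encodes the order of the steps and the arithmetic of Steps 4 through 6. The judgment calls of Steps 1 through 3 (facts, scope, and direction) enter as a single merit determination made in the written evaluation.

```python
def evaluate_rea(claim_has_merit, substantiated_cost,
                 minimization_adjustment, government_offsets):
    """Hypothetical net disposition under the six-step process above."""
    if not claim_has_merit:           # Steps 1-3: facts, scope, direction
        return 0.0                    # claim fails on the merits
    cost = substantiated_cost         # Step 4: only evidence-backed costs count
    cost -= minimization_adjustment   # Step 5: strip costs that were avoidable
    # Step 6: net any reciprocal government claims against the contractor.
    return max(cost - sum(government_offsets), 0.0)

# Hypothetical figures: $500k substantiated, $120k avoidable, $80k offset.
assert evaluate_rea(True, 500_000.0, 120_000.0, [80_000.0]) == 300_000.0
```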

Gray Areas

The REAs my team dealt with generally fell into two categories—some sort of problem with GFP or proposal assumptions being violated. We spent many long evenings weighing various factors to determine how much liability fell in the government’s corner.

In one case, our contractor started in-scope work and kept working even past the point at which it considered the work out of scope. The contractor had received a buggy GFP software delivery for integration into the weapon system; the code required extensive troubleshooting, repeated attempts at integration, and integration of multiple drops once the software was fixed. Although the work was in scope, the contractor made a good point that it hadn’t signed up for unlimited integration costs in its fixed-price proposal. Nobody knew what constituted a reasonable upper limit, but we all agreed in theory that one existed. In this case, the self-generation principle from Step 3 decided the way forward: as soon as the contractor thought the work was out of scope, it should have stopped and requested direction before proceeding. Finishing the work, deciding afterward that it was out of scope, and then submitting an REA is irresponsible.

In another instance, low-quality GFP likewise caused the contractor to work less efficiently than it had bid. We all agreed it would have been impractical to request direction. The contractor had a fairly strong case, except that the SOW, not the proposed price, determines the limits of scope. To allow otherwise would reward the contractor for low-balling its bid. This is especially true when the bid was competitive (it was) and the GFP condition is not documented in the contract (it was not). At the end of the day, the government had met the letter of the contract. The argument was bolstered by an “experienced contractor” standard: an experienced bidder should always expect some level of integration difficulty.

In a final case, a subsystem provider underbid the amount of integration support (software bug fixes) required for the quality and maturity of its offering. The contractor had planned to do this work during system integration but did not win the integrator contract, putting all parties in an awkward position. In pushing the contractor to comply with the SOW and continue bringing the subsystem up to specification, we discovered the practical limits of fixed-price contracting. The contractor sent an REA claiming that the extraordinarily high amount of support required exceeded its interpretation of the SOW. This REA did derail the program, and we reached the point of deciding between litigation and finishing the weapon system. The government sustained the request and finished the system.

The Big Picture

Although supporting an REA is disadvantageous to the government, the objective of this process is not to summarily crush all REAs. It was designed to produce a transparent position that all parties can understand. Sometimes even airtight logic isn’t enough to satisfy the contractor. Contractors are accountable to corporate management, financial, and shareholder concerns, and may not be free simply to drop an REA if the corporation sees a reasonable chance of success. Although the government’s disposition of an REA is unilateral, the contractor can always initiate a legal claim. A thorough and well-reasoned government analysis decreases both the likelihood and the success of litigation.

When it comes to building a weapon system, contractor and government PMs are in it together. The contractor’s decision to send an REA and the government’s disposition both take place in the context of the larger relationship. I have seen government PMs give away the farm in the interest of maintaining a good working relationship, and I have seen working relationships degrade to the point of yelling phone calls and slow progress. It’s important to navigate between extremes with full understanding of the short- and long-term costs of a decision to support or reject a contractor’s request for equitable adjustment.

The views expressed in this article are those of the author and do not necessarily reflect the official policy or position of the Air Force, the Department of Defense or the U.S. Government.

Source: Defense AT&L Magazine, Jan-Feb. 2017 – http://www.dau.mil/publications/DefenseATL/DATLFiles/Jan-Feb2017/DATLJan_Feb2017.pdf

Filed Under: Government Contracting News Tagged With: acquisition workforce, AT&L, consideration, contracting officer, DFARS, equitable adjustment, FAR, GFP, PCO, price reasonableness, program management, program manager, property, proposed costs, REA, technical evaluation

May 5, 2016 By AMK

Common feedback to unsuccessful bidders

The Federal Acquisition Regulation (FAR) allows several opportunities for the government to provide feedback to bidders during or after competitions. The post-award debriefing of offerors is one of those opportunities, and can be a very valuable tool for companies seeking feedback on their proposals.

The government is required by the FAR to provide a post-award debriefing to any offeror who requests one in writing within 3 days of notification of contract award.

In the dozens of competition debriefs I’ve conducted or attended in more than a decade, I’m consistently surprised by how often we repeat the same information. The following reviews the format we use for debriefings, questions we’re frequently asked during the discussions, and some of the common feedback we seem to repeat regularly.

The Department of Defense (DoD) guidance on debriefings states the objective as: “The crux of any post award debriefing is the SSA [Source Selection Authority] award decision and whether that decision is well supported and resulted from a source selection conducted in a thorough, fair and sound manner consistent with the requirements and source-selection methodology established in the RFP [request for proposal].” The preceding quote (under section B.8.3.1) and other information about DoD source selections can be found on DAU’s Acquisition Community Connection at: https://acc.dau.mil/dodssp.

First, our standard debriefing format: The objective of this post-award debriefing is to highlight the significant elements in your proposal and to summarize the rationale for award. The ground rules are open and honest discussions within the limits of FAR 15.506.

The focus is on your proposal submission, but the overall evaluated cost, task order management, and technical proposal ranking for the successful bidder will also be provided, including a summary of the rationale for award.

Reasonable responses will be given to relevant questions about whether the source-selection procedures, applicable regulations and other applicable authorities were followed in eliminating your proposal from the competition.

You are encouraged to ask questions. Answers not provided today will be provided in writing as soon as possible. In accordance with FAR 15.506(e), the government will not disclose:

• Trade secrets

• Privileged or confidential processes and techniques

• Commercial and financial information that is privileged or confidential

• Names of individuals providing reference information on past performance

Source Selection Process/Evaluation Factors

In this section, we read a summary of the source-selection process outlined in Sections L and M of the RFP, including the rating scheme and prioritization of factors evaluated. An example is shown below:

A color-code rating technique was used to evaluate the Management and Technical proposals. Past Performance was evaluated for an overall confidence rating, and cost proposals were not given a rating. Each proposal was evaluated against the following four factors: (1) Management, (2) Technical Proposal, (3) Past Performance, and (4) Cost. Evaluation of Factors 1 and 2 focused on the strengths, weaknesses, significant weaknesses, and deficiencies of the proposals. Risk associated with the proposals for these factors is inherent in the evaluation.

As outlined within the RFP, Management and Technical are equal in importance and more important than Past Performance. When combined, these three are significantly more important than Cost.
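As a rough sketch, the priority ordering read out above could be recorded in a structure like the following. The tier numbers and the color scale are assumptions for illustration; the article does not spell out the rating labels.

```python
# Descending order of importance per the scheme above: Management and
# Technical are equal (tier 1), above Past Performance (tier 2); the three
# non-cost factors combined are significantly more important than Cost.
EVALUATION_FACTORS = [
    {"name": "Management",       "rating_type": "color",      "tier": 1},
    {"name": "Technical",        "rating_type": "color",      "tier": 1},
    {"name": "Past Performance", "rating_type": "confidence", "tier": 2},
    {"name": "Cost",             "rating_type": "unrated",    "tier": 3},
]

# An assumed color scale, best to worst (not stated in the article):
COLOR_SCALE = ["Blue", "Purple", "Green", "Yellow", "Red"]
```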

Following the reading of our standard debriefing, we review the ratings the company in question received. In particular, we focus on the “strengths, weaknesses, significant weaknesses, and deficiencies of the proposal” that resulted in the final overall rating.

Some Common Questions and Answers

Q: Can you tell us how we might compete more favorably next time?

A: Our response to this generally is fairly standard, and tracks directly back to what we tell you in Sections L (Instructions, conditions, and notices to offerors or respondents) and M (Evaluation factors for award). First, your proposal should show that you understand the requirement, preferably without regurgitating it. Second, your proposal should demonstrate how you are going to meet the requirement. Last, but certainly not least, the higher color ratings are awarded when the proposal (1) meets requirements; (2) shows a thorough (or exceptional) approach and understanding of the requirements; (3) contains strengths which outweigh (or far outweigh) any weaknesses; and (4) presents a low or very low risk of unsuccessful performance (risk is not evaluated separately).

Q: Why wasn’t our “concept X” evaluated as a strength?

A: The DoD source-selection procedures (https://acc.dau.mil/dodssp) define a strength as “an aspect of an offeror’s proposal that has merit or exceeds specified performance or capability requirements in a way that will be advantageous to the government during contract performance.” It is incumbent on the vendors to demonstrate their understanding of the requirement, and explain how their approaches will provide value to the government. In many cases, good ideas do not rise to the level of a strength in evaluation because: (1) the concept expressed in the proposal does provide value to the government but is part of what was asked for in the RFP (i.e., is part of how you will meet our requirements, not a way to meet them better, smarter, faster, etc.); or (2) the concept isn’t supported by or integrated with the rest of the proposal (does not track to pricing, is not supported by staffing, is not integrated with service-delivery model, etc.). For example, nearly all proposals we review include ideas such as reach-back support, a council of graybeards to provide strategic consultation, or something else intended to differentiate the proposal from others. But, without providing details on the specific, tangible outcomes (in terms of hours, work products or deliverables) that meet the definition of strength, the government will not evaluate them as strengths during a source selection.

Q: Why were we evaluated with a weakness for “Y?”

A: In general, we would prefer that it never come to this. Our intent is to have significant and substantive discussions throughout our acquisitions to the broadest extent authorized. As a result of those discussions, we should at the very least have communicated to the vendors any significant deficiencies or weaknesses in their proposals and given them time to correct those deficiencies. The presence of a weakness in the final evaluation generally means (1) we don’t believe the vendor understands or recognizes the weakness we’ve pointed out and hasn’t changed its proposal to respond to it; or (2) despite the vendor’s attempt(s) to respond to the weakness, we still don’t understand how the vendor plans to address it or don’t see the staffing or other resources to resolve the matter.

Q: Wasn’t this just a Lowest Priced, Technically Acceptable (LPTA) source selection?

A: There is a time and place for LPTA, but the RFP will always state specifically where the evaluation falls on the best value continuum. The vast majority of our source selections are conducted as best value trade-offs. From the top down in Special Operations Research, Development and Acquisition, we’re strong believers in best value source selections and actively strive to be the best in DoD at conducting them. We focus a great deal of time and effort to ensure we have a well-trained and prepared acquisition workforce with the experience and tools to properly execute, document and communicate the source selections we make and to defend the selections in the event of any protests.

Q: Can you tell us how our cost or proposal compared with the other offerors?

A: Unfortunately, no. In most cases, we will provide the winning offeror’s total cost, and the winner’s evaluation results in terms of colors. We are prohibited by the FAR from disclosing any proprietary information (including other offerors’ costs), directly comparing vendors or providing point-by-point comparisons.

Some Common Feedback

The evaluation team felt you spent too much of your proposal regurgitating the requirement to us. It’s sometimes a fine balance, but you need to convey to us that you understand the requirement without just reading it back to us. In addition, including examples of work on past efforts does not demonstrate your understanding of the requirement. That experience is evaluated as part of past performance.

Your pricing, staffing model or overall approach (or portions of them) did not make sense to us, were not well supported or didn’t track back clearly to your understanding of the requirement. When evaluating your proposal, we take a very structured approach. We read to understand your overall approach and understanding of the requirement, evaluate whether your proposal meets our requirements, and then identify any strengths or weaknesses of your approach. Well-written proposals lead us clearly and unambiguously through that process and are consistent throughout. An example of this is dividing a large proposal into sections by different vendor offices or organizations. This can save time by having the subject-matter expert write each proposal area, but frequently results in a disjointed proposal when the different sections are not well integrated. We recommend a detailed final review by the offeror of the entire proposal to ensure it is clear and consistent and that the data are not repeated in multiple sections.

Evaluation of past performance is based on the offeror’s recent/relevant performance record from a variety of sources. This may include information provided by the offeror, information obtained from questionnaires (internally or externally), or information obtained from any other source available to the government (Past Performance Information Retrieval System, electronic Subcontract Reporting System, etc.).

So, that’s a quick, down-and-dirty overview of the format we use for debriefings of unsuccessful offerors, the questions we’re frequently asked during the discussions, and some of the common feedback we seem to repeat regularly. Hopefully, it provides some insight into the thought patterns and work processes of the evaluation team, and some background for your next source selection.

The author can be contacted at anthony.davis@socom.mil.

See the full Mar.-Apr. 2016 issue of Defense AT&L magazine at: http://dau.dodlive.mil/files/2016/02/DATL-Mar_Apr_2016.pdf

Filed Under: Government Contracting News Tagged With: AT&L, best value, DAU, debriefing, DoD, evaluation criteria, FAR, LPTA, offer, offeror, past performance, post-award, proposal, proposal evaluation, RFP, source selection, technical evaluation

March 18, 2016 By AMK

GAO further clarifies its rule on differing technical reevaluations

It is not surprising that after four protests of the same task order, three corrective actions by the agency, and four evaluations of technical proposals, the final evaluation ratings may differ from prior evaluations.

Such variations are not necessarily improper, as the GAO made clear in a recent protest.

On January 29, 2016, the GAO released a decision denying a protest filed by MILVETS Systems Technology, Inc., B-409051.7; B-409051.9. The procurement history at issue in MILVETS was complicated, beginning with the release of the solicitation by the Department of Agriculture (“USDA”) in July 2013. In sum, two consecutive awards were made to MILVETS and each was protested, causing the USDA to take corrective action twice by reevaluating technical proposals.

After the second award to MILVETS was protested, the USDA assembled a new technical evaluation panel (“TEP”) and source selection authority (“SSA”) that had no knowledge of the first two evaluations. The new TEP reevaluated quotations and the new SSA awarded a task order to DKW Communications, Inc. MILVETS protested the third award, causing the USDA to take yet another round of corrective action by amending the solicitation and seeking revised quotations.

Keep reading this article at: http://govcon.mofo.com/protests-litigation/gao-clarifies-rule-on-differing-technical-reevaluations/

Filed Under: Government Contracting News Tagged With: award protest, corrective action, evaluation criteria, GAO, proposal evaluation, protest, technical evaluation, USDA


75 Fifth Street, NW, Suite 300
Atlanta, GA 30308
info@ContractingAcademy.gatech.edu
Phone: 404-894-6109
Fax: 404-410-6885

Copyright © 2023 · Georgia Tech - Enterprise Innovation Institute