Agencies rely on contractor tool to spot procurement errors

Does federal contract award data need a few good proofreaders?

According to Jeff Sopko, executive vice president of the Alexandria, Va.-based PotomacWave consulting firm, about 20 percent to 30 percent of records in the government’s procurement database contain errors. “They stem from frequent changes in policy and procedures that take time to be implemented in source contract writing, as well as human errors,” he said. “The errors get pushed from the contract writing system to [the Federal Procurement Data System-Next Generation] with few validation checks.”


To address that problem, PotomacWave in 2012 created a software tool called FedDataCheck.
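The article doesn't describe FedDataCheck's internals, but the kind of validation check Sopko says is missing before records reach FPDS-NG might look like this sketch. The field names and rules here are hypothetical illustrations, not the actual product's logic:

```python
# Hypothetical pre-submission validation for contract records.
# Field names and rules are illustrative assumptions, not FedDataCheck's.

REQUIRED_FIELDS = {"piid", "agency_code", "award_date", "obligated_amount"}

def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors for one contract record."""
    errors = []
    # Every required field must be present and non-empty.
    for field in sorted(REQUIRED_FIELDS):
        if not record.get(field):
            errors.append(f"missing field: {field}")
    # Obligated amounts should be numeric.
    amount = record.get("obligated_amount")
    if amount is not None:
        try:
            float(amount)
        except (TypeError, ValueError):
            errors.append(f"non-numeric amount: {amount!r}")
    return errors

record = {"piid": "GS00X24ABC1234", "agency_code": "4732",
          "award_date": "2014-03-06", "obligated_amount": "not a number"}
print(validate_record(record))  # flags the non-numeric amount
```

Running checks like these at the contract writing system, before records are pushed downstream, is the gap the 20-to-30-percent error estimate points at.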

Agencies setting off into the next data frontier — procurement

Agencies are starting to grasp the real value of procurement data. Several agencies are asking the General Services Administration, NASA and others for more details on what they buy, how they buy it and how they could make better decisions.

NASA, for example, is working closely with the Veterans Affairs Department to provide it with an assortment of data points on energy efficiency, such as how VA’s IT products are rated under Energy Star or EPEAT. NASA also plans to provide VA with information about whether its purchases comply with the Trade Agreements Act and about its buying habits by product classification.

Joanne Woytek, the program manager for NASA’s SEWP governmentwide acquisition contract, said the fact that VA and other agencies are asking for and receiving this type of data is a sign of maturity for both the GWAC providers and the agencies in understanding what’s available and why the data matters.

“I’ve seen this happening more with our contracts and SEWP V. A lot of what we are putting into that is to make it a more mature model. We can’t just say, ‘we can do that,’ we will actually demonstrate the things we can do,” she said at the 2014 Acquisition Excellence conference in Washington Thursday sponsored by GSA, the Homeland Security Department and ACT-IAC. “We will be able to show agencies what they are buying. We’re going to be able to provide them with more information. We always said we could do that, but we actually are going to start doing that. I think that’s going to have a bigger effect on agencies who no longer will say ‘I don’t want to use you because I’m not sure you can give me that information. I’m not sure you can control what we’re purchasing.’ We can do that for them and we’ll actually start showing that. So I see us having a better impact on people now that we’ve gotten to this point.”

GSA acquisition database integration pushed back to 2018

The General Services Administration (GSA) pushed back the planned completion date of an integrated acquisition database to 2018 because of development problems and cost overruns, GSA Assistant Commissioner Kevin Youel Page told a Senate panel March 6.

“We’ve suffered our own missteps,” Page said during a hearing of the Senate Homeland Security and Governmental Affairs Committee subcommittee on financial and contracting oversight.

Plans were made in 2001 to combine governmentwide acquisition databases into a single system called the Integrated Acquisition Environment.

But the project has been plagued with problems.

A March 2012 Government Accountability Office report attributed the cost overruns, which grew by 89 percent, largely to mistakes GSA made. The project was initially estimated to cost about $95.7 million, but the 2012 estimate came in at $181.1 million.
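The two cost figures are consistent with the stated growth rate, as a quick check shows:

```python
# Figures from the March 2012 GAO report, in millions of dollars.
initial, revised = 95.7, 181.1
growth = (revised - initial) / initial
print(f"{growth:.0%}")  # 89%
```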

GSA official touts progress on Data.gov

The General Services Administration is “very proud” of the work it has done with Data.gov, the federal government’s online clearinghouse of downloadable information, but challenges remain, a project manager said on Wednesday. The site has grown from hosting 47 data sets when it launched in May 2009 to offering more than 300,000 today, said Program Director Marion Royal during the International Open Government Data Conference in Washington. Royal leads the team of federal employees and contractors working on the site.

GSA launched an open data community page within the site on Monday, according to Royal. The page aims to bring together policy makers, technologists, data owners and citizens, encouraging them to make recommendations on information that should be shared.

“[I’m] happy to finally open up . . . two-way communication on Data.gov,” he said. “New two-way communication is going to be helpful to not only us but the public as well.”

GSA also is working on hosting data sets that will be easier to view without downloading the entire file, Royal said, adding that this will be especially helpful for people using mobile devices.

“[The] tools we are looking at will allow us to do that kind of thing and allow us to [display] the data in a way the average person” will understand, he said.

Despite this progress, hurdles remain, according to Royal. “If we have additional funding, we could do additional things,” he said, though he did not elaborate on how much extra money would be necessary, or what he’d like to accomplish with it.

Royal has big plans for the site nonetheless. “I hope we can get around or over the roadblocks to achieve some really special things,” he said. “No matter how well we do at Data.gov, we [always have] to kick it up a notch.”

– by Brian Kalish – 11/18/2010 –

Officials hope unique identifiers will sharpen procurement data reporting

Agencies may have to develop unique identifiers for their contracts and orders under a new proposal meant to improve the quality of federal data.

The Civilian Agency Acquisition Council and the Defense Acquisition Regulations Council have proposed standardizing the use of unique procurement instrument identifiers (PIIDs) throughout the government and giving agencies policies on how to use them, according to a notice published in today’s Federal Register. A PIID consists of alpha characters in the first positions to indicate the agency, followed by alphanumeric characters identifying bureaus, offices, or other administrative subdivisions.

Under the proposal, agencies would have to ensure each PIID reported to the Federal Procurement Data System (FPDS) would be unique for all contracts and orders. The identifier would have to be unique for at least 20 years from the contract’s award and used on all solicitations. Agencies also would have to submit the format of their identifier to the General Services Administration’s Integrated Acquisition Environment Program Office, according to the notice.
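A minimal sketch of the format-and-uniqueness discipline the proposal describes might look like the following. The exact pattern, prefix length, and in-memory registry are illustrative assumptions, not the councils' actual rule:

```python
import re

# Illustrative PIID check: alpha characters in the leading positions
# (identifying the agency), followed by alphanumeric characters. The
# prefix length here is an assumption for this sketch, not the rule.
PIID_PATTERN = re.compile(r"^[A-Z]{2,6}[A-Z0-9]+$")

seen_piids: set[str] = set()  # stand-in for a governmentwide registry

def register_piid(piid: str) -> None:
    """Reject malformed or previously used identifiers."""
    if not PIID_PATTERN.fullmatch(piid):
        raise ValueError(f"malformed PIID: {piid!r}")
    if piid in seen_piids:
        # Each PIID reported to FPDS must stay unique for years.
        raise ValueError(f"duplicate PIID: {piid!r}")
    seen_piids.add(piid)

register_piid("GS00Q14OADU208")      # accepted
try:
    register_piid("GS00Q14OADU208")  # same identifier reused
except ValueError as e:
    print(e)  # duplicate PIID: 'GS00Q14OADU208'
```

In practice the uniqueness guarantee would live in the agency's contract writing system rather than a single set, but the checks are the same: enforce the format at issuance and refuse reuse.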

Officials hope the proposed policy will improve transparency and the quality of contract and spending reports.

As it stands now, the Federal Acquisition Regulation requires the unique identifiers but includes no accompanying policies. That lack of specifics causes numerous hitches for contract data in governmentwide systems such as FPDS and in other systems that report on the data, according to the notice.

The result is duplication, errors and discrepancies, and these problems are exacerbated by multiple-award contracts that more than one agency uses, the notice states.

“Without a consistent means for distinguishing PIIDs for each agency to ensure uniqueness beyond FPDS reporting, it is difficult to report to the level of transparency required by” the Federal Funding Accountability and Transparency Act, which instituted USAspending.gov, and the economic stimulus law, the notice states.

At a congressional hearing in July, Earl Devaney, chairman of the Recovery Accountability and Transparency Board, said his auditors hit numerous roadblocks in their oversight of contract and grant awards because there’s no cohesion among agencies on how they code their awards.

— by Matthew Weigelt – Aug. 17, 2010 – Federal Computer Week