Agencies urged to improve quality of spending data despite ‘higher’ ratings
Most agencies should take steps to improve spending-data quality, although the majority already have “higher” quality information, according to a Government Accountability Office report released Thursday.
The GAO looked at assessments by the offices of inspectors general (OIGs) at 51 agencies, which the Digital Accountability and Transparency Act of 2014 (DATA Act) requires in order to determine the quality of spending data, particularly error rates.
Thirty-seven of those OIGs determined their agency’s data was of higher quality, although not all of those reports were complete. Another six agencies had moderate-quality data, and four had lower-quality data. Four others completed reports but did not follow the DATA Act’s protocol. The data came from the first quarter of fiscal 2019.
OIGs evaluated their agencies’ completeness of submissions and data elements, timeliness of submissions and data elements, and accuracy and quality of data elements. GAO did not identify any agencies by name.
Error rates of 0% to 20% were deemed “higher” quality, 21% to 40% “moderate” quality, and 41% and above “lower” quality. The area with the highest error rate among completeness, timeliness and accuracy was used to determine data quality. So if an agency scored higher for completeness and timeliness but moderate for accuracy, it would receive a moderate data quality rating.
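To make that scoring rule concrete, here is a minimal sketch of the classification in Python. The thresholds come from the GAO report, but the function name, dictionary keys and example figures are purely illustrative.

```python
def rate_quality(error_rates):
    """Classify spending-data quality from per-area error rates.

    error_rates maps the assessed areas ("completeness", "timeliness",
    "accuracy") to an error rate expressed as a percentage. The worst
    (highest) rate drives the overall rating, per the GAO thresholds.
    """
    worst = max(error_rates.values())
    if worst <= 20:
        return "higher"
    if worst <= 40:
        return "moderate"
    return "lower"

# The article's example: strong marks on completeness and timeliness but a
# moderate accuracy error rate still yield a "moderate" overall rating.
print(rate_quality({"completeness": 5, "timeliness": 12, "accuracy": 30}))  # moderate
```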
Despite the generally positive data quality ratings, 47 OIGs reported deficiencies in quality controls ranging from data entry errors to incorrect application of Department of the Treasury and Office of Management and Budget standards. The most common control deficiency was information technology system limitations, reported by 19 OIGs.
“These deficiencies related to information technology systems, including systems integration, configuration, and the lack of effective automated systems controls, such as those to help ensure proper system user access or the accuracy and completeness of data,” reads the GAO report.
Overall, 14 OIGs found their agencies were missing spending data, with three reporting data missing from “significant” components or systems.
“For example, one OIG reported that its agency’s Q1 FY 2019 submission did not include award-level data totaling almost $10 billion for two of the agency’s components,” reads the GAO report. “In addition, another OIG reported that its agency was missing data for at least four components, including financial assistance award data for one of the components with an absolute value of $776 million.”
Missing data wasn’t necessarily reflected in error rates: most OIGs, 32 to be exact, reported higher error rates for accuracy than for completeness or timeliness.
A total of 44 OIGs recommended data quality improvements, which 39 agencies fully agreed with and five partially agreed with. Suggested improvements fell into five general categories:
- Implementing data quality procedures and guidance.
- Developing controls to resolve submission issues.
- Developing controls for reviewing and correcting data from source systems.
- Working with Treasury and OMB to resolve issues.
- Developing, implementing and evaluating automated system controls.
GAO offered no recommendations of its own, instead allowing the Council of the Inspectors General on Integrity and Efficiency to comment on the report.
“The report provides useful information on the federal inspectors general efforts to meet oversight and reporting responsibilities under the DATA Act,” CIGIE wrote in its response. “As such, we believe this report will contribute to a greater understanding of the oversight work performed by the IG community and of agency efforts to report and track governmentwide spending more effectively.”
The Navy wants to use wearable tech to fight spread of COVID-19
After struggling with outbreaks of COVID-19 on its deployed ships, the Navy is considering using wearable tech to help track the proximity of sailors to one another in an attempt to ensure social distancing and fight the spread of the virus.
The service issued a request for information Thursday seeking commercially available proximity tracking technology based on wearables that continuously measure the distance between themselves and others nearby. Those devices will then be connected to a processing station that will upload their proximity data to “calculate the total time and at what distance two individuals with the wearables have been in close contact,” says the RFI.
“The proximity records will primarily be used to identify those individuals that were too close for too long to a person that has tested positive for COVID-19,” says the solicitation. “Secondarily, this data will be used to determine if social distancing policies put in place by the government employers are effective.”
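The RFI leaves the design to vendors, but the core calculation it describes, totaling how long two wearables stayed within a set distance of each other and flagging pairs that were “too close for too long,” is simple to sketch. The record format, distance threshold and time limit below are assumptions for illustration, not requirements from the Navy.

```python
from collections import defaultdict

# Hypothetical proximity-record format: (wearable_a, wearable_b, distance_ft,
# seconds_observed). The RFI does not specify a schema; this is illustrative.
records = [
    ("badge-001", "badge-002", 4.5, 600),
    ("badge-001", "badge-002", 5.0, 600),
    ("badge-001", "badge-003", 12.0, 600),
]

CLOSE_CONTACT_FT = 6       # assumed social-distancing threshold
EXPOSURE_LIMIT_SEC = 900   # assumed "too long" threshold (15 minutes)

def total_close_contact(records):
    """Sum the time each pair of wearables spent within the distance threshold."""
    totals = defaultdict(int)
    for a, b, distance_ft, seconds in records:
        if distance_ft <= CLOSE_CONTACT_FT:
            totals[tuple(sorted((a, b)))] += seconds
    return totals

def exposed_pairs(records):
    """Pairs that were too close for too long, per the RFI's description."""
    return {pair: t for pair, t in total_close_contact(records).items()
            if t >= EXPOSURE_LIMIT_SEC}

print(exposed_pairs(records))  # {('badge-001', 'badge-002'): 1200}
```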
It’s unclear from the RFI where the Navy would use the wearables — on deployed ships, where the spread of the coronavirus has been most problematic, or at bases onshore — though the latter appears more likely. The devices “are only intended to be worn while at work,” the RFI says. “At some point or at multiple times during the day, the proximity records will be transferred to a station(s) that will store the records for all employees of a given organization where all of these records can be viewed and analyzed.”
Many of the Navy’s large vessels deployed before the worldwide outbreak hit its peak have been forced to stay at sea for record durations. The ships have been forbidden from docking at ports to prevent the potential spreading of the coronavirus onto aircraft carriers and cruisers. Likewise, onboard visitors are not allowed and there are strict rules in place for the delivery of supplies.
The Navy is looking for a quick turnaround, asking vendors to submit information by July 16.
Pentagon’s second wave of 5G RFPs weeks away
The Department of Defense plans to issue additional requests for proposals (RFPs) in the coming weeks for 5G wireless prototypes at a second tranche of U.S. military bases.
DOD announced 5G experiments at four initial bases in 2019 — Hill Air Force Base in Utah, Joint Base Lewis-McChord in Washington, Naval Base San Diego, and Marine Corps Logistics Base Albany, Georgia — exploring dynamic spectrum sharing, augmented reality (AR) and virtual reality combat training, and smart warehouses.
The second tranche announced in June includes seven more bases and will expand the Pentagon’s three-part prototyping and experiment effort to accelerate the use of 5G, operate everywhere the military deploys, and innovate in preparation for future wireless generations.
“DOD recognized that industry is driving 5G technology with massive investments,” Joe Evans, technical director for 5G at DOD, said during an ACT-IAC webinar Thursday. “The projection on capital expenditure in the U.S. through 2025 is about $350 billion.”
Samsung alone spent $16.7 billion on 5G research and development in 2018, an amount no Pentagon network and communications program has ever reached, Evans added.
New RFPs will be issued through the National Spectrum Consortium and Information Warfare Research Program.
One prototype will test shipwide and pierside connectivity at Naval Station Norfolk in Virginia with private sector implications for both the cruise and freight industries.
Another prototype seeks to quickly move massive amounts of data from aircraft to mission support for predictive maintenance at Joint Base Pearl Harbor-Hickam in Hawaii, a capability that could also benefit the airline industry.
DOD will experiment with AR support for medical training and telemedicine at Joint Base San Antonio, allowing personnel to contact specialists at hospitals to assist with front-line treatment. Such work is already benefiting coronavirus pandemic response because 5G allows for better video and low-latency interactions, Evans said.
A more Pentagon-oriented effort is occurring around wireless connectivity for tactical operations centers at the Fort Irwin National Training Center in California, Fort Hood in Texas and Marine Corps Base Camp Pendleton in California.
“Really the idea there is to get rid of all the red and green wires you find in combat operations centers and replace those with wireless technology to improve mobility and make targeting our forces harder,” Evans said.
The prototype could help NASCAR, which has to quickly stand up and tear down operations centers as it moves from track to track on a weekly basis.
A final prototype at Tinker Air Force Base in Oklahoma deals with bidirectional spectrum sharing between commercial and DOD communications systems, a more dynamic form of spectrum sharing than earlier technologies like the Citizens Broadband Radio Service, Evans said.
Joint Base San Antonio and other remote sites have formed a 5G core security experimentation network for evaluating the security and interoperability of 5G backbone networks. The tests could help additional U.S. companies enter the 5G market, Evans said.
VA looks to RPA to speed up digitization of its health records
The Department of Veterans Affairs is exploring the use of robotic process automation (RPA) to more quickly digitize external medical documents and link them to a veteran’s existing electronic health record.
The VA issued a request for information Wednesday in search of a managed RPA tool to “streamline the flow of external clinical document sets” into its EHR systems — both the existing Veterans Health Information Systems and Technology Architecture (VistA) and the Cerner Millennium-based modernized platform it plans to roll out.
This is particularly important as the VA since 2014 has allowed veterans to seek care outside of its facilities under the Veterans Access, Choice and Accountability Act. When that happens, those non-VA facilities generate records and send them, often as paper copies, to the VA to be integrated digitally with a veteran’s electronic health record.
Such a tool would “help eliminate backlog, reduce the number of manual scanning/indexing tasks, and increase quality and quantitative traceability with both VA’s existing VistA and newest Cerner Millennium platforms,” the RFI says. The VA inspector general reported last year that the department’s backlog of health records awaiting digitization “measured approximately 5.15 miles high and contained at least 597,000 individual electronic document files dating back to October 2016.”
Likewise, as veteran care in the community increases even more under the 2018 MISSION Act, “the automation solution must be scalable to satisfy new and increased workloads, throughput, and digital exchange capacities,” the RFI says.
The VA included some must-haves for a potential solution: automated scanning that can link documents to a veteran’s EHR, a flagging system that alerts staff to issues in automated processing, and the delivery of metrics. The tool should also be able to pull medical documentation from a variety of sources, including an internal shared VA folder, email, electronic fax, paper, electronic exchange, and the HealthShare Referral Management platform via API.
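The RFI doesn’t prescribe how such a tool would work internally, but the flow it describes, ingesting documents from several channels, matching each to a veteran’s record, flagging anything that fails and reporting metrics, can be sketched roughly as follows. All function names, fields and values here are hypothetical, not the VA’s actual systems or APIs.

```python
# Hypothetical sketch of the intake flow the RFI describes: pull documents from
# several channels, try to match each to a veteran's record, flag failures for
# human review, and keep counts for reporting. Nothing here reflects the VA's
# actual systems, schemas or APIs.

def process_document(doc, ehr_index):
    """Return ("linked", record_id) on a match, or ("flagged", reason)."""
    record_id = ehr_index.get(doc.get("veteran_id"))
    if record_id is None:
        return ("flagged", "no matching EHR record")
    if not doc.get("pages"):
        return ("flagged", "empty or unreadable document")
    return ("linked", record_id)

def run_intake(documents, ehr_index):
    """Process a batch and return the metrics the RFI asks the tool to deliver."""
    metrics = {"linked": 0, "flagged": 0}
    exceptions = []
    for doc in documents:
        status, detail = process_document(doc, ehr_index)
        metrics[status] += 1
        if status == "flagged":
            exceptions.append((doc.get("source"), detail))
    return metrics, exceptions

# Example batch drawn from the intake channels the RFI lists (all made up).
docs = [
    {"source": "electronic_fax", "veteran_id": "V100", "pages": 4},
    {"source": "paper_scan", "veteran_id": "V999", "pages": 2},  # no EHR match
]
print(run_intake(docs, ehr_index={"V100": "ehr-55511"}))
```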
For now, VA medical facilities continue to use VistA as their primary EHR system. But soon, the VA will roll out the modernized Cerner platform at hospitals, starting in the Pacific Northwest. The department had planned the launch for March, before delaying the go-live until July due to a need for more training. But since the coronavirus struck, the VA has put the launch on pause indefinitely to prioritize resources for pandemic response.
The modernized EHR will cost the VA more than $16 billion and take a decade to roll out.
The VA is asking interested vendors to pitch RPA solutions by July 30.
DIU looks to automate cyberthreat detection for military IT networks
As the military services look for ways to address the dual challenge of workforce shortages and increasing cyberattacks, the Defense Innovation Unit issued an other transaction agreement (OTA) for a new automated cyber product.
The prototype will bring an “intelligent decision automation platform” to the Air Force Network (AFNET) built by Respond Software, a California-based software company that uses automation to detect attacks. The code behind the automation draws from a relatively old form of artificial intelligence called “expert systems” that, instead of creating large neural networks based on data, uses advanced probability-based mathematics that simulates decision-making.
While the prototype is starting with the AFNET, DIU is looking to scale it across the military if it proves successful, Jeff Kleck, cyber portfolio director for DIU, told FedScoop. The prototyping of the platform is another attempt by the Air Force, replicated by other branches of the military, to automate cyberdefenses and ease the burden on cybersecurity professionals, who are in short supply.
“Today they really don’t have a chance; it is a tidal wave of alerts,” said Mike Armistead, Respond’s co-founder and CEO.
Previously, Air Force innovation officials have said they want to use artificial intelligence technology to support basic cybersecurity and reserve unprogrammable human ingenuity for offensive and bigger-picture work.
In developing the prototype product, Respond will use simulated network attack data while it works through the authority to operate (ATO) and Federal Risk and Authorization Management Program (FedRAMP) processes.
“During the lab trials/testing phase prior to obtaining ATO, these products are fed synthetic or anonymized data, and traffic emitting from connected modules are scrutinized for leakage and repudiation risks,” Kleck said. He later added that “as part of our prototyping process DIU runs products in a controlled environment before ATO.”
The tool aims to better triage and limit false positives in automated cyber threat detection and link together different sensors on the Air Force’s network, according to the company. Respond’s capabilities are modeled after human decision-making, like being able to “connect the dots” and seek out patterns.
“What we have done is put that into software, which has a very different scale to it,” said Armistead.
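Respond hasn’t published its model, but the general “expert system” approach described here, combining hand-assigned evidence weights into a probability rather than training a large neural network on data, can be sketched roughly like this. Every weight, signal name and threshold below is hypothetical.

```python
# Hypothetical sketch of probability-based alert triage in the expert-system
# style: hand-encoded evidence weights combined into a single probability,
# rather than a neural network trained on labeled data. Illustrative only.

# Likelihood multipliers an analyst might assign to individual signals.
EVIDENCE_WEIGHTS = {
    "known_bad_ip": 8.0,
    "off_hours_activity": 2.0,
    "multiple_failed_logins": 3.0,
    "sensor_corroboration": 4.0,   # a second sensor saw related activity
}

PRIOR_ODDS = 0.01        # assumed base rate of a true incident per alert
ESCALATE_THRESHOLD = 0.5

def score_alert(signals):
    """Combine observed signals into a probability that the alert is real."""
    odds = PRIOR_ODDS
    for signal in signals:
        odds *= EVIDENCE_WEIGHTS.get(signal, 1.0)
    return odds / (1.0 + odds)

alert = ["known_bad_ip", "sensor_corroboration"]
p = score_alert(alert)
if p >= ESCALATE_THRESHOLD:
    print(f"escalate to analyst (p={p:.2f})")
else:
    print(f"log and monitor (p={p:.2f})")
```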
The vast majority of JAIC’s money is going toward warfighting
The Department of Defense’s Joint Artificial Intelligence Center has spent more money in fiscal 2020 on its Joint Warfighting Mission Initiative than its five other focus areas combined, its acting director said Wednesday.
This marks an evolution for the JAIC as money for battlefield AI is outpacing projects that had an earlier head start, like the center’s continued humanitarian assistance work, JAIC acting Director Nand Mulchandani said Wednesday, without revealing specific budget numbers.
The Warfighting Mission Initiative got its first major contract award in April, with an $800 million deal going to Booz Allen Hamilton to help the JAIC coordinate with other private sector companies on bringing AI to the battlefield. Other mission initiatives include warfighter health, business process transformation, threat reduction and protection, joint logistics, and joint information warfare, which includes cyber.
The JAIC has stressed that at every step of development, its ethics principles are considered and inform how the department works to implement AI into kill chains.
“I think about them all the time,” Mulchandani said about the ethics principles at a press conference.
One of the JAIC’s major focuses in its warfighting is developing AI that enables Joint All Domain Command and Control (JADC2), the next-generation network-of-networks the military is developing to link operations across sea, air, land, space and cyber. The AI in JADC2 will be used to link “every sensor to every shooter,” so the saying goes, by having a common data architecture and means to rapidly process that data to get information to the right part of the chain of command.
“That is going to be a big focus,” Mulchandani said of the program.
Another high-priority focus area is what the JAIC calls “cognitive assistance.” This type of AI technology is designed to filter information and assist service members in decision-making processes. Mulchandani described cognitive assistance as AI designed to assist those assessing potential targets of attack. In particular, the JAIC is working with natural language processing (NLP) that can sift through layers of text data to surface the important information and condense less valuable text.
Much of the JAIC’s work is happening in partnership with industry, Mulchandani said. Despite protests and worry inside some Silicon Valley companies, he said engagement remains strong both with large companies, like Google and Microsoft, and with small startups.
“We have deep engagement with industry,” he said, adding that the JAIC’s industry engagement specialist is based right in the heart of Silicon Valley.
Mulchandani pointed to offensive information warfare and cyber weapons as an area in which industry hasn’t yet progressed much. While building automated cybersecurity and event analysis tools has been a “well-trodden path” for industry for years, Mulchandani noted that when it comes to offensive cyber tools, “industry has just barely started.”
“There is a huge goldmine of work there,” he said.
Mulchandani said he could not go into details, but noted that JAIC is working with U.S. Cyber Command on some projects around offensive cybersecurity work.
How new API adapters bring unity to the tangle of cybersecurity reporting tools
Bobby McLernon served in a variety of IT roles at the FBI before joining Axonius as vice president of federal government sales.
Most of us who’ve been involved with IT security over the past decade — especially in the federal government — can attest to the never-ending proliferation of new and increasingly powerful devices and applications accessing information on federal networks. We’re not just talking about physical and virtual desktops or servers or software-defined networking gear, but satellites, surveillance systems, IP communications tools and of course smartphones, to name a few.
And inevitably, so too came a steady wave of cybersecurity appliances and solutions designed to monitor and control what all those IT assets were trying to accomplish and whether they complied with federal security controls. Today it’s not uncommon for federal agencies to be maintaining between 20 and 50 tools just to keep up with managing all the IT assets and control systems on their networks.
Each of those solutions, no doubt, provides necessary capabilities and protections that agencies count on. But as every CISO knows, they also present their own layer of challenges. More often than not, these asset management and cybersecurity tools were designed to accomplish specific tasks. They typically got deployed in siloed environments. And they weren’t really built to speak to each other.
The result: IT security teams today face an increasingly complex sprawl of independent monitoring and management tools producing lots of independent reports.
All of that might have been manageable in the days when IT assets were mostly on premises — and we hadn’t become so reliant on operating in multi-cloud environments. But given the scale and dynamic nature of today’s enterprise IT environments, it has become critical for cybersecurity practitioners to look for a better model for monitoring and managing IT assets and cybersecurity controls across the enterprise.
CDM — the government’s Continuous Diagnostics and Mitigation program — has certainly helped agencies develop the means to increase visibility into potential cybersecurity risks across their enterprise. But even with CDM, agencies still haven’t found an effective way to roll up all that information into a single reporting console, let alone respond as swiftly as they need to.
What’s been needed is the ability to not only collect cybersecurity data — but also actually extract the contextual information from all of these cybersecurity tools, aggregate and correlate that information, then deconflict it and present a comprehensive and unified picture of what’s going on across your entire IT environment.
We’ve found a way, in fact, to accomplish that.
We started with an engineering approach that allows us to create virtual, agentless adapters — and a system for easily deploying them — that can connect to the APIs of almost any data source that knows about an asset today. At last count, we have developed more than 240 adapters that work with every leading security and asset management tool on the market and a growing number of custom-built tools as well.
Once those adapters are in place, enterprise IT teams can then run all of that extracted information through a powerful set of correlation and deconfliction algorithms we’ve developed. Then using an equally powerful query platform, they can discover within a matter of seconds all the assets in their IT environment — managed and unmanaged, cloud and on-premises. They will also see what network components those assets are connected to, what security tools are tracking them, what firmware those assets are using and whether those assets are configured according to the agency’s latest security policies, among other security indicators.
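As a rough illustration of what that correlation and deconfliction step involves, the sketch below merges asset records pulled from multiple tools on a shared identifier and then queries the result for coverage gaps. It is not Axonius’ actual algorithm, and every field and tool name is made up.

```python
# Hypothetical sketch of correlating asset records pulled from several tools'
# APIs into one deconflicted view, keyed on identifiers the sources share.
# Illustrative only; this does not reflect Axonius' actual algorithms.

def correlate(sources):
    """Merge per-tool asset records into a single record per asset.

    sources: dict of tool name -> list of asset dicts. Records are matched on
    MAC address when present, otherwise hostname.
    """
    unified = {}
    for tool, assets in sources.items():
        for asset in assets:
            key = asset.get("mac") or asset.get("hostname")
            merged = unified.setdefault(key, {"seen_by": []})
            merged["seen_by"].append(tool)
            for field, value in asset.items():
                merged.setdefault(field, value)   # first non-missing value wins
    return unified

inventory = correlate({
    "edr":       [{"mac": "aa:bb:cc:01", "hostname": "lab-01", "agent": True}],
    "vuln_scan": [{"mac": "aa:bb:cc:01", "os": "Windows 10", "critical_vulns": 3}],
    "cloud_api": [{"hostname": "web-42", "os": "Ubuntu 20.04"}],
})

# Coverage-gap query: assets another source saw but the EDR tool never reported.
unmanaged = [a for a in inventory.values() if "edr" not in a["seen_by"]]
print(unmanaged)
```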
From there, security teams can then turn their focus on identifying security gaps across their enterprise and, just as importantly, take actions to address them. They can also tighten their security posture by setting up automated triggers and responses, including the ability to install software patches, scan new devices and perform many other tasks.
These capabilities give agency IT leaders the breakthrough they’ve needed to deal with today’s rapidly evolving enterprise IT environments. They not only give CISOs more precise and immediate intelligence about their IT environments to meet various federal security requirements, like FISMA and FITARA. They also give C-suite executives much more granular information about their IT investments across their agency — and across their cloud environments — for IT budgeting, cost control and better long-term planning.
Learn more about how Axonius can help your agency gain a unified picture of your IT assets and security enforcement policies on-premises and in the cloud.
As data-sharing becomes more crucial, agencies say industry can help with privacy issues
Agencies like the Census Bureau want better commercial off-the-shelf (COTS) technologies for protecting data privacy and computation, so they can securely link datasets and make predictions about the coronavirus pandemic.
In April, the bureau launched two new surveys and an interactive data hub to begin filling holes in the government’s understanding of COVID-19’s social and economic impacts. But surveys take time and only offer a snapshot of the population, when the bureau could be linking data from text-mined emergency room visits to its own.
If industry could provide a better tool for securing the environment in which data is stored and analyzed, ensuring trust, then more datasets could be linked, painting a comprehensive geographic and economic picture of the virus, said Cavan Capps, the big-data lead at the Census Bureau, during a Data Coalition webinar Wednesday.
Linking hospital administrative data to cell phone data, as Apple and Google envisioned, would lead to very efficient contact tracing, but there’s not enough public trust in current technology, Capps said.
“When we’re actually making decisions, when we’re running these models, when we’re tracking people, do you want any individual to basically sign a piece of paper and say, ‘I promise I won’t tell anyone about you?’” Capps asked. “Or would you rather have more rigorous mathematical protections?”
Currently there is no “silver bullet” solution, said Lynne Parker, White House deputy chief technology officer. She pointed to several reasons: Data de-identification can be accidentally undone when the scrubbed data is combined with other sources of information. Data aggregation limits analytics. Simulating data raises concerns about accuracy and reverse engineering, while homomorphic encryption — which allows data to be mined without sacrificing privacy — hurts performance and speed.
Other techniques and technologies also have their weaknesses, she said. Data enclaves — centralized services favored by academia, where users can work with sensitive research data — don’t scale well. Differential privacy, or systems that publicly share information on group patterns while withholding information on individuals in a dataset, water down insights. And the security of multi-party computation, a subfield of cryptography that allows different parties to privately compute the same data, hasn’t been fully vetted.
“Much more needs to be done to create scalable solutions that are not just a point solution for a particular data sharing goal, but an approach that can scale to more use cases,” Parker said. “So I close with a call to all of you across industry, academia and government: What we need is a better pathway forward for addressing data sharing hurdles more quickly and in the shorter term.”
TEE tests
Some technologies like trusted computing show promise, costing less to perform encryption and decryption and only breaking when there’s a backdoor in the microcode, Capps said.
The Census Bureau ran a pilot with the Defense Advanced Research Projects Agency and found trusted execution environments (TEEs) scale better than multi-party computation. The TEE core of an eight-core computer was able to collect, process, tabulate and link 1 million transactions a minute for 20 minutes. TEEs are basically walled off in a way that makes them more secure than the rest of a computer.
The bureau wants to arrange a pilot with Microsoft Azure to process with hundreds of computers, Capps said.
At the same time, the bureau is working with the University of California, Berkeley to test whether Spark, a parallel processing system that runs large datasets, can handle regressions, machine learning and other tasks.
Capps envisions linking data in commercial clouds without providing anyone direct access and then running that data through a filter, using differential privacy to add noise, before publishing the information. The Census Bureau hires academics to hack its de-identified data in an attempt to reidentify it.
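As a rough illustration of the noise-adding step Capps describes, here is a minimal sketch of the standard Laplace mechanism used in differential privacy. It is not the Census Bureau’s production pipeline, and the epsilon value and count below are made up.

```python
import numpy as np

def dp_count(true_count, epsilon, sensitivity=1):
    """Release a count with Laplace noise calibrated to sensitivity/epsilon.

    Adding or removing one person changes a count by at most `sensitivity`,
    so noise drawn with scale sensitivity/epsilon masks any individual's
    presence in the published tabulation.
    """
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Illustrative use: a linked tabulation passes through the noise filter before
# publication. The epsilon and the count are hypothetical values.
published = dp_count(true_count=1_204, epsilon=0.5)
print(round(published))
```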
Another data-collection effort related to the pandemic — contact tracing of people exposed to the coronavirus — will provide another test of public trust in government agencies at all levels, officials said.
“Many concerns around the contact tracing apps and tools are concerns that this information will be repurposed,” said Kelsey Finch, senior counsel at the Future of Privacy Forum. “We’ve seen potential drops in adoption rates related to law enforcement taking location information around the protests [of systemic racism] recently.”
Capps doesn’t believe the government can deduce what all the secondary uses of its data might be.
A decade ago, Google developed a Flu Trends tool that geolocated flu searches and even out-predicted the Centers for Disease Control and Prevention’s own disease surveillance model for a time.
“It was a secondary use we hadn’t anticipated,” Capps said. “So I don’t think we’re going to tell data scientists to shut down their brains.”
CMMC requirements show up in GSA’s STARS III contract
New Department of Defense contractor cybersecurity standards have tip-toed into a governmentwide federal contract, even before language around the new program has officially landed in defense contracts.
The Cybersecurity Maturity Model Certification (CMMC) — the new cybersecurity certification standards to be implemented into all DOD contracts over the next five years — was included in the General Services Administration’s $50 billion STARS III contract, posted earlier this week. GSA says it “reserves the right” to require CMMC certifications for small businesses awarded spots on the governmentwide IT contracting vehicle.
CMMC will require contractors to get third-party assessments proving their networks meet a certain maturity level, ranging from one to five with a corresponding increase in security controls.
“STARS III contractors should begin preparing for CMMC,” the contract states, adding that GSA could require STARS III small businesses to meet CMMC level 1 when it comes time for the contract’s five-year option. GSA also says in the contract it “reserves the right to survey 8(a) STARS III awardees from time-to-time in order to identify and to publicly list each industry partner’s CMMC level and ISO certifications.”
STARS III is designed to get federal IT work to small businesses participating in the Small Business Administration’s 8(a) Business Development program, meaning they are majority-owned by “socially and economically disadvantaged individuals.”
The DOD is one of the biggest buyers on STARS III’s predecessor, STARS II, according to Bloomberg Government analysis. Since 2011, DOD has spent more than $3 billion on the contract, which had a $15 billion ceiling until it was recently increased to $22 billion.
“While CMMC is currently a DoD requirement, it may also have utility as a baseline for civilian acquisitions; so it is vital that contractors wishing to do business on 8(a) STARS III monitor, prepare for and participate in acquiring CMMC certification,” the GSA contract says.
Small businesses that bid to be on STARS III must also submit a brief cybersecurity assessment in which GSA asks them to address their “intention in regards to obtaining CMMC, the target certification level, and a tentative timetable for attaining it.”
The DOD has said CMMC will start to show up in defense requests for information this summer and is currently in the process of a regulatory rule change to include CMMC in contracts before the end of the year. The program is still in its tumultuous early phase, with applications for the credentialed assessors that will certify contractors just recently opening.
Katie Arrington, DOD’s acquisition and sustainment chief information security officer and CMMC leader, said her team did not work with GSA on adding the language. For now, she is focused on getting the program up-and-running and in DOD contracts — but “we would certainly embrace any who desire to participate,” she said in an email.
How small businesses will be able to meet the cost and time associated with getting a CMMC certification has been a concern for many that do work with the DOD. Now, that concern could spread to small businesses that do work across the federal government.
“While we are not working directly with GSA on this specific procurement, it is no secret that other Federal agencies are actively watching, exploring and/or considering adoption of CMMC,” Ty Schieber, chairman of CMMC’s accreditation body, said in an email. “We applaud GSA in its forward-thinking by positioning the CMMC as an anticipatory element in this procurement.”
Others applauded GSA for including the CMMC language in its contract.
“I am pretty impressed that GSA took the initiative,” said Alan Chvotkin, executive vice president and counsel to the Professional Services Council, a government contractor trade association. Chvotkin said he is advising all contractors to practice good cybersecurity, whether CMMC will become a requirement or not.
The program’s initial rollout is largely being handled by the all-volunteer, nonprofit accreditation body and has had several stumbles. The assessors that will certify contractors to do work with the DOD, their training, the assessment methodology and most other parts of the CMMC ecosystem have yet to be fully rolled out.
This is not the first time CMMC’s potential growth beyond the defense world has been raised. Arrington has previously said she believes CMMC will become a federal requirement “very rapidly.” She has even said CMMC could become an international standard and part of cybersecurity insurance. In past remarks, she has cited the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) as another interested agency.
“CMMC has become widely recognized as the path to ensure our industry partners have adequate safeguards in place to protect our data,” Arrington told FedScoop.