James Saunders takes CISO role at Office of Personnel Management
The Office of Personnel Management has named James Saunders as chief information security officer.
He starts work in the new role Feb. 28 after joining the agency last year as a senior adviser for cloud and cybersecurity.
Previously, Saunders held the post of CISO at the Small Business Administration and moved to OPM in April 2021. One federal IT source speaking to this publication said that Saunders has already been acting as an “unofficial CISO” since joining the agency.
At the SBA, Saunders worked closely with then-deputy CIO Guy Cavallo, who subsequently moved to OPM as deputy CIO and in July was installed as permanent CIO. While at SBA, Saunders and Cavallo worked to implement the requirements of the CARES Act as well as the IT systems for the Paycheck Protection Program and Economic Injury Disaster Loan program.
Technology leaders at OPM have worked to turn around its IT systems since the agency attracted criticism under the Trump administration and was targeted for dismantlement.
In September, the agency said it would support the idea of a working capital fund to finance its backlog of necessary IT modernization projects, if given congressional approval.
This came after the National Academy of Public Administration in March published an independent study that identified glaring IT deficiencies but argued that the agency should not be folded into the General Services Administration.
Congress mandated that OPM’s director commission the report from the National Academy of Public Administration, which included 23 recommendations across a range of issues.
OPM did not respond to a request for comment on Saunders’ appointment.
News of the appointment was first reported by MeriTalk.
Agencies underscore software vulnerabilities in supply chain assessments
Several Cabinet agencies published reports Thursday citing the current software ecosystem as a key weakness across supply chains crucial to U.S. economic prosperity and national security.
In their joint report, the departments of Commerce and Homeland Security found open-source software and firmware within the information and communications technology (ICT) industry vulnerable to exploitation by foreign adversaries and criminal groups, while the Department of Energy’s report deemed untrusted software developers a key vulnerability within the clean energy supply chain.
President Biden’s executive order on America’s Supply Chains, issued in February 2021, gave seven Cabinet agencies a year to assess six critical industries for supply chain vulnerabilities, with software a major one.
“The ubiquitous use of open-source software can threaten the security of the software supply chain given its vulnerability to exploitation,” reads Commerce and DHS’s report. “Furthermore, the complexity of the ICT supply chain has led many original equipment manufacturers (OEMs) to outsource firmware development to third-party suppliers, which introduces risks related to the lack of transparency into suppliers’ programming and cybersecurity standards.”
The pandemic revealed an overreliance on software developers with opaque supply chains and a high risk of “cascading effects” should their products be compromised, according to DOE’s report.
For that reason, Commerce and DHS recommended increasing investment in domestic software development, which already accounts for 40% of the U.S. workforce but still faces a talent shortage.
DOE recommended developing new supply chains for emerging technologies like machine learning and artificial intelligence with cybersecurity in mind, given that energy sector systems are increasingly interconnected and automated.
“With the increasing application of AI/ML capabilities to the operation and defense of U.S. energy sector systems, and the centrality of DOE AI/ML research and development efforts (at DOE National Laboratories) to national and economic security, a proactive approach to ensuring cybersecurity and integrity of the global supply chain for data is critical,” reads the report.
The report further advised DOE to partner with other agencies to create an Energy Sector Industrial Base Database, along with analytical and decision-modeling capabilities, while increasing oversight.
Commerce and DHS suggested promoting cybersecurity-supply chain risk management (C-SCRM) practices through procurement and monitoring efforts, including the establishment of a Critical Supply Chain Resilience Program at the former.
Similarly, the Department of Defense called cyber posture “essential” to mission success in its report and stressed a focus on C-SCRM to counter threats presented by suppliers, their products and subcomponents, and the supply chain itself. That’s especially true for high-priority suppliers and integrators of missile systems and munitions, according to the assessment.
More than 220,000 companies make up the defense industrial base.
“The size and complexity of defense procurement activities offer numerous pathways for adversaries to access sensitive systems and information,” reads DOD’s report. “New entry points for U.S. adversaries are created daily as companies use technologies in new and innovative ways across supply chains.”
To that end, DOD recommended improving cyberthreat intelligence with more detailed Validated Online Lifecycle Threat (VOLT) reports and quarterly cyberthreat intelligence briefings for program offices and key acquisition officials. The department further advised increasing the sharing of unclassified and classified cyber intelligence by growing its Cyber Crime Center’s (DC3) Defense Collaborative Information Sharing Environment (DCISE) and the National Security Agency’s Cybersecurity Collaboration Center.
DOD also plans to increase interagency partnerships, develop international cyber approaches, and require timely and complete incident reporting from its contractors — while making cyber expectations clearer for them during the procurement process.
“To leverage commercial sector innovations, and to embed modernizing technologies in weapon systems, the DOD will work, where possible, to limit its use of military-unique requirements when developing performance requirements,” reads its report.
Antitrust review waiting period ends in Oracle-Cerner deal
The initial antitrust waiting period for Oracle’s $29.8 billion bid to acquire the main technology provider behind the Department of Veterans Affairs’ electronic health record modernization program expired Monday.
Oracle’s all-cash offer to acquire Cerner for $95 per share is scheduled to expire at midnight on March 16. In a press release, Oracle said the parties “anticipate extending the tender offer to allow additional time for the satisfaction of the remaining conditions to the tender offer.”
The two companies are seeking clearance for the transaction to proceed. The deal would be Oracle’s largest-ever acquisition, and earlier this month the waiting period was extended until midnight on Feb. 22.
Under the Hart-Scott-Rodino Act, the Federal Trade Commission and the Department of Justice’s antitrust division typically have 30 days to conduct a preliminary review of the deal.
At the time of the transaction announcement, both companies said the transaction would improve the availability of technologies such as cloud, artificial intelligence and machine learning to federal agencies, and that the goal would be to focus on delivering zero unplanned downtime for Cerner systems running on Oracle’s Gen2 cloud.
Oracle announced in December that it had signed an agreement to acquire Cerner and that following completion of the prospective deal, the medical records company would become a standalone business unit within Oracle.
Cerner’s Millennium platform makes up the backbone of the VA’s EHR modernization program, which has attracted scrutiny from lawmakers in response to escalating costs and concerns over the new medical records system raised by frontline doctors.
The DOJ has the power either to halt a transaction entirely or to require divestitures before a deal can proceed. It may also require an extension of a deal waiting period.
Defense R&D contractors inadequate in protecting sensitive data, IG says
Contractors that research and develop new technologies for the Department of Defense are not consistent in safeguarding the DOD’s controlled unclassified information, according to a new audit by the Pentagon’s inspector general.
The watchdog reviewed the cybersecurity controls of 10 such contractors and found issues in how they used multifactor authentication, identified and mitigated vulnerabilities in their systems, encrypted systems and protected against users transferring controlled unclassified information (CUI) via removable media, among other things, as required by DOD’s acquisition laws.
The audit comes as the DOD moves toward requiring contractors that handle CUI — data that is sensitive and “requires safeguarding or dissemination controls” but is not classified — to certify their cybersecurity maturity either through a third-party assessment or, in some cases, self-assessment under the Cybersecurity Maturity Model Certification (CMMC).
But CMMC isn’t set to be in place across the DOD until at least fiscal 2025, and until then, it’s incumbent upon contracting officers to verify that contractors adhere to cybersecurity requirements for handling CUI set by the National Institute of Standards and Technology, per an interim Defense Federal Acquisition Regulation Supplement (DFARS) rule implemented in September 2020.
The issue is that DOD interprets that rule to apply only to contracts awarded or modified after Nov. 30, 2020. And thus, “the interim rule does not apply to existing contracts, which includes all of the contracts that we reviewed during the audit,” the IG says.
“Without a framework for assessing cybersecurity requirements for existing contractors, the cybersecurity issues identified in this report could remain undetected on DoD contractor networks and systems, increasing the risk of malicious actors targeting vulnerable contractor networks and systems and stealing information related to the development and advancement of DoD technologies,” the report reads.
Therefore, in the interim, the IG recommends that contracting officers “independently” assess and verify if contracting institutions comply with cybersecurity requirements.
“If the DoD does not verify that all contractors using CUI implement … requirements, regardless of when the contract was awarded or modified, there is an increased risk that DoD CUI related to national security could fall into the hands of our adversaries,” the IG said.
The Defense Pricing and Contracting principal director, however, disagreed with the IG’s recommendation and has not taken action to resolve it, arguing that enforcement would require additional rulemaking and would “result in substantial administrative and financial burden to the DoD.”
According to the IG, that’s not the case. Under the existing rule, it argues in the report, “contracting officers had the authority to require additional assessments as outlined” in NIST’s standards, and the director should instruct contracting officers to exercise that authority.
The IG’s findings are especially relevant today, as defense contractors come under increasing attack from bad actors, including nation-state adversaries like Russia. Earlier this month, the Cybersecurity and Infrastructure Security Agency issued an alert declaring that for nearly two years, “Russian state-sponsored cyber actors” have targeted the emails and other data of U.S. defense contractors that handle sensitive information about weapons development, computer systems, intelligence-gathering technology and more.
Judge limits evidence ahead of False Claims Act case against NortonLifeLock
D.C. District Court Judge Rudolph Contreras limited evidence in a case where NortonLifeLock Inc. is accused of overcharging agencies on a General Services Administration contract, ahead of the trial’s Feb. 28 start date.
According to court documents, Contreras struck some “previously undisclosed opinions” from the Arizona-based cybersecurity software company’s demonstrative exhibits, at the federal government’s request, while denying the company’s cross-motion that would have allowed both parties to disclose exhibits only 72 hours before use.
The federal government; California; Florida; and Lori Morsell, on behalf of New York, accuse Norton, formerly known as Symantec Corp., of violating the False Claims Act by misrepresenting the software prices and discounts available to agencies between 2007 and 2014. Norton’s contract required such disclosures and included a price reduction clause triggered should private customers receive a better deal on its software, as the plaintiffs allege occurred.
Contreras agreed to exclude certain evidence in August offered by a Norton expert detailing which software sales should be included in damages and criticizing a government expert’s assumption that liability for damages should be tied to resellers’ and distributors’ sales. Both parties revised their exhibits accordingly in September, but the government accused Norton of using that time to introduce new opinions in its slides and filed a new motion to strike them.
In his decision dated Jan. 31, the judge declined to strike every new slide bullet point, including one criticizing the reasonableness of the government expert’s assumptions, reasoning that doing so differs from criticizing the assumptions themselves and that a judge in a bench trial can recognize the difference. But he did strike a new summary of the government expert’s damages calculations because Norton failed to disclose it originally.
Contreras also found the original exhibit deadline was reasonably met by both parties and, in an effort to resolve evidence disputes before trial, denied Norton’s motion to revise exhibits based on new testimony and evidence.
“The Court does not foreclose the possibility that under exceptional circumstances the parties may need to revise specific demonstratives between now and the start of trial, but it fails to see any gains in judicial efficiency by allowing the parties to continuously revise demonstratives on a rolling basis throughout trial and dealing with last-minute disagreements that could have been resolved weeks earlier,” Contreras wrote, in his memorandum opinion. “The Court therefore declines to modify the pretrial order as proposed by Norton.”
GSA and NortonLifeLock did not respond to a request for comment.
Transportation Command migrating applications to Air Force’s Cloud One
The U.S. Transportation Command plans to migrate its cloud applications to the Air Force’s Cloud One platform over the next two years.
Senior IT acquisition officials told participants at a virtual industry day Wednesday that they have met with the Cloud One team to kick off the partnership. The work is focused for now on a handful of pilots: migrating USTRANSCOM’s transportation modeling and simulation tool, along with both classified and unclassified versions of its Global Transportation Planning systems, to the cloud.
Based on the progress made through the remainder of fiscal 2022 with those first pilots, the command will look to move more of its programs of record to Cloud One’s services in fiscal 2023.
The agency had planned to move its cloud capabilities, currently hosted in Amazon Web Services’ GovCloud, to the Department of Defense’s now-defunct Joint Enterprise Defense Infrastructure (JEDI) cloud platform. But with JEDI’s cancellation, USTRANSCOM had to find a new option.
“Like many organizations, we were projected to transition to what was once known as JEDI,” said Scott Borchers, chief of the DevOps & Pipeline Division in USTRANSCOM’s J6 office. “And when the JEDI contract was ultimately canceled, we made a decision to transition to other DoD approved enterprise cloud contracts and services.”
Other options on the table were the Defense Enterprise Office Solution (DEOS), which USTRANSCOM does use for collaboration tools, and the Defense Information Systems Agency’s milCloud 2.0, which DISA recently announced would sunset later this year.
Borchers said the plan is to leverage the Air Force’s Cloud One for infrastructure-as-a-service — offering AWS, Google Cloud Platform and Microsoft Azure options — and then use the branch’s Platform One capability as well, “with the intent to speed capability to delivery” for software development.
Jim Lovell, a program executive officer supporting the cloud migration, said migrating to the Air Force’s existing offerings will fast-track USTRANSCOM’s adoption of DevSecOps and automated authority to operate (ATO) capabilities, as well as “a lot of the more progressive cloud features that you would expect from a world-class cloud hosting environment.”
“Today we have two or three programs that deliver [software] in three-week sprints,” Lovell said. “We like that. Our functional community really, really likes that. But we can’t seem to get there with all of our programs, and so that seems to be about a good pace for us where we hope to get there with all of our programs in a more standardized way with the migration to Cloud One.”
At this point, Lovell said, “all indications are we’ll be done with the first three [pilots] in this fiscal year.” And over the next “30 to 45 days,” the command will be finalizing contract actions to make the move.
“We’re well underway,” he said.
Why data analytic platforms hold the key to smarter cloud investments
As federal IT leaders continue to assess how best to manage their data and applications in the cloud and on-premises, many must still confront a deeper challenge, says a new report: The need to establish an enterprise-wide view and understanding of their data.

Having a comprehensive federal data strategy involves more than cataloging what data your agency has, which data is most valuable and where it resides. It also requires the ability to gather and analyze operational and security data in real time — and then have the additional ability to discern how various types of data are being put to work over their lifetimes.
“The world is moving to a place where there is too much data coming at us all day, every day,” says Juliana Vida, group vice president and chief strategy adviser at Splunk. A former deputy CIO at the Pentagon and retired U.S. Navy commander, Vida argues that despite the challenges of capitalizing on cloud services, agencies have “no other option” but to move to the cloud.
“There is no other way to manage the volume, velocity, variety, and pace without leveraging cloud technology. So the end state has to be figuring out how and when to leverage these mature data analytics capabilities that are optimized in the cloud,” she says.
But without a foundational data strategy upfront — and the tools to develop and foster that strategy — deciding what to move to the cloud becomes even harder than it already is, she says in the report, “Why data analytic platforms hold the key to smarter cloud investments.”
Vida, and others in the report, maintain that organizations need to move beyond rationalizing applications and data centers in the name of efficiency. Instead, they need to adopt a platform approach that has the capability to gather, unify, analyze and act on data from all types of systems across an organization, including data operating in the cloud.
“The right platforms help you identify which data you’re actually using, which applications you actually need … and take the human effort out of it to figure out what’s important,” she explains.
Without a fully informed data strategy, organizations run the risk of transferring workloads to the cloud only to lose out on the potential insights and value the cloud can offer, says Geoff Woollacott, a senior strategy consultant and principal analyst at Technology Business Research.
“Cloud solutions alone will not deliver data clarity,” adds Dion Hinchcliffe, vice president and principal analyst at technology research and advisory firm Constellation Research. “In fact, they may create even less clarity because the data may be more dispersed.” That’s also in part because cloud providers only see a portion of a customer’s data.
The report cites findings from a recent Harvard Business Review Analytic Services study that found 66% of executives polled globally say that leveraging real-time data analytics is “very or extremely important to monitoring and gaining insights across cloud services, applications and infrastructure.”
The report, underwritten by Splunk, highlights how Splunk’s Data-to-Everything Platform was instrumental in helping the U.S. Census Bureau unify and analyze data from its 52 systems and 35 operating divisions on a single platform. That effort was part of a sweeping initiative to overhaul the bureau’s legacy systems and build a cloud-enabled IT environment in time for the 2020 census.
The report also highlights how Splunk’s Security Orchestration, Automation and Response (SOAR) system can support hundreds of tools and thousands of unique APIs, enabling IT teams to coordinate complex workflows.
“Platforms aren’t just a solution for putting your data into a cloud,” says Vida. “It’s being able to see across the entire lifecycle of the data and where it’s being used to help inform these decisions about migration and where to place investments — and where to pivot from what we used to do, to what we want to do. It offers end-to-end visibility of the data. And not all platforms do that.”
Vida also describes how establishing real-time observability puts federal agencies in a stronger position to achieve four longer-term benefits that add value to their investment strategies: greater efficiency, resiliency, security and innovation.
Download and read the full report.
This article was produced by FedScoop and sponsored by Splunk.
FDA made improving diagnostic test data the focus of its pandemic response
Improving the use of diagnostic test data to inform public health decision-making was the “primary focus” of the Food and Drug Administration’s pandemic response, according to the agency’s chief medical officer for in vitro diagnostics.
The FDA recognized early in the pandemic that it would need to get better at aggregating and analyzing data at scale to handle the large volume coming in from widely distributed COVID-19 tests, said Sara Brenner, speaking at an AFCEA Bethesda Health IT event panel Feb. 15.
“COVID-19 is tip of the spear for what will hopefully be a sea change in terms of patients having the ability to have data from diagnostic tests that are widely distributed and powering our country’s ability to utilize diagnostic testing data at scale for diseases beyond COVID,” Brenner said.
Using diagnostic test data to improve decision-making requires regulations that standardize data at the source. Medical devices developed by the health industry for home and over-the-counter use increasingly permit such standardization.
When it comes to tethering emerging technologies to existing data standards, it’s the Office of the National Coordinator for Health IT within the Department of Health and Human Services that has to play “bad cop,” added Ryan Argentieri, deputy director of ONC’s Office of Technology.
ONC has been pushing the use of open application programming interfaces and recently added new data classes and elements for electronic exchange in draft Version 3 of the U.S. Core Data for Interoperability.
Once the quality of health data is assured, HHS can push faster adoption by government and industry of its preferred Fast Healthcare Interoperability Resources (FHIR) data standard.
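For readers unfamiliar with the standard: FHIR exposes health data as JSON “resources” served over a REST API, where a search (for example, `GET [base]/Patient?name=smith`) returns a `Bundle` resource wrapping the matches. The sketch below, with an entirely hypothetical Bundle payload and endpoint not drawn from the article, shows how a client might parse such a search result:

```python
import json

# A FHIR search (e.g. GET https://example.org/fhir/Patient?name=smith)
# returns a Bundle resource. This payload is a hypothetical sample.
sample_bundle = json.loads("""
{
  "resourceType": "Bundle",
  "type": "searchset",
  "total": 1,
  "entry": [
    {"resource": {"resourceType": "Patient",
                  "name": [{"family": "Smith", "given": ["Jane"]}]}}
  ]
}
""")

def patient_names(bundle: dict) -> list[str]:
    """Extract 'Given Family' display names from a FHIR searchset Bundle."""
    names = []
    for entry in bundle.get("entry", []):
        resource = entry.get("resource", {})
        if resource.get("resourceType") != "Patient":
            continue  # a searchset may also carry OperationOutcome etc.
        for name in resource.get("name", []):
            given = " ".join(name.get("given", []))
            names.append(f"{given} {name.get('family', '')}".strip())
    return names

print(patient_names(sample_bundle))  # prints ['Jane Smith']
```

Because every conformant server returns the same resource shapes, code like this works against any FHIR endpoint, which is the interoperability payoff HHS is pushing for.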
For that reason FDA and ONC are a part of three interagency working groups created by President Biden’s January 2021 Executive Order on Ensuring a Data-Driven Response to COVID-19 and Future High-Consequence Public Health Threats: Enhancing Data Collection and Collaboration Capabilities, Public Health Data Systems, and Innovation in Data and Analysis.
The FDA further leads the Diagnostic Data effort for the HHS Data Strategy and Execution Workgroup, one of several related to the national COVID-19 pandemic response that also include the Centers for Disease Control and Prevention, National Institutes of Health, and HHS.
“These include technical, policy and implementation efforts in data standardization, harmonization and reporting for SARS-CoV-2 diagnostic tests; supporting data analytics; building data transmission and ingestion pipelines for high quality, high volume data in real time; and supporting the stakeholder community in adoption and implementation of data standards, capture and reporting,” an FDA spokesperson told FedScoop.
The FDA’s Digital Health Center of Excellence is chiefly responsible for tailoring medical device regulations for digital technologies, and the agency is also staffing up its programs for digital diagnostics outside of laboratories and Semantic Harmonization and Interoperability Enhancement for Laboratory Data (SHIELD).
Both of those efforts involve interagency collaboration with ONC.
“It’s a very concrete way in which we’re already utilizing lessons learned from COVID with future health technologies and how this will benefit patients beyond SARS-CoV-2 into all different areas of medicine,” Brenner said.
HHS selects Unqork to provide payroll digitization services
The Department of Health and Human Services has awarded Unqork a contract to digitize payroll processing within the agency’s Office of the Assistant Secretary for Health (OASH).
OASH is currently working to develop applications that streamline manual business services as part of a project led by the Office of the CIO.
Unqork is a no-code platform that currently works with government agencies through the NASPO ValuePoint Cloud Solutions Contract and the NASA SEWP V GWAC.
Unqork recently received an “In Process” FedRAMP designation and is working with HHS to achieve FedRAMP authorization status.
OASH is the largest single staff division within HHS and is responsible for coordinating public health policy and programs. No further details of the contract size or scope were disclosed.