Acquisition officials highlight need for transparency in AI discussions with industry
Transparency about what artificial intelligence technologies can actually do is key to conversations about the government potentially purchasing the technology, two government acquisition officials said Thursday.
Officials from the General Services Administration and NASA underscored the need for honest conversations and updated ways of thinking about contracts in a panel discussion about keeping pace with innovations in government technology purchasing. That discussion, during a Professional Services Council event on federal acquisition, focused heavily on purchasing AI, whose boom in popularity has also reverberated throughout the government.
“What I’m seeing as a buyer of this type of technology is I’m being sold the world, and when I go to look at it, it’s not really the world. It’s this little dirt path on the corner,” said Geoff Sage, director of the Enterprise Service and Analysis Division in NASA’s Office of Procurement.
Sage noted that generative AI is “changing the game every single day,” so something that’s important for his agency is the ability to “take baby steps to prove out a bigger concept.” Those efforts can be learning opportunities, he said.
Udaya Patnaik, chief innovation officer for the Office of IT Category in GSA’s Federal Acquisition Service, said the challenge with trying to “wrangle a constantly evolving technology” is that the capabilities of that technology aren’t clear.
“That requires a level of transparency between industry and government to really say, ‘look, this is what we know, and this is what we don’t know,’” Patnaik said. For example, he said industry needs to be able to identify where a model comes from, the data it’s trained on and the biases that could exist in the system.
The discussion comes as the Biden administration and members of Congress look at ways to address how the government purchases AI. The Office of Management and Budget recently solicited information from the public to inform its work to ensure that federal agencies’ procurement of AI is responsible. A bipartisan Senate bill would mandate that agencies assess the technology’s risks before purchasing and using it.
In addition to transparency, Patnaik said it’s important to look at contracts “openly,” because the way AI or machine learning technologies were acquired 10 or 15 years ago is no longer relevant.
That requires “an unprecedented level of real tight coordination and conversation between the acquisition community, the legal community, and the technical community to really understand what’s there and what’s not,” Patnaik said.
With respect to older methods of buying, Sage similarly said “we need to be more innovative.”
Because the technology has proliferated across different areas, he explained, there is heightened focus on topics that come with generative AI, such as data rights and copyright infringement.
Sage said NASA has been pushing for early and open communications internally that include the office of the chief information officer, lawyers, and technical professionals from day one.
In an interview with press at the same event, PSC President and CEO David Berteau said keeping pace with the technology’s rapid evolution and evaluating results are “two competing dynamics” that the White House has to focus on in its action.
“How do you pace the government’s incorporation with the pace of development of technology is the first key question. The second is, what’s it worth?” Berteau said.
He said that AI isn’t like code, where there was an established methodology for creating a proposal and estimating how much it would cost to write lines of code. “Now it looks like it’s almost instantaneous, but may be exactly worth nothing,” Berteau said.
Bipartisan Senate bill wants Commerce secretary to raise awareness of AI jobs
A new bill from Sens. Todd Young, R-Ind., and Brian Schatz, D-Hawaii, would have the Commerce secretary take on a slew of new responsibilities related to raising public awareness about artificial intelligence.
As part of the legislation, Commerce would have a role in conducting outreach for AI jobs available in the government — addressing an ongoing struggle for the public sector as it competes with private industry and its much higher salaries.
Specifically, the legislation — called the Artificial Intelligence Public Awareness and Education Campaign Act — would charge Commerce with promoting “opportunities to work in the Government, for technologists and others with experience in the development, deployment, and use of artificial intelligence, including to institutions of higher education.”
The bill would also have the Commerce Department measure the efficacy of the campaign through “key performance indicators” and highlight the rights of individuals under law as they relate to AI.
The legislation also includes components related to raising public understanding of AI-modified content and deepfakes — and focuses on communities, like seniors, that might be particularly vulnerable to AI-facilitated scams.
“As AI tools and content become increasingly common, it’s essential that the public is aware of the risks and benefits associated with them,” Schatz said in a statement. “Our bill will direct the Commerce Department to educate the public about how best to take advantage of these tools while staying vigilant to AI-enabled scams and fraud.”
The introduction of the bill comes amid a flurry of AI-related legislative proposals, though Congress has yet to pass any major new laws related to the technology.
White House, GSA move to streamline procurement process with new data tool
A new procurement tool launched by the White House and the General Services Administration will streamline market research for federal agencies and act as a “complement” to the current request for information process, an Office of Federal Procurement Policy official said in an interview Thursday.
The Procurement Co-Pilot tool, an online platform for federal use unveiled last week, uses publicly available government data from platforms such as SAM.gov and the GSA’s Transactional Data Reporting program, providing a web application that supports “robust pricing and contract research” for acquisition professionals and program managers, OFPP senior advisor Christine Harada told FedScoop.
Harada said OFPP, which is housed within the Office of Management and Budget, envisions the new tool dovetailing with the existing RFI process that agencies currently use within the Federal Register. That process, she said, is a “tried and true” — albeit slower — method, but still helpful to agencies looking to “broaden the net as much as possible.”
The new tool “is not intended to replace” the old process, but is “actually meant to complement and supplement RFIs,” Harada said.
“Obviously, this has a lot of tremendous benefits for the federal contracting community, because it puts all of this information truly at our fingertips and we can find it now in real time,” she added. “But I think there’s also a really great benefit for companies as well.”
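The kind of research the tool supports can be pictured with a small, hypothetical sketch: aggregating prices paid for one item category across public award records. The data and field names below are invented for illustration and do not reflect the Procurement Co-Pilot’s implementation.

```python
# Hypothetical sketch of the kind of market research such a tool
# automates: summarizing unit prices paid across public award records.
# The records and field names below are invented for illustration.
from statistics import median

awards = [
    {"vendor": "Acme IT", "item": "laptop", "unit_price": 1150.00},
    {"vendor": "Beta Corp", "item": "laptop", "unit_price": 980.00},
    {"vendor": "Gamma LLC", "item": "laptop", "unit_price": 1310.00},
]

def price_summary(records, item):
    """Median and range of unit prices paid for one item category."""
    prices = sorted(r["unit_price"] for r in records if r["item"] == item)
    return {
        "n": len(prices),
        "low": prices[0],
        "median": median(prices),
        "high": prices[-1],
    }

print(price_summary(awards, "laptop"))
# -> {'n': 3, 'low': 980.0, 'median': 1150.0, 'high': 1310.0}
```

In practice the inputs would come from sources like SAM.gov and Transactional Data Reporting rather than a hard-coded list; the point is only that pricing research reduces to queries over shared award data.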
The OFPP’s product launch follows the Strategic Management of Acquisition Data and Information circular, or Circular A-137, which OMB released in May. The circular called for the establishment of a “centralized data management policy framework” to encourage data sharing between agencies, a “Hi-Def Environment (HDE)” for users to access data, and tools and resources for acquisition-specific decision-making.
Kristen Wilson, a strategic acquisition data management lead within OFPP, said in an interview with FedScoop that the office intends to engage with agencies’ chief data officers, chief information officers and chief acquisition officers to develop and implement a comprehensive data governance plan.
“We’re not telling agency CDOs what to do in their data strategy within their agencies,” Wilson said. “We’re specifically just addressing how data gets shared into the [HDE] — and this is a governmentwide effort versus an agency-specific effort.”
In the circular, OMB highlighted critical data modernization efforts in progress, such as implementation of the Foundations for Evidence-Based Policymaking Act of 2018, often referred to as the Evidence Act. OMB maintains that there is still a need to make data accessible across the entire government.
GSA did not respond to a request for comment by the time of publication.
The AI leadership imperative: Preparing federal agencies for AI’s impact
A new report underscores the urgent need for federal government leaders to help their executives better understand and embrace artificial intelligence’s rapid emergence, so they can meet the challenges and opportunities AI will bring to their organizations.
The new report, “Leading Agency Innovation in the Age of AI,” asserts that federal officials must make a broader effort to educate their leadership teams and support a work environment that encourages leaders to identify appropriate AI use cases and lay the groundwork for moving from the possible to the practical.
The report, produced by Scoop News Group and underwritten by Microsoft, highlights the work of one organization — the nonprofit, nonpartisan Partnership for Public Service — that has been working behind the scenes across government to cultivate a growing cohort of senior executives who more fully understand what AI can and cannot do and how to put it to practical use at their agencies.

The Partnership’s AI Federal Leadership Program brings together qualified senior executives through a six-month course that explores AI’s capabilities and potential impact on government agencies and culminates in executives having to develop an AI project roadmap to implement at their agencies. Since its inception, more than 500 senior executives from 40 agencies across the federal government and over 30 states have completed the program.
“The program not only serves as a model for training government leaders about AI and its impact but is also producing a growing — and much-needed — cohort of senior government executives who are better equipped to guide their agencies through AI transformation,” the report says.
The report highlights several elements that are important for agency leaders to embrace if they are to prepare for the expected impact AI will likely have on their workforce, their operations, and their missions:
Sharing lessons learned: A vital aspect of the program is the opportunity for participants to share their AI aspirations and application lessons with their peers. As Nancy Potok, a former program coach, notes, “The mingling of people who have different levels of technological expertise and experience and come from very different organizational cultures is one of the strengths of the course because they learn from each other.”
Access to AI experts: The program connects participants with technical experts on the front lines of AI development. A chief technology officer from a cabinet-level department emphasizes the value of this access, stating, “The challenge with the federal government is that there are probably [only] about 50,000 people in the world that can engage you about AI and know what they’re talking about.” That makes it difficult, he says, for senior executives to get a first-hand feeling of how AI works at an enterprise level and how to strategize its use.
Focusing on the problem, not just the solution: Patricia Cogswell, a former Homeland Security executive who served as a program facilitator, highlights the importance of defining the problem before selecting an AI use case. She cautions against simply “picking a solution and then finding a problem,” emphasizing the need to identify mission-critical challenges that AI can help address.
The report also includes 12 lessons that participants collectively say they learned from their experience in the program and the pilot projects they later developed at their agencies.
Additionally, it highlights examples of ways federal agency leaders, including Eric Stein, Deputy Assistant Secretary for Global Information Services at the U.S. State Department, applied the lessons they learned from the AI leadership program at their agencies.
If the federal government is to gain genuine traction in adopting AI, agencies — and the government as a whole — need to take a more orchestrated approach to training federal workers at all levels, says one cabinet-level chief technology officer in the report. He recommends AI orientation and training programs that focus on the needs of:
- Executives – so that they become familiar with the language of AI and what it can do.
- Middle managers – because they are the people consuming vendor services, they must be able to tell vendors what’s needed and what to buy.
- Acquisition specialists – because they need to know how to write the contracts.
- End users – so that people can understand how to use AI correctly (and know not to upload data into some public engine).
- Security specialists – because nobody understands how AI will be misused.
Another federal AI leader cited in the report stressed the importance of experimentation. “Being able to see something that is operational — that may not be the end product but that gets you halfway there and then be able to iterate — is critical to get momentum, to give people an idea of what the art of the possible is,” he said.
Download the full report to learn more about “Leading Agency Innovation in the Age of AI.”
This article was produced by Scoop News Group for FedScoop and underwritten by Microsoft.
ARPA-H enters $19M contract with Palantir for artificial intelligence, data software
The Advanced Research Projects Agency for Health will use Palantir’s AI and data software to support its data infrastructure and track the progress of its research programs under a $19 million contract being announced Thursday.
Under the two-year contract, ARPA-H will deploy Palantir’s AI Platform (AIP) and Foundry software to “rapidly collect, synthesize, analyze, and make decisions from a range of data sources,” according to a release shared with FedScoop ahead of the announcement. Those tools will be used across a variety of agency operations, including with performance data related to ARPA-H’s programs, helping the agency track progress and make decisions.
Alastair Thomson, the acting director of data innovation at ARPA-H, told FedScoop that when it comes to internal operations, the agency’s goal “is to be very, very data driven.” That extends to the design of the agency’s programs and how they’re monitored.
“We want to find what is revolutionary, not evolutionary, and so to do that, we’ve got to understand really what is the state of the art in a particular field?” Thomson said. “And so a lot of that comes from data.”
The contract comes as ARPA-H is still building its infrastructure. The Department of Health and Human Services agency was established in 2022 to support ambitious and innovative research in health and biomedical fields. Its programs so far include strengthening hospital infrastructure in the face of cyberattacks, developing technologies to remove cancerous tumors, and building a mobile health program powered by electric vehicles.
“We’re a new agency. We don’t have a lot of systems in place,” Thomson said. “Palantir is going to be a big part of implementing those systems that are really tailored in a way that makes sense to our unique mission.”
Thomson also noted, though, that an advantage of being a new agency from an infrastructure standpoint is that “we’re not just talking about cloud-first — we’re cloud-only.” Palantir has already deployed a cloud-based instance of its data platform at the agency. One appeal of the technology for ARPA-H, Thomson said, was how quickly it could be up and running.
Hirsh Jain, the head of public health and senior vice president of federal at Palantir, told FedScoop that the company is excited about bringing software systems to ARPA-H to help the agency “run their own core operations more effectively.”
ARPA-H is a “really unprecedented opportunity to drive research and development in health care in the private sector in a way that has the potential to be so instrumental to achieving the next generation of health outcomes,” Jain said.
What Palantir’s software is “going to do is provide a lot of the core infrastructure necessary to run those programs, make those R&D investments, [and] allocate those dollars as effectively as possible,” Jain said.
Foundry is software aimed at data integration, helping organizations pull existing data sources together and map them into what Palantir calls the “ontology” — a way for an organization to see its operations, Jain explained. AIP, which launched last year, is a way for organizations “to use large language models on top of their existing data that they already have access to,” he said.
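To make the “ontology” idea concrete, here is a minimal, hypothetical sketch of mapping two flat data extracts into linked, typed objects. It is illustrative Python only, not Palantir’s Foundry API, and every name in it is invented.

```python
# Illustrative sketch of an "ontology" in the generic sense described
# above -- typed objects mapped from raw data sources. This is NOT
# Palantir's Foundry API; all names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Milestone:
    name: str
    due: str        # ISO date string from the source system
    complete: bool

@dataclass
class Program:
    program_id: str
    title: str
    milestones: list[Milestone] = field(default_factory=list)

def build_ontology(program_rows, milestone_rows):
    """Map rows from two separate source tables into linked objects."""
    programs = {r["id"]: Program(r["id"], r["title"]) for r in program_rows}
    for m in milestone_rows:
        # Join on the shared program id, as a data-integration layer would.
        programs[m["program_id"]].milestones.append(
            Milestone(m["name"], m["due"], m["complete"])
        )
    return programs

# Two flat extracts become one navigable object graph.
programs = build_ontology(
    [{"id": "P-1", "title": "Tumor-removal tech"}],
    [{"program_id": "P-1", "name": "Prototype demo",
      "due": "2024-09-30", "complete": False}],
)
print(programs["P-1"].milestones[0].name)  # -> "Prototype demo"
```

The idea, as Jain describes it, is that an LLM-based layer then operates over linked objects like these rather than over raw tables.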
When it comes to monitoring its programs, Thomson said the software will help the agency inform decisions with data. Each of the agency’s programs has metrics, milestones and deliverables that the agency “aggressively” monitors, he said, and ARPA-H doesn’t want to keep trying something that isn’t working.
“Making good data-driven decisions about when to stop and pivot and try a different approach is really, really critical to us,” Thomson said.
Ultimately, some of the insights from those platforms are intended to be made public. “Our intent is to leverage the platform to produce reports and analytics and things that can be made available to the public as time goes on, so they can see, yes, we are being effective,” Thomson said.
The agency also plans to use the software’s AI capabilities to analyze the scientific context in which a particular program operates, synthesizing related publications and other sources of knowledge in that area.
“There are things that we can very quickly assess using AI and get a perspective on it,” Thomson said. He added that reviewing publications and other information is still part of the process, but AI helps to “get a more complete perspective on things because it’s able to have a much broader view.”
In addition to providing the software, Palantir will provide training and data analytics support for operating staff at the agency. Thomson said ARPA-H wants to create a culture of data within the agency, and the training will help staff make use of the data.
“You can’t be a learning organization without understanding how to understand your data,” he said, adding “training is an important part for us.”
SBA pushes back timeline to implement GAO privacy recommendation
The Small Business Administration has extended its timeline to fully implement a critical cybersecurity recommendation, delaying what a congressional watchdog called “key practices” to protect personally identifiable information.
The Government Accountability Office in September 2022 tasked the SBA with fully defining and documenting a process to ensure that the agency’s top privacy official is “involved in assessing and addressing the hiring, training, and professional development needs of the agency with respect to privacy.”
SBA officials agreed with that recommendation and told the GAO in March 2023 that the agency would update its Privacy Program Plan to “delineate hiring, training, and professional development needs of the agency in relation to privacy.”
However, in priority open recommendations released this month, SBA officials told the GAO that changes in staffing and budget allocations would force the agency to push back its implementation timeline from the second quarter of 2024 to the end of the fiscal year.
“To fully implement the recommendation, SBA needs to ensure that its updated Privacy Program Plan defines how the senior agency official for privacy, or other designated privacy officials, are involved in addressing related agency workforce needs,” the GAO wrote. “Fully implementing the recommendation would help the agency more consistently and effectively identify staffing needs and ensure a well-qualified privacy workforce.”
In response to questions from FedScoop about the delay and what other actions the agency told the watchdog it has taken to “bolster its privacy workforce,” the SBA said that it is reviewing the GAO’s report and had no further comment.
Sen. Joni Ernst, ranking member of the Senate Small Business and Entrepreneurship Committee, “has long been concerned with SBA’s privacy and IT standards especially with the amount of personally identifiable information they have on small business owners,” a spokesperson for the Iowa Republican said in an email to FedScoop.
The spokesperson added that a letter the senator sent last month to the SBA seeking a full accounting of how the agency is making IT investments through its IT working capital fund “has gone unanswered.” Ernst said in a press release that the SBA has mostly used the $22 million fund on “pet projects” around policy changes and artificial intelligence while “the agency as a whole continues to fail federal IT standards and has significant security risks in its systems.”
The press office of Sen. Jeanne Shaheen, D-N.H., who chairs the Senate Small Business and Entrepreneurship Committee, did not respond to a request for comment by the time of publication.
The SBA has plenty of company among federal agencies with unfinished privacy work. According to the GAO, NASA, the Office of Personnel Management, the Social Security Administration, and the departments of Commerce, Defense, Education, Energy, Health and Human Services, Housing and Urban Development, Justice, Labor, State, Transportation and Treasury all have open priority recommendations on privacy dating back to the watchdog’s 2022 report.
TMF investments to modernize campaign finance software, tribal school websites, HR system
Updating federal campaign finance reporting software, modernizing tribal communities’ school websites and streamlining human resources information systems are among the new investments announced Tuesday from the Technology Modernization Fund.
The announcement of the $31.66 million funding round — to the Federal Election Commission, the Department of the Interior and the Department of Energy — came with a warning from the General Services Administration.
“It is essential that Congress provide resources to allow the TMF to continue to meet the growing demand for investments which address constantly evolving technology needs, threats and advancements so that government can deliver better for the American people,” GSA Administrator Robin Carnahan said in a press release.
The funding plea from Carnahan follows a similar call from TMF board chair and federal CIO Clare Martorana, who asked Congress in April to “please fund the TMF.” In March, lawmakers clawed back $100 million from the TMF.
The House, meanwhile, passed a bill last month to enhance TMF procedures and extend its sunset date. The Modernizing Government Technology Reform Act, a revitalization of the 2017 MGT Act co-sponsored by Reps. Nancy Mace, R-S.C., Gerry Connolly, D-Va., and Ro Khanna, D-Calif., was amended to adjust constraints on reimbursements and give agencies flexibility in repaying the fund.
As the House legislation awaits a Senate companion, TMF leaders trumpeted the latest round of funding, with Martorana saying the FEC, Interior and DOE investments “demonstrate a commitment to using technology as a force for positive change — increasing government transparency, improving access to human resources data, and creating more equitable opportunities for underserved communities.”
The FEC, for example, relies on legacy software that runs only on Windows-based PCs, hindering accessibility and security for filers on other operating systems. With an $8.8 million investment, the FEC is looking to modernize its FECFile Online software, making it cloud-based and web-accessible for filers so that it “improves data quality and enhances security.”
Meanwhile, schools funded by Interior’s Bureau of Indian Education — which includes both BIE-operated and tribal-controlled schools — lack websites or “operate with an outdated online presence.” The $5.86 million awarded to BIE will fund a website modernization project to “bridge the digital divide” between these schools and other educational institutions, as well as provide parents and guardians with access to crucial information about school-related activities and announcements.
Lastly, the DOE received $17 million to transition away from its “outdated HR infrastructure” to better support the agency’s workforce and curb the risks the current technology poses. DOE’s HR system, which is set to be replaced by a Software-as-a-Service (SaaS) platform, saw its last “significant investment” about two decades ago, according to the GSA. The modernized approach aims to improve data integrity and accessibility and reduce processing time, while the agency anticipates that the automation of processes “will drive cost savings.”
Larry Bafundo, acting TMF executive director, said in a statement that people are “at the heart of every TMF investment.”
“The TMF, and the agencies we partner with,” he added, “are devoted to making improvements to services, systems and programs that make government more accessible for everyone, from federal workers supporting clean energy to children attending schools in underserved areas.”
GSA unveils new class of Presidential Innovation Fellows focused exclusively on AI
The second wave of 2024 Presidential Innovation Fellows has arrived for a “yearlong tour of duty” focused exclusively on artificial intelligence, the General Services Administration announced Monday.
The new PIF cohort — composed of 11 experts from tech companies, startups and other organizations — will serve eight different agencies in AI-centered roles, furthering the Biden administration’s goal of harnessing the emerging technology and ensuring responsible uses, according to the GSA. The fellows’ work will include projects that use data and AI to enhance the electrical grid’s infrastructure and leverage the technology to increase public access to justice while decreasing risks to consumers.
Olivia Zhu, an assistant director for AI policy at the Office of Science and Technology Policy, said in the GSA’s press release that she appreciates “the PIF program’s role in helping me transition from the tech sector into government, and look forward to watching more technologists join public service.”
The second cadre of fellows follows the 21-person first group, announced in March, whose participants are working across 14 agencies on modernization efforts.
Helena Fu, director of the Department of Energy’s Office of Critical and Emerging Technologies, said in a statement that the DOE fellow “will help propel our VoltAIc Initiative — building AI-powered tools to streamline siting and permitting to help accelerate deployment of clean energy infrastructure.”
The PIF program was launched in 2012 by OSTP before being transferred to GSA in 2013. Since then, the program has hosted more than 260 fellows who have worked at more than 50 agencies. Many of those fellows have gone on to other innovative, often tech-focused government roles.
How PNNL is leveraging AI and ML to enhance infrastructure resiliency
In recent years, the escalating frequency, intensity and cost of natural disasters have posed widespread challenges for federal, state and commercial officials. One such challenge is ensuring the resiliency of the nation’s infrastructure, including the energy grid.
That concern prompted scientists at the Department of Energy’s Pacific Northwest National Laboratory (PNNL) to begin exploring how artificial intelligence (AI) and machine learning (ML) could be used to predict wildfires and mitigate the impact of other potential natural disasters.
Pioneered by experts like Chief Scientist Andre Coleman, PNNL’s groundbreaking work has helped modernize disaster and emergency management by using AI and ML to make sense of vast amounts of satellite and geospatial data in near real-time. This helps public safety officials, utilities and land-use managers better prepare for pending disasters.
Where it started
According to Coleman, PNNL began experimenting in 2014 with a vision to harness imagery from satellites, aircraft and drones to swiftly assess hazards created by wildfires, floods, hurricanes and other weather-related disasters – and their impact on critical energy infrastructure.

“The original idea was to say, ‘Can we quickly make use of these sorts of imaging resources and assess a hazard? What’s the extent of the hazard? Where are we seeing the damage caused from the hazard?’” says Coleman. “And how do we relate those damages to critical energy infrastructure?”
To do this initial work, Coleman says PNNL built machine learning models and tools “to take that imagery and assess it in an automated way.” And then, over time, more machine learning algorithms were put into their models.
Where it stands
While PNNL’s focus at the beginning was on near real-time response, it evolved into proactive risk assessment, allowing emergency management and operations professionals to make risk-mitigation decisions well ahead of a disaster, such as managing vegetation in and around critical infrastructure or replacing wood power poles with steel ones that can withstand more extreme events.
This shift in focus was catalyzed by the increasing severity of wildfire events, especially historic fires that consumed more than 10 million acres in 2017 and again in 2020, prompting the recognition of wildfires as a national emergency.
“I think there was just this realization that fire is not the same as it once was,” says Coleman. “It burns more intensely; it moves faster.”
Advances in satellite technology have helped propel PNNL’s current approach, which involves integrating data from various satellite sensors, including passive and active sensors, to derive actionable insights. Passive sensors capture reflected energy, while active sensors penetrate cloud cover, storms and smoke, enabling continuous monitoring even in adverse conditions. By leveraging open-access satellite data and commercially collected imagery, as well as collaborating with agencies like NASA, the United States Geological Survey and the European Space Agency, PNNL can tap into a robust data ecosystem for its analysis.
In addition, the development of PNNL’s Rapid Analytics for Disaster Response (RADR) platform represents a significant milestone. The platform combines multi-modal imagery, AI and scalable cloud computing — thanks to partnerships with major cloud providers — with an infrastructure damage assessment tool. The tool helps users better understand the current impact and risk to infrastructure from wildfires, floods, hurricanes, earthquakes and more. Currently, once a provider collects and makes new imagery available – which typically takes about 4 to 6 hours – RADR can take that imagery, process it and disseminate results in 7 to 10 minutes.
According to Coleman, however, newer satellite communications and increased automation are helping drive this latency down to under an hour. This significant increase in the volume and frequency of available data is something the RADR cloud platform was built to anticipate and is well equipped to handle. For example, newer planned satellite sensors such as FireSat will provide a 20-minute revisit, requiring a highly scalable system like RADR to generate analytics for that data.
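To put those figures in perspective, the back-of-the-envelope sketch below strings together the numbers quoted in this story: roughly 4 to 6 hours for imagery to become available today, about 10 minutes of RADR processing, and a sub-hour availability window with newer satellite communications. The simple two-stage latency model is an assumption made for illustration.

```python
# Illustrative arithmetic only, using the latency figures quoted above.
# The two-stage model (availability + processing) is an assumption.

def product_latency_min(availability_min: float, processing_min: float) -> float:
    """Minutes from image collection to disseminated analytics."""
    return availability_min + processing_min

# Today: imagery typically becomes available 4-6 hours after collection,
# then RADR processes and disseminates it in 7-10 minutes.
today = product_latency_min(availability_min=5 * 60, processing_min=10)

# With newer satellite communications driving availability under an hour:
planned = product_latency_min(availability_min=55, processing_min=10)

print(f"today:   ~{today:.0f} min (~{today / 60:.1f} h) collection-to-product")
print(f"planned: ~{planned:.0f} min collection-to-product")
# A FireSat-style 20-minute revisit would then refresh products roughly
# every half hour rather than a few times per day.
```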
In addition, PNNL’s work has caught the attention of a widening group of stakeholders, including the Department of Energy’s Office of Cybersecurity, Energy Security, and Emergency Response and the Joint Artificial Intelligence Center, which have played a crucial role in supporting PNNL’s endeavors. Collaboration with public and private utilities has further enhanced the research landscape, ensuring practical applications and alignment with industry needs.
Where it’s going
The complexity of wildfire detection and prediction necessitates continuous refinement of AI and ML algorithms. PNNL’s interdisciplinary team — comprising scientists, developers and cloud architects — remains dedicated to enhancing system capabilities. With the upcoming availability of specialized earth observation satellites, like NASA’s SWOT and NISAR, PNNL aims to expand its predictive capabilities further and extend its reach to a broader audience. “The idea is to keep adding and improving our algorithms, incorporate more sensors as they come online, continue to mature the capability and really work to get this out to as many end users as we can,” says Coleman.
PNNL’s efforts are also expanding beyond natural threats to take a more holistic approach to safeguarding critical infrastructure, accounting for the rise of physical as well as cyber attacks. By integrating environmental, cyber and physical security considerations, PNNL strives to build a more resilient energy grid capable of withstanding diverse challenges.
“A big role for us at the lab is to look at all these kinds of dimensions and say, ‘How do we protect the grid? How do we make it more resilient as our reliance on energy continues to increase?’” says Coleman. “Through the various teams that are working on that, we have the opportunity to interface and consider how to move forward and build a better, more resilient system.”
As PNNL continues to innovate and collaborate, the future holds promise for a more resilient energy infrastructure. By leveraging the power of AI and ML, PNNL stands poised to address the evolving threats posed by natural disasters and enhance grid resiliency for generations to come.
The report was produced by Scoop News Group for FedScoop, as part of a series on innovation in government, underwritten by Microsoft Federal. To learn more about AI for government from Microsoft, sign up here to receive news and updates on how advanced AI can empower your organization.
NIH email scandal: A ‘shocking disregard’ for public record-keeping or within federal rules?
Allegations that top officials at the National Institute of Allergy and Infectious Diseases sidestepped public records laws have revived attention on the complicated but consequential rules governing federal emails.
During hearings of the House Oversight Select Subcommittee on the Coronavirus Pandemic featuring testimony from Dr. Anthony Fauci, former director of the National Institutes of Health subagency, and senior adviser David Morens, lawmakers cited emails in which some agency officials may have tried to evade the Freedom of Information Act.
In a message sent from his personal Gmail account, Morens referenced deleting emails and communicating with a colleague about how to make emails “disappear.” In one message uncovered by the subcommittee, he wrote that his NIH email “is FOIA’d constantly…Don’t worry, just send to any of my addresses and I will delete anything I don’t want to see in the New York Times.”
The allegations against Morens raise questions about the extent to which he was complying with rules about government emails. The incident exemplifies ongoing concerns about the gap between records preservation policy and enforcement — and also comes after controversies over the preservation of other federal officials’ electronic communications.
A spokesperson for the Department of Health and Human Services, under which NIH and NIAID operate, said the agency doesn’t comment on personnel matters but noted that “HHS is committed to the letter and spirit of the Freedom of Information Act and adherence to Federal records management requirements. It is HHS policy that all personnel conducting business for, and on behalf of, HHS refrain from using personal email accounts to conduct HHS business.”
According to a letter from select subcommittee Chairman Brad Wenstrup, R-Ohio, NIH told the National Archives and Records Administration in August that it did not find evidence that federal records within its custody were destroyed prematurely. That August letter, apparently sent by Anthony Gibson, NIH’s records officer, to the chief records officer of NARA, does not appear to be public.
A NARA inquiry initiated last year into the incident remains in the “Pending review/follow-up” stage, according to data tracked by the National Archives.
Jason R. Baron, a University of Maryland professor who previously served as the director of litigation for NARA, called Morens’ actions a “shocking disregard for the public’s right to access records under FOIA, as well as government recordkeeping in general.” Baron, a member of the FOIA advisory committee, said the case typified an instance where NARA would be responsible for following up with NIH to recover any records, and either of those agencies could then refer its findings to the DOJ for a lawsuit to attempt to recover the records.
According to NARA data, accusations of mishandling government records aren’t uncommon — and the agency can review claims relatively quickly. For example, a case involving the Department of Agriculture that reported records destroyed “because off [sic] black mold” was closed in just under a month after it was reported to NARA. Others can take over a year to resolve, such as a case involving the National Oceanic and Atmospheric Administration that reviewed allegations over the unauthorized deletion of records regarding a council meeting on fishing industry regulations. (NARA determined that this incident was unfounded and that the agency was in compliance with relevant legislation.)
“Federal records are either temporary records or permanent records, as determined by an appraisal of the records and documented in a records schedule,” NARA communications staff said in an emailed statement to FedScoop. “Temporary records are eventually deleted, while permanent records ultimately come to the National Archives for preservation.”
NARA continued: “Under the Capstone approach, agencies use General Records Schedule 6.1 to manage their email. This records schedule allows agencies to manage email based on the role of the email sender in the agency. The email of senior officials, known as Capstone officials, is permanently preserved. Email of other employees is a temporary record which will eventually be deleted.”
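In code terms, the Capstone approach NARA describes reduces to a role lookup: retention is driven by who sent the email, not what it says. The toy sketch below illustrates the concept only; the addresses and retention outcomes are invented, not drawn from General Records Schedule 6.1.

```python
# Toy illustration of the Capstone idea described above: retention is
# determined by the sender's role, not the message content. The roles
# and dispositions here are invented for illustration only.

CAPSTONE_OFFICIALS = {
    "agency.head@example.gov",
    "deputy.secretary@example.gov",
}

def retention_for(sender: str) -> str:
    """Classify an email's disposition by the sender's role."""
    if sender in CAPSTONE_OFFICIALS:
        return "PERMANENT: transfer to the National Archives"
    return "TEMPORARY: delete after the scheduled retention period"

print(retention_for("agency.head@example.gov"))   # permanent record
print(retention_for("staff.member@example.gov"))  # temporary record
```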
NARA spells out a series of requirements that could have been relevant in the situation involving Morens. The agency’s universal electronic records management requirements state that employees of executive agencies “may not create or send a record using a non-official electronic messaging account” unless an official account is copied or if the record is forwarded to an official messaging system.
“Federal employees should use agency accounts for electronic messages — including texts, chats, and emails — when conducting agency business. Personal and non-official accounts should only be used to conduct agency business in exceptional circumstances,” states NARA.
Not following these procedures can be the basis of disciplinary action, according to the U.S. Code, which governs the unlawful destruction, alteration or removal of federal records and designates reporting procedures for NARA when credible information about a potential incident is received. The National Archives, which also manages guidance on transfer of documents, is supposed to contact the relevant agency and can help assist in recovering destroyed federal records, including by contacting the attorney general. The Department of Justice did not respond to a request for comment.
NARA’s website states that electronic communications, like emails, “created or received in the course of agency business are likely” federal records.
Notably, even if an official “deletes” an email in their individual account, it has likely been saved through the automated archival process (and is deleted only after several years, depending on a predetermined schedule).
The Morens incident highlights other ways that federal officials can theoretically try to thwart FOIA requests about their electronic communications. It’s not uncommon for FOIA officers to encourage requesters to share specific keywords to use in records searches. But emails from Greg Folkers, Fauci’s former chief of staff at NIAID, included misspellings of key terms that individuals might have included in their requests for records, such as “EcoHealth” written as “Ec~Health” and virologist Kristian Andersen’s name spelled as “anders$n.” A Republican-led group of legislators has charged that the misspellings were intentional.
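Why would a character swap matter? Records searches are typically exact keyword matches, so “Ec~Health” will not match a search for “EcoHealth.” The hypothetical sketch below contrasts exact matching with a crude approximate match built on Python’s standard difflib; real e-discovery tools are more sophisticated, and the messages and threshold here are invented.

```python
# Sketch of why the misspellings cited above defeat exact keyword search,
# and how approximate matching can still flag them. Uses only the
# standard library; the messages and threshold are invented.
from difflib import SequenceMatcher

emails = [
    "Forwarding the Ec~Health grant summary for review.",
    "Can you loop in anders$n on the sequencing question?",
    "Unrelated scheduling note about the staff retreat.",
]
search_terms = ["ecohealth", "andersen"]

def fuzzy_hit(text: str, term: str, threshold: float = 0.75) -> bool:
    """True if any word in the text is 'close enough' to the term."""
    return any(
        SequenceMatcher(None, word.lower(), term).ratio() >= threshold
        for word in text.split()
    )

for msg in emails:
    exact = any(term in msg.lower() for term in search_terms)
    fuzzy = any(fuzzy_hit(msg, term) for term in search_terms)
    print(f"exact={exact!s:5} fuzzy={fuzzy!s:5} | {msg}")
# The first two messages match only under fuzzy matching; the third
# matches neither way.
```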
During a subcommittee hearing, Fauci stated that he did not engage in attempts to obstruct the FOIA process or the release of public documents, nor did he communicate official business with Morens using private email addresses or encourage Morens to use his private email address for official business. Wenstrup, the Ohio Republican and subcommittee chair, has requested access to Fauci’s private emails.
In addition to Baron, two other members of the FOIA Advisory Committee told FedScoop the incident raises critical issues about the gaps in promoting the values of FOIA — at least in regard to email communication.
“It sounds like a pattern and practice of avoiding [FOIA] combined with records destruction — and that’s scandalous if that’s true,” Alex Howard, who directs the Digital Democracy Project, said in an interview with FedScoop. “If that exists today in an agency as the conditions under which they’re dealing with that law, that should be held up as an example of what not to do.”
Similarly, Gbemende E. Johnson, a University of Georgia political science professor who also works on the FOIA Advisory Committee, noted that “incidents of improper record preservation would suggest at the very least that additional training and reminders of the legal obligations of [the Federal Records Act] and FOIA are necessary.”
A bipartisan Senate bill to modernize federal records law covers some notable ground, Johnson noted. The legislation from Sens. Gary Peters, D-Mich., and John Cornyn, R-Texas, would update federal records law to account for electronic communications, including messaging apps, improve records management compliance, and create an advisory committee to consider how emerging technology could improve record management.
Baron, the former NARA litigation director, said “the law as it presently stands relies on the good faith of federal officials to manually forward or copy to a governmental account any communications about government business that they make on Gmail or [another] messaging or texting ‘app.’ There should be easier, automated means to do so.”
Madison Alder contributed to this article.