Senior officials in the Pentagon’s new Chief Digital and Artificial Intelligence Office (CDAO) are compiling a federated data catalog, rethinking tech infrastructure and deepening external collaborations for data-sharing based on lessons they’re gathering from recent, unfolding crises.
Multiple Defense Department components recently merged to form the CDAO and ultimately help accelerate the adoption and widespread integration of data- and AI-driven capabilities across the enterprise. As they hash out their path forward as one unit, officials involved are also jointly applying what their teams learned deploying technologies to address national security challenges over the last few years — namely in the COVID-19 pandemic, the U.S. military withdrawal from Afghanistan and the Russia-Ukraine conflict — to help the government better prepare for future catastrophes.
“At the CDAO, I think we have a real opportunity to be able to support our warfighters in crises and we’re really uniquely positioned to do that. We have the ability to connect data dots across the department, whether that’s looking at logistics, financials, personnel operations or intelligence,” Deputy CDAO for Enterprise Capabilities Greg Little said last week.
Little joined Deputy CDAO for Warfighter Support Joe Larson, Deputy CDAO for Digital Services Katie Olson and TRANSCOM Operational Delivery Team Lead Jeff Clark for a virtual panel during DOD’s annual digital and AI symposium.
“I think one of the important lessons that I learned in undertaking crises and deploying teams in times of crisis is not to sacrifice user design, and sometimes you have to go slow in order to go fast,” Olson noted during the discussion.
Olson served as the Defense Digital Service’s acting director before the office was restructured under the CDAO. During that time and early into the U.S. military’s chaotic withdrawal from Afghanistan, the State Department asked DDS to very quickly identify two decades’ worth of Afghan allies who had worked for America, digitize that information and make it available to State to process visas of those requesting asylum status.
Government tech officials building tools to support rapidly shifting and tense situations should be deliberate about “putting the users first” and completely thinking through the core of what must be solved, Olson said.
“So in that instance, what we did is we said, ‘OK, who is the single source of truth for whether or not someone worked for us in Afghanistan?’ Well, it’s the employers. It’s the people that the U.S. government contracted with to do work for us in Afghanistan. So, what if we found a way for the employers to be the people providing verification?” Olson explained. “We reached out to companies that we’ve contracted with in Afghanistan, and we built a portal so that as people applied to the State Department, [it] would automatically ping employers who had employed our Afghan allies and partners over the past 20 years.”
Among other topics, TRANSCOM’s Clark also highlighted how data access, quality and processes are typically compromised during fast-moving crises.
During the U.S. withdrawal from Afghanistan, his team supported the evacuation of 124,000 people from the country in 70 days. To do so, Clark noted, officials involved “made shortcuts into the underlying systems to track that information, so the data was in fact slower than the operation was going on.” He’d get calls notifying him that “fixes” were made to tracking processes at certain times just to move people safely, which “led to shortcomings of the data on the other side.”
Clark mentioned other examples where officials “did the right thing to expedite the process,” but ultimately introduced data quality-related issues that caused the misinterpretation of real events happening on the ground — including one that recently left a four-star general disoriented over complex flight patterns.
“That’s all informed the Ukraine process,” he noted.
The Pentagon officials agreed that looking forward, the DOD must be strategic about implementing a more robust data-centered infrastructure that it can turn to for immediate interoperability and insights right when any sort of crisis occurs, as opposed to in its aftermath.
“We’re working on a federated data catalog to be able to just understand where our data is, what the meaning of that data is and what is the type of data that we have that will be able to answer the type of question,” Little noted.
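The catalog Little describes — knowing where data lives, what it means, and what kinds of questions it can answer — can be sketched in miniature. The classes, field names and sample entries below are illustrative assumptions for exposition, not the CDAO’s actual design or any real DOD system:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a federated data catalog: each entry records where a
# dataset lives, what it means, and which domains it can inform, so a team can
# discover relevant data without centralizing it. All names are illustrative.

@dataclass
class CatalogEntry:
    name: str                # dataset identifier
    source: str              # owning component or system
    description: str         # what the data means
    tags: set = field(default_factory=set)  # question domains it can answer

class FederatedCatalog:
    def __init__(self):
        self._entries = []

    def register(self, entry: CatalogEntry) -> None:
        """Record a dataset's location and meaning; the data stays at its source."""
        self._entries.append(entry)

    def find(self, tag: str) -> list:
        """Return all entries whose tags cover the given question domain."""
        return [e for e in self._entries if tag in e.tags]

catalog = FederatedCatalog()
catalog.register(CatalogEntry(
    "flight_manifests", "TRANSCOM",
    "Passenger manifests for evacuation flights", {"logistics", "personnel"}))
catalog.register(CatalogEntry(
    "vaccine_lots", "health_agency",
    "COVID-19 vaccine lot tracking", {"medical", "logistics"}))

# A logistics question surfaces both datasets; a medical one, only the second.
hits = catalog.find("logistics")
```

The point of the sketch is the indirection: the catalog holds metadata only, so answering "what data do we have for this question" does not require moving or duplicating the underlying records.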
Though he didn’t provide many details, Little said the CDAO is also partnering with companies to pinpoint better application programming interface capabilities to enable deeper data-sharing across new and legacy architectures — and, separately, producing an ontology associated with data quality.
“We need to have the right infrastructure and tooling in the first place, so that in times of crisis we’re not getting sloppy with the data that we’re collecting and managing and using, but we’re feeding clean data into usable formats that are easily consumed and so that we can get the job done that we need to — whether that’s rectifying flight manifests or ensuring the quality of the vaccine data, for example,” Olson said.