VA’s AI ‘to-go’ delivery model is morphing into a platform

The department isn't the only one taking an iterative approach to AI.

Interest in an artificial intelligence “to-go” delivery model is building with more than a dozen Department of Veterans Affairs sites looking to pilot modules, the agency’s head of AI said Thursday.

VA developed the initial module to assist its medical centers with COVID-19 individual risk prediction, but its hundreds of centers and thousands of facilities have other uses for the statistical models being tested, said Gil Alterovitz, VA’s director of AI.

Additional use cases haven’t yet been chosen, but future AI models will be packaged, like the original, as embeddable software add-ons for rapid deployment.

“We’re now using that to generalize and essentially created a new platform so that artificial intelligence research and development can be added as modules in the future,” Alterovitz said during day three of FedTalks, presented by FedScoop.

Once the AI technology and health application have been vetted, any medical center will be able to access a module when VA shares a secure, internal weblink. The module will appear within the reporting system dashboard.

The original module was developed with available medical records and helps determine if veterans testing positive for COVID-19 should be admitted to a medical center or intensive care, and even calculates their chance of death.

Iterative AI

VA isn’t the only department taking an iterative approach to AI adoption.

The Department of Agriculture’s Agricultural Marketing Service has begun using machine learning coupled with computer vision to reduce manual work when grading cotton. USDA charges for the service, but costs have gone down now that labor time has been reduced.

Computer vision is also used on geospatial imagery to generate crop yield data, allowing USDA to do historical analysis and correlate imagery with local yields. And natural language processing is being used to automate expense reviews associated with USDA facility repairs.

“We’re also really focused on getting to standard tools, so we can be on platforms where people can collaborate and fully leverage our data as an enterprise,” said Ted Kaouk, chief data officer at USDA. “So we’re testing out enterprise tools where requirements are coming in from our component agencies.”

Kaouk also chairs the federal CDO Council, which is currently discussing broader opportunities for data sharing that could benefit cognitive technologies.

A recently established Data Sharing Working Group is compiling a comprehensive list of data sharing use cases across agencies for the council. The list will help answer questions like why agencies are sharing data; with whom; and what legal, policy and technical challenges they still face, Kaouk said.

The forthcoming Federal Data Strategy 2021 Action Plan will make subtle changes to the Year 1 plan in an effort to address some of those constraints.

“Overall it’s a very aggressive plan of actions and milestones for the first year, and I do expect in 2021 we’ll see some very similar actions to follow on and build upon those actions and foundational processes,” Kaouk said. “Things like establishing data governance boards, inventorying our data and assessing our workforce and infrastructure.”

On the workforce front, the Federal Data Science Training Program is part of a larger focus by agencies on assessing skill gaps in an effort to train an AI-ready workforce, he said.

Participants are immersed not only in design thinking, data visualization and statistical analysis but also in advanced machine learning and AI techniques.

“The Federal Data Council also has a Data Skills Working Group,” Kaouk said. “So we’re staying plugged in.”
