VA virtual agent will remain in beta for now

A chatbot in use at the Department of Veterans Affairs will eventually be updated to address sexual assault-related topics, but remains in beta mode for now.

The Department of Veterans Affairs is planning to update its virtual agent chatbot with a new training model designed to discuss topics related to sexual assault. The expansion of the virtual agent chatbot is part of the VA’s 2023 fiscal year roadmap, though the chatbot will continue to remain in beta mode. The agency emphasizes that it “does not use chatbots as a replacement for direct crisis intervention.”

The update has not been previously announced, and details on the chatbot training method have not yet been established, the agency told FedScoop.

Still, the eventual upgrade comes as the VA tries to expand ways that veterans can discuss this critical and sensitive topic. In June, the agency announced that its 1-800-MyVA411 hotline can now be used to report sexual harassment or sexual assault at VA facilities. 

VA Press Secretary Terrence Hayes told FedScoop in a statement: “While the chatbot platform has been active for over a year, each new feature that is part of our minimally viable product is released iteratively. So rather than classifying a feature mid-conversation with the user, VA decided to keep the beta classification at the platform level for now. Other Government agencies such as the Federal Student Aid do the same.”

“Whenever possible VA connects Veterans in crisis to experts directly and does not use chatbots as a replacement for direct crisis intervention,” he added. “The chatbot carries a disclaimer clearly stating that it is not a personal, medical, or mental health emergency bot. However, a level of sexual assault-related bot training is part of the fiscal year 2023 roadmap as an extension of the existing Veteran Crisis Line bot response.”

It’s not yet clear what the update will involve. John Davisson, an attorney at the Electronic Privacy Information Center, noted that chatbots like these should make clear that they’re not meant for submitting personally identifying information — and emphasized the importance of cybersecurity safeguards.

“It should expunge at the end of the chat all information associated with a user’s interaction with the chat if they are seeking support relating to a sexual assault. Because it is — of course — possible that someone will, despite instructions otherwise, end up submitting personal information or information that could be linked to them,” added Davisson.

Written by Rebecca Heilweil

Rebecca Heilweil is an investigative reporter for FedScoop. She writes about the intersection of government, tech policy, and emerging technologies. Previously, she was a reporter at Vox’s tech site, Recode. She’s also written for Slate, Wired, the Wall Street Journal, and other publications. Message her if you’d like to chat on Signal.