HHS pairs new AI tools with federal data to treat cancer, Lyme disease
Officials at the Department of Health and Human Services think they now have a way to combine the power of artificial intelligence with public data to make it easier for patients to find cutting-edge treatments for complex diseases.
Officials from the HHS Office of the Chief Technology Officer said Thursday they have completed a 14-week project — dubbed The Opportunity Project (TOP) Health tech sprint — that blended open federal data with AI technology to streamline research, better identify new treatments and disseminate new information on conditions like cancer and Lyme disease.
“At HHS, we recognize that Federal government alone cannot solve our most important and complex challenges,” Ed Simcox, HHS CTO and acting CIO, said in a blog post published Thursday. “The TOP Health sprint is a valuable step in leveraging skills from industry with public resources to promote better health outcomes.”
But officials also said they won’t be able to share the results until the government reopens in its entirety. An HHS spokesperson told FedScoop that the demo day would be hosted by the Census Bureau, which originally crafted the TOP methodologies used to develop the pilot. But the Census Bureau, as part of the Department of Commerce, hasn’t been funded and demonstrations of the pilot have been delayed as a result.
HHS also conducted the project with the Presidential Innovation Fellows program, which is operated by the General Services Administration, an agency affected by the partial shutdown.
The TOP Health project was developed around two central challenges: designing digital tools to help patients find experimental therapies for treatment, as well as helping developers locate patients who would be eligible for those treatments; and using emerging technologies like AI to create tools that facilitate data sharing for the education and prevention of Lyme disease and other tick-borne maladies.
Over the course of the 14-week sprint, companies like Microsoft and Oracle, alongside the Department of Energy’s Oak Ridge National Laboratory and stakeholders across the health care space, divided into 11 teams, crafting new tech tools fueled by curated data sets from Healthdata.gov, data.gov and other federal sources provided by HHS, the Department of Veterans Affairs and the PIF program.
Those tools leveraged AI to better match clinical treatment trials with cancer patients and make it easier to share trial data and symptom information for those with Lyme disease.
Other teams designed apps to track information on the effectiveness of treatments for complex conditions and the location of potential disease-carrying ticks, and provide educational information on Lyme disease.
The pilot’s completion follows the recent signing of the Open, Permanent, Electronic, and Necessary (OPEN) Government Data Act into law. The law requires agencies to publish their data in machine-readable formats that are consistent across government, potentially making it easier for technologies like AI to use the data to find new solutions for a variety of real-world challenges.
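As a rough illustration of what machine-readable publication makes possible, the sketch below queries the CKAN search API behind catalog.data.gov, one of the catalogs the sprint drew from, for Lyme disease datasets. The query term and the fields printed are illustrative choices for this article, not part of the TOP Health pilot itself.

```python
import requests

# catalog.data.gov runs CKAN, whose action API returns dataset metadata
# (titles, resource formats, download URLs) as plain JSON.
CKAN_SEARCH = "https://catalog.data.gov/api/3/action/package_search"

def find_datasets(query, rows=5):
    """Return titles and available file formats for datasets matching the query."""
    resp = requests.get(CKAN_SEARCH, params={"q": query, "rows": rows}, timeout=30)
    resp.raise_for_status()
    datasets = resp.json()["result"]["results"]
    return [
        {
            "title": ds["title"],
            "formats": sorted({r.get("format", "") for r in ds.get("resources", [])}),
        }
        for ds in datasets
    ]

if __name__ == "__main__":
    for ds in find_datasets("lyme disease"):
        print(ds["title"], ds["formats"])
```

Because every dataset in the catalog exposes the same metadata structure, the same few lines work whether the subject is tick surveillance or clinical trials, which is the consistency the OPEN Government Data Act is meant to guarantee.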
VA needs a contractor to help test its new EHR
As the Department of Veterans Affairs begins its journey to adopt a new electronic health record (EHR), the department is going to need some help testing the modernized system before its rollout at select facilities across the country.
In fiscal 2019, VA will dedicate more than half of its software testing resources to transitioning from the legacy Veterans Health Information Systems and Technology Architecture (VistA) to its $16 billion EHR Modernization (EHRM) program, the department says in a new request for information.
It’s going to be a huge lift for VA’s Enterprise Testing Service (ETS) team, so it’s going to need a little help ensuring the platform is deployed on time and within budget — something lawmakers are keeping a close eye on.
The new EHR, developed by Cerner, will be “similar” to that used by the Department of Defense, which will allow patient data to be “seamlessly” shared between the two.
Concerning the EHR modernization, ETS is accountable for verification and validation of all applications. “ETS measures and examines delivery and deployment processes, quality, and capabilities within [the Office of Information and Technology] to enable a culture of continuous improvement,” the RFI says. “This culture leads to enhanced program and project quality through early engagement, agile services and environments, and elimination of process redundancy.”
The VA will need “a Contractor to provide support in the areas of testing of software (to include performance and system-of-systems integration testing); infrastructure, environments, and operations of the ETS Test Center (ETSTC) and ETS cloud; and continuous quality process management support,” it says in the RFI.
Not all testing will be devoted purely to the new EHR, however; the potential contract also includes “testing VA applications in support of VA modernization initiatives, including but not limited to EHRM, Financial Management Business Transformation (FMBT), Defense Medical Logistics Standard Support (DMLSS) system, and other Government-off-the-shelf (GOTS) and COTS applications.”
But as the department gets deeper into the heat of the EHR rollout, it plans to keep ramping up its focus on testing: “As the transition matures, roughly 80 percent of ETS resources and effort will focus on EHRM and other modernization efforts.”
VA envisions a year-long base contract with four optional extension years, based out of its Bay Pines Test Center in Florida and a field office in Birmingham, Alabama.
The department is still early on in the timeline for building out the new EHR. VA Secretary Robert Wilkie signed the 10-year contract last May. During a hearing in November, officials highlighted early progress of the implementation, such as the development of governance structures and the completion of assessments at initial rollout sites in the Pacific Northwest.
However, a ProPublica report, also from November, painted a different picture, claiming the health of the $16 billion project was “yellow trending towards red.”
Good news came for VA earlier this month when, after almost two years, a permanent CIO, James Gfrerer, was confirmed by the Senate to lead its Office of Information and Technology. The department hadn’t had an official CIO since LaVerne Council resigned at the change in administration in January 2017. It is fair to say that Gfrerer has his work cut out for him.
Shutdown impacts may be rippling beyond shuttered agencies
The partial government shutdown has now gone longer than any in federal history, and its effects on agencies and contractors are reaching critical levels, leaders from a federal contractor trade group said Thursday.
“How long can people work if they are not getting paid?” David Berteau, president of the Professional Services Council, said in a conference call with the media. “That’s a question we’ve never tested to this level before. It is pretty scary.”
With the shutdown in its 27th day and counting, PSC leaders said the economic impacts on the contracting community have reached unprecedented levels, with no indication of when the budget impasse will end. Even worse, they said, it has likely started to ripple beyond furloughed agencies to programs that have funding.
Berteau said ongoing efforts like IT modernization projects approved by the Technology Modernization Fund are caught in a confusing new reality — not because of a lack of appropriations, but because the services needed to carry them out are closed.
“We have companies that are working on IT modernization contracts where the customer is closed even though the funding — which comes, for example, from the modernization fund — is still available,” he said. “So the work can go on, but you can’t talk to the customer. Now, how dumb is that?”
According to the TMF board’s website, as of Dec. 12, two of the shuttered agencies have spent funds on their modernization projects: the Department of Housing and Urban Development on its UNISYS Mainframe Migration and the Department of Agriculture on its Farmers.gov Portal.
But because the shutdown has partially impacted the General Services Administration — which administers the TMF and is partially funded by appropriations — it’s unclear how much more has been spent since last month and what progress has been made.
Projects like the Department of Defense’s Joint Enterprise Defense Infrastructure (JEDI) cloud computing contract may also face unintended roadblocks as a result of the shutdown, even though DOD is fully funded for fiscal 2019. Berteau said though the department has stated JEDI will not be impacted by the shutdown, defense officials had to cancel an advisory board meeting about the contract because such a gathering is required to be posted in the Federal Register, which isn’t operating because of the shutdown.
The lasting impact on both agencies and contractors isn’t yet known. More than 800,000 federal employees have been furloughed, with agencies calling for some to return to work without pay.
PSC officials said while there is no official tally of federal contractors furloughed, they estimate it’s in the “tens of thousands,” depriving the government of a workforce on two fronts at a time when talent is heavily in demand.
The fear, Berteau said, is now that the shutdown has enveloped a second pay period, federal workers and contractors will leave for other jobs. Such a move ensures that whenever the shutdown does end, the hangover will be even more pronounced and tougher to recover from.
“How do you retain people in order to have them available when you come back?” he said. “This is not a question we’ve run into before.”
Shareholders also want Amazon to stop selling facial recognition tech to government
A group of Amazon shareholders is now formally asking the company to stop selling its Rekognition facial recognition software to government agencies.
In the resolution, some shareholders ask the company’s board of directors to step in and stop such sales “unless the Board concludes, after an evaluation using independent evidence, that the technology does not cause or contribute to actual or potential violations of civil and human rights.” The petitioning shareholders expect the resolution will be voted on during the company’s annual meeting this spring.
The primary shareholder on the petition is the Sisters of St. Joseph of Brentwood, a member of the Tri-State Coalition for Responsible Investment.
“The Sisters of St. Joseph of Brentwood have committed as a congregation to support immigrant communities and promote racial equity, and this applies to our investments,” Sister Patricia Mahoney said in a statement. “Therefore, we filed this proposal because we are concerned that Amazon has pitched facial recognition technology to Immigration and Customs Enforcement (ICE) and piloted its Rekognition with police departments, without fully assessing potential human rights impacts.”
The group isn’t alone in its interest in halting, or at least slowing, government use of these technologies.
Just earlier this week, the American Civil Liberties Union, together with a coalition of other civil rights groups, called on Amazon and other software companies to stop selling facial recognition tech to the government. The ACLU seems pleased to see the issue raised by yet another set of stakeholders.
“The fact that Amazon’s shareholders felt compelled to take this up to the company’s board of directors should be a wake-up call to Amazon’s leadership to take concerns around face surveillance seriously,” Shankar Narayan, the Technology and Liberty Project director of the American Civil Liberties Union of Washington, said in a statement.
Despite regular pressure from civil rights groups, and increasing questioning from Congress, Amazon has maintained that facial recognition technology is a net positive for the government. Asked if the business has “drawn any red lines” around its government work after employees protested the sale of Rekognition to law enforcement agencies, Teresa Carlson, vice president of worldwide public sector for Amazon Web Services, said no.
“We provide them the tools, we don’t provide the solution application that they build,” she said of Amazon’s government sales during the Aspen Security Forum in July 2018. “And we often don’t know everything they’re actually utilizing the tool for. But they need to have the most innovative and cutting-edge tools they can.”
It’s unclear which additional shareholders are joining the Sisters of St. Joseph of Brentwood in their resolution, but the organization says that the group represents a total of $1.32 billion in assets under management.
Mirror, mirror on the wall, which is the best credibility assessment tool of them all?
What’s the best method for determining whether or not someone is trustworthy? Given the choice, should we trust a thorough background check, a polygraph, or a simple gut-check?
These questions take on special significance in an era when the trustworthiness of the media, public figures and more is increasingly fraught. And these are the questions that the Intelligence Advanced Research Projects Activity (IARPA) is trying to answer through a new public challenge.
The Credibility Assessment Standard Evaluation (CASE) challenge is looking for “novel methods to measure the performance of credibility assessment techniques and technologies” — methods that have thus far proved elusive.
Ideally, IARPA wants a method that can be used both for trustworthiness assessment techniques (like a gut-check) and technologies (like a polygraph). The agency also wants a method that will work for future technologies, such as novel biometric screening technologies or artificial intelligence.
The challenge is an interesting follow-up to IARPA’s very first public challenge from 2014, which aimed to create algorithms for determining trustworthiness. The winning team rose to the challenge by delivering a statistical technique using a person’s heart rate and reaction time to predict whether that person can be trusted to follow through on a promise. In the challenge at hand, participants aren’t being asked to create technologies that can determine trustworthiness but rather to find a way to determine how well these technologies (existing or not) actually work.
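For illustration only (this is not the winning team’s actual model), a statistical technique of the general shape described above could be sketched as a logistic regression over two features, heart rate and reaction time, trained on synthetic data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic, made-up data for demonstration: resting heart rate (bpm) and
# reaction time (ms) for 500 people, labeled 1 if they kept a promise.
rng = np.random.default_rng(0)
heart_rate = rng.normal(70, 10, size=500)
reaction_ms = rng.normal(300, 50, size=500)
X = np.column_stack([heart_rate, reaction_ms])
y = ((heart_rate < 72) & (reaction_ms < 310)).astype(int)  # invented labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```

The CASE challenge asks the harder question: not how to build such a predictor, but how to measure whether any given predictor, human or machine, actually works.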
The competition, which launched Jan. 2, has a total of $125,000 in prize money attached. The challenge will be conducted in two stages — stage one wraps up at the end of March and five winners from this portion will be announced in May 2019. These top five finalists will participate in a grand prize pitch competition in D.C. in July.
The challenge is being hosted on popular citizen science platform HeroX.
Gov Actually Episode 29: An Idea to Prevent Future Government Shutdowns
We’re nearly four weeks into what’s already the longest federal government shutdown, and neither side seems to be budging on the battle to build a wall across the nation’s southern border.
But Gov Actually co-host Danny Werfel has an idea — one that he thinks could prevent future political budget debates from holding federal appropriations hostage.
It involves “a pool of resources that are intended to address border security, and there’s flexibility in how those funds can be used” based on prevailing business cases.
“I know in some ways, I’m kicking the can down the road a bit, but it is a legislative construct that allows both sides to walk away happy,” Danny says in the latest episode.
Of course, co-host Dan Tangherlini pushed back some, saying there are many other things at play here, like “a set of political and personality challenges that revolve around trust and good faith and partisanship.” There are “many, many vectors from this idea,” he said.
Dan did compliment Danny for coming to the table with a well-thought-out, nonpartisan idea, saying a big part of the problem is that neither side wants to work with the other. “In any negotiation, to get to an outcome, both sides have to feel like they walk away with something.”
Listen to the podcast to see where Dan and Danny land in trying to figure out how to prevent future shutdowns from happening.
This isn’t the first time Gov Actually has talked about a shutdown. This time last year, the team discussed a much tamer lapse in appropriations and what it’s like to be a part of an agency that’s furloughed.
Catch all of the Gov Actually episodes on iTunes and SoundCloud.
Let us know what you think in the comments on iTunes.
Talent and data top DOD’s challenges for AI, chief data officer says
The Pentagon has made big plans to adopt artificial intelligence across the department, but two very large hurdles stand in the way of that goal, its new chief data officer said Wednesday: structuring data and recruiting the talent to manage it.
“You can’t feed the algorithms if you don’t have data. Solid, clean data in large volumes, well-tagged and well organized,” Michael Conlin said at an ACT-IAC forum on artificial intelligence and intelligent automation. “People will tell you that the machine learning algorithms, AI technologies can clean the data for you. Good luck.”
The Department of Defense has no shortage of data to pull from, but for it to be of any use to AI capabilities, the department has to make sure that data is recorded in consistent, machine-readable formats for accuracy and to ensure it doesn’t present the algorithms with unintended bias, Conlin said.
“The more data you have to train your algorithms, the more accurate the algorithms are and the faster you get your results,” he said. “So data is everything.”
As an example, he detailed the department’s efforts to improve the flight readiness of aircraft by tracking the lifecycle of parts that are replaced frequently versus those that can be sustained for longer — dubbed “lemons” and “peaches,” respectively.
Conlin said the department tracked the serial numbers for the parts from aircraft maintenance records and determined with 99.9 percent accuracy which parts were lemons and which were peaches after nine maintenance stops on each part.
The problem, however, stemmed from the data itself. Because department officials used the serial numbers to identify the lemons versus the peaches, Conlin said, only 25 percent of the data was usable.
“Some [records] had a blank or a ‘To be completed later,’ or ‘I don’t know,’ or something that wasn’t the serial number,” he said. “So you couldn’t connect the maintenance records together in order to be able to identify to nine consistent maintenance activities.”
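A minimal sketch of the record-linking problem Conlin describes appears below; the column names and placeholder values are hypothetical, but the logic mirrors the article: records without a usable serial number can’t be chained into the nine maintenance stops needed to classify a part.

```python
import pandas as pd

# Hypothetical maintenance records; in practice these would come from
# aircraft maintenance systems across the services.
records = pd.DataFrame({
    "serial": ["SN-001", "SN-001", "", "To be completed later", "SN-002"],
    "action": ["replace", "inspect", "replace", "inspect", "replace"],
})

# Blank or placeholder serial numbers make a record impossible to link.
placeholders = {"", "To be completed later", "I don't know"}
usable = records[~records["serial"].isin(placeholders)]
print(f"usable records: {len(usable)} of {len(records)}")

# Only parts whose records chain through nine maintenance stops can be
# classified as frequently replaced ("lemons") or long-lived ("peaches").
stops_per_part = usable.groupby("serial").size()
classifiable = stops_per_part[stops_per_part >= 9]
print("parts with enough linked history:", list(classifiable.index))
```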
With Silicon Valley focused more on delivering AI solutions tailored to specific use cases than on enterprisewide applications, and with edge computing growing in importance, the quality of the structured data becomes that much more important.
Equally important is the Pentagon’s need for data scientists to help oversee the AI systems, Conlin said. But the challenge is the current federal workforce structure isn’t designed for the job.
“We don’t train data scientists in the government. We don’t have the career path for data professionals, let alone data scientists,” he said. “I have to write my own position description for the data scientists I want to hire.”
Because data scientists have to possess the skills of a computer scientist, coder, mathematician and statistician, Conlin said the role traverses multiple job classifications in the government’s human capital space, making the position essential but difficult to recruit.
“Talent is a really big challenge for us,” he said. “And finding, borrowing, begging and creating talent is a really big challenge for all of us here in the room today.”
What can draw data scientists to an organization, however, are the unique challenges that working for agencies like the DOD can provide, Conlin said — but only if they can maintain a line of sight to those challenges.
“If you are thinking about making investments in this space, you are going to want to think about a centers of expertise or centers of excellence kind of model where you are pooling these folks in one place and assigning them, but maintaining a very high level of community activity across the group,” he said. “You have to think about what your value proposition for these human beings is if you want to keep them. Because they can make six- to seven-figure salaries in the commercial sector.”
General Dynamics scores Navy ONE-Net extension worth up to $160M
General Dynamics IT will continue its work supporting ONE-Net, the Navy’s legacy network for operations outside the contiguous U.S., for at least another year.
The Navy awarded GDIT, through its recently acquired subsidiary CSRA, a $78 million contract extension to continue its work with ONE-Net until the service transitions to the forthcoming $3.5 billion modernization, Next Generation Enterprise Network Recompete (NGEN-R). This extension comes with a base year and a possible four-and-a-half-month option. If all options are exercised through May 2020, according to GDIT, the contract’s value will bump up to $160 million.
GDIT will continue to provide core IT services to Navy operations outside the lower 48 states at 54 overseas sites and across 10 countries, “including Non-classified Internet Protocol Router Network (NIPRNet) and Secret Internet Protocol Router Network (SIPRNet) access, network connectivity and security, mobile access and desktop support,” says a release from the company.
“GDIT’s ONE-Net solution supports 47,000 warfighters globally with next-generation technology and services,” Senior Vice President Leigh Palmer, head of GDIT’s Defense Division, said in a statement. “This award shows GDIT’s continued dedication to the Navy’s OCONUS network and our ability to deliver impressive results for our customers.”
CSRA first won the ONE-Net contract in 2011. It has already been extended once, in January 2017, for up to $80.3 million.
While it’s a big win for GDIT, the bigger, looming story is the Navy’s slow trudge toward launching NGEN-R. The service issued a request for proposals for both parts of the $3.5 billion contract late last year. NGEN-R will subsume the services ONE-Net now provides outside the contiguous U.S., combining and streamlining them with the original Next Generation Enterprise Network (awarded to Hewlett Packard Enterprise Services in 2013 and including the Navy Marine Corps Intranet) and with the Marine Corps Enterprise Network’s (MCEN) operations within the continental U.S.
New GitHub authorization expands agency access to open source resources
Government IT offices now have access to a vast range of open source software resources and developers since GitHub gained FedRAMP operating authority for its Enterprise Cloud, according to a new special report.
The authorization from the Federal Risk and Authorization Management Program means government agencies can move beyond GitHub’s licensed platform for internal enterprise software development and take advantage of a wider universe of cloud-based open source development resources, knowing they meet federal security guidelines.

The expanded access to GitHub resources comes as a growing number of government agencies all over the world utilize GitHub’s open source collaboration platform. At last count, 143 U.S. federal civilian agencies, 14 Department of Defense agencies and 48 state agencies are using GitHub to collaborate on code, data, policy and procurement, according to GitHub figures.
Details about the expanded options for federal agencies – and how GitHub used a new FedRAMP authorization process to gain security approval – are contained in a new special report, “Federal Access to Open Source,” produced by FedScoop and underwritten by GitHub.
According to Ashley Mahan, acting director of the FedRAMP Program Office, the FedRAMP Tailored assessment process takes advantage of a subset of the NIST 800-53 technical control standards to fine-tune FedRAMP authorization specifically for software-as-a-service (SaaS) providers that handle low-risk, low-impact data and aren’t responsible for a host of network security controls.
GitHub is among the first cloud service providers to receive FedRAMP Tailored approval.
“We have historically had [government] customers on GitHub.com, but they were either doing it as shadow IT under a team plan or non-mission-critical system,” said Jamie Jones, GitHub principal architect. “Because GitHub.com did not have an authority to operate (ATO) it was not deemed appropriate for most organizations’ day-to-day mission-critical applications,” he explained.
By establishing FedRAMP Tailored, the GSA program office has created a more streamlined security approval process that is better suited for software-as-a-service providers.
Now, with FedRAMP approval, Jones says there is no reason for an agency currently using GitHub under a team or enterprise license not to move to the FedRAMP-authorized Enterprise Cloud version. And in most circumstances, there are a number of advantages.
“One key benefit of using the FedRAMP-authorized Enterprise Cloud is we can now support your agency’s identification and authorization tools,” Jones said. “For the extra capabilities we are providing, including faster support requests or the ability to use SAML and your identity providers, it’s far less of an administrative burden.”
According to the report, some of the features on GitHub Enterprise Cloud are quickly gaining parity with, and in some cases exceeding, the features of GitHub’s Enterprise Server, its on-premises offering. But the biggest advantage to agencies using GitHub Enterprise Cloud, according to Jones, is that it gives them access to the entire GitHub universe of open source development and collaboration resources — and the vast community of developers contributing to those resources.
Read the report to learn more about the GitHub story and about GitHub’s new open source platform for software development.
NIH wants your Fitbit data for precision medicine
The federal government wants you to share the personal health data collected by your Fitbit device, all for the benefit of science.
The Department of Health and Human Services’ National Institutes of Health (NIH) officially launched All of Us in May 2018, but the broader precision medicine initiative goes back to 2015. The goal is to get a million or more volunteers to share demographic and health information about themselves by answering surveys, sharing electronic health records and more. This data, NIH argues, will help researchers gain more insight into the impact of lifestyle and environment on disease treatment and prevention. And this, in turn, will lead to increasingly personalized health care practices.
Now, there’s a new way for participants to share data: by linking their personal Fitbit wearable devices.
“Collecting real-world, real-time data through digital technologies will become a fundamental part of the program,” Eric Dishman, director of the program, said in a statement. “This information, in combination with many other data types, will give us an unprecedented ability to better understand the impact of lifestyle and environment on health outcomes and, ultimately, develop better strategies for keeping people healthy in a very precise, individualized way.”
To participate, volunteers need to register and link their own Fitbit device. They can then decide what kind of data they want to share with the program.
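For a sense of what a linked device can actually share, here is a minimal sketch that pulls one day’s activity summary from the public Fitbit Web API; the access token is a placeholder, and this shows generic Fitbit API usage rather than the All of Us enrollment flow itself.

```python
import requests

# Placeholder OAuth 2.0 bearer token obtained through Fitbit's consent flow.
ACCESS_TOKEN = "YOUR_FITBIT_ACCESS_TOKEN"

def daily_activity_summary(date):
    """Fetch the activity summary (steps, calories, etc.) for a date (YYYY-MM-DD)."""
    url = f"https://api.fitbit.com/1/user/-/activities/date/{date}.json"
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    resp = requests.get(url, headers=headers, timeout=30)
    resp.raise_for_status()
    return resp.json()["summary"]

if __name__ == "__main__":
    summary = daily_activity_summary("2019-01-17")
    print("steps:", summary.get("steps"), "calories out:", summary.get("caloriesOut"))
```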
All of Us plans to partner with other similar health tech devices in the future, the press release hints. It will also launch an initiative through which volunteers will be provided with Fitbit devices later this year.
Prior to the formal launch of All of Us, the program conducted a pilot version. According to NIH, more than 25,000 volunteers participated in the pilot program. It’s unclear how many participants All of Us currently has, or how this may change with the introduction of the Fitbit component.