Technology Modernization Fund to get additional $25 million

The Technology Modernization Fund is set to get $25 million more in funds in the current version of the proposed 2020 spending bill.

This latest infusion would bring the fund’s total to $150 million — it launched with $100 million in fiscal 2018 and got an additional $25 million in fiscal 2019. The fund, which is housed within the General Services Administration and overseen by both GSA and the Office of Management and Budget, gives agencies money for long-term IT modernization projects.

Thus far it has awarded a total of $90 million to seven distinct projects — two each at the U.S. Department of Agriculture and the General Services Administration, and one apiece at the departments of Energy, Housing and Urban Development, and Labor.

Despite demand from agencies, getting additional funds into the TMF has been contentious.

In March, the White House budget proposed $150 million for the TMF. The House version of the 2020 Financial Services and General Government Appropriations Act, which passed in June, then proposed cutting that to $35 million, and the Senate version of the bill, from September 2019, didn’t include any TMF appropriations at all.

This current $1.3 trillion spending bill faces a Friday deadline for passage to avoid a government shutdown. However, both chambers are optimistic they will pass it and send it to the president’s desk this week.

The Government Accountability Office expressed doubts in a report last week about whether the TMF will be able to collect enough fees to cover its own administrative costs. The fund’s program management office has spent $1.2 million, the report said, but has collected only $33,165. At this rate, the watchdog said, the TMF won’t recover its costs until 2025.

OMB Deputy Director for Management Margaret Weichert disagreed with the GAO’s assessment, though, and said that the audit “paints an incomplete picture of the TMF.”

Most agencies don’t use FedRAMP to authorize all cloud services

Most agencies don’t use the Federal Risk and Authorization Management Program to authorize all of their cloud services, despite being required to do so, according to the Government Accountability Office.

The Office of Management and Budget established FedRAMP in 2011 as a required program to authorize and continuously monitor cloud service offerings across agencies. But 15 of 24 Chief Financial Officers Act agencies don’t always use FedRAMP and OMB doesn’t “effectively monitor” their compliance, GAO found in a report.

The General Services Administration, which manages FedRAMP, also falls short in its guidance, the report found.

“GSA took steps to improve the program, but its FedRAMP guidance on requirements and responsibilities was not always clear and the program’s process for monitoring the status of security controls over cloud services was limited,” reads the report. “Until GSA addresses these challenges, agency implementation of the program’s requirements will likely remain inconsistent.”

Between June 2017 and July 2019, FedRAMP authorizations of cloud services increased from 390 to 926 — a 137% jump.

But GAO closely examined efforts at the Department of Health and Human Services, GSA, Environmental Protection Agency, and U.S. Agency for International Development and found them missing “key elements” of the FedRAMP process.

Only USAID’s security plans addressed required information on control implementation, and only its security assessment reports summarized the results of control tests. None of the four agencies’ remedial action plans addressed required information. And only GSA prepared cloud service authorizations and provided them to the FedRAMP Program Office.

Among the 15 CFO Act agencies that don’t always use FedRAMP, one reported 90 unauthorized cloud services and the other 14 reported a combined 157. GAO also surveyed 47 cloud service providers and found that 31 had encountered agencies not using FedRAMP in fiscal 2017.

GAO recommended OMB enhance oversight, to which the agency has yet to respond. GSA agreed with GAO’s recommendations that it needs to improve guidance and monitoring.

HHS agreed with GAO’s findings and USAID generally agreed, but EPA generally disagreed, arguing that one system selected for review was not used in agency operations.

Watchdog predicts ‘ongoing challenges’ collecting administrative fees for TMF

The program management office of the Technology Modernization Fund isn’t collecting fees fast enough to cover its administrative costs, a new review by the Government Accountability Office finds.

The TMF program office, housed within the General Services Administration, has obligated $1.2 million for operating expenses — salaries and the like. The seven projects to which the TMF has given funding are collectively expected to pay this back over the next few years, but so far the office has only received $33,165.

This is less than GSA had planned to collect, GAO says, for various reasons, including the fact that agencies don’t have to start paying immediately. On top of that, two projects — the Department of Agriculture’s infrastructure optimization project and the Department of Energy’s cloud email project — have reduced their scope, and with it the funding they need and the fees they will be expected to pay back.

“Such factors raise doubts on whether GSA will be able to fully recover future operating expenses,” GAO writes. At this rate, the watchdog says, the TMF won’t see its $1.2 million until 2025.

It also means that the TMF board has less money to invest in other agency projects, GAO says.

Administration officials at the White House Office of Management and Budget, however, place blame for any shortcoming in funding squarely at the feet of Congress. In a written response to the report, OMB Deputy Director for Management Margaret Weichert says the audit “paints an incomplete picture of the TMF.”

“OMB disagrees with GAO’s characterization of the repayment process to the TMF and the assumptions about potential insolvency of the fund,” Weichert writes.

This is one central disagreement to emerge from a very detailed GAO report documenting all the money that has passed through the TMF to date.

The watchdog is also concerned, however, with one of the central selling points of the TMF: that it will allow agencies to save money by moving away from expensive legacy IT systems. The report states that “none of the seven TMF-funded projects’ cost savings estimates can be considered reliable,” which means “it is not clear whether the projects receiving funding to date will save the government as much money as was estimated.” The way forward, GAO says, is for GSA and OMB to clarify the rules around their cost estimating processes.

To this end and others, the GAO report offers five recommendations — two for OMB and three for GSA. GSA agreed with one of the recommendations and partially agreed with the other two.

The TMF was created by the Modernizing Government Technology (MGT) Act, which President Trump signed into law in December 2017. In March 2018 OMB convened a seven-member board tasked with picking awardees of the fund.

In fiscal 2018 the TMF received $100 million, and in fiscal 2019 it got an additional $25 million. Supporters argue that the money, which recipient agencies pay back over five years, helps agencies work on large-scale modernization projects. But convincing Congress to put money into the pot hasn’t proven easy.

Thus far, out of its secured appropriations, the fund has awarded a total of $90 million to seven distinct projects — two each at the U.S. Department of Agriculture and the General Services Administration, and one apiece at the departments of Energy, Housing and Urban Development, and Labor. Of that $90 million, the fund has transferred around $37 million to the agency projects.

How HHS Can Use Social Data for Better Healthcare

The social determinants of health (SDOH) — which include income, education, housing, environment, and food availability — are attracting increasing attention across the healthcare industry and in the U.S. Department of Health and Human Services (HHS).

Public health experts have realized that a person’s ZIP code can be just as important as his or her genetic code in predicting health risks. Simultaneously, HHS has identified social interventions as an important part of the Department’s mission. HHS Secretary Alex Azar has said that the “social determinants would be important to HHS even if all we did was healthcare services…but in our very name and structure, we are set up to think about all the needs of vulnerable Americans, not just their healthcare needs.”

In October, the HHS Office of the Chief Technology Officer (CTO) and the independent nonprofit Center for Open Data Enterprise (CODE) co-hosted the Roundtable on Leveraging Data on the Social Determinants of Health. The roundtable brought together more than 80 experts from inside and outside of government to explore new strategies for combining SDOH data with research, clinical, and public health data, and to help develop a new data-driven paradigm for socially focused healthcare. Today, CODE published its independent report on the roundtable with recommendations for public-private collaboration to advance that new paradigm.

As Assistant Secretary for Health Adm. Brett Giroir says in the foreword to the CODE report, the need for a new approach is urgent. “Our healthcare spending is tremendously high, accounting for almost 18% of our GNP and potentially going up to $6 trillion by 2027. And we’re not getting our money’s worth for those expenditures,” he writes. “America needs a new approach to healthcare, and work on the social determinants of health can be the foundation of that approach.”

The report describes many ways that different organizations are using SDOH data and illustrates them with a dozen case studies. Healthcare providers, for example, are collecting SDOH data in clinical settings to improve patient care. Policymakers are using population-wide SDOH data to target healthcare funding. And initiatives such as Healthy People 2020 and the Gravity Project are expanding the use of SDOH data for many applications.

CODE’s report calls for a national SDOH Data Strategy that can provide the policy framework to improve interoperability and data access, address privacy concerns, and broadly set the agenda for improved use of SDOH data. Based on input from roundtable participants and CODE’s own research, the report includes several specific recommendations that such a strategy could implement.

CODE’s report comes at a time of increased momentum for the use of social determinants data. The Social Determinants Accelerator Act, proposed by Rep. Cheri Bustos, D-Ill., this year, aims to help states better measure the outcomes of their social determinants investments by providing planning grants and technical assistance. Moreover, the National Library of Medicine recently implemented the first free FHIR-enabled questionnaire, announced at the roundtable by former White House CTO Aneesh Chopra, to gauge the social determinants of health in a clinical setting. FHIR, which stands for Fast Healthcare Interoperability Resources, is a standard that enables quick exchange of healthcare information available on electronic health records.
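
As a rough illustration of how FHIR packages this kind of screening content, the sketch below builds a minimal FHIR R4 Questionnaire resource for SDOH questions and serializes it to the JSON form that FHIR-enabled systems exchange. The questions, answer codes, and linkIds are invented for this example; it is not the NLM questionnaire announced at the roundtable.

```python
import json

# Minimal, hypothetical FHIR R4 Questionnaire for SDOH screening.
# The linkIds, questions, and answer codes below are illustrative only.
sdoh_questionnaire = {
    "resourceType": "Questionnaire",
    "status": "draft",
    "title": "Example SDOH screening questionnaire",
    "item": [
        {
            "linkId": "housing-status",
            "text": "What is your current housing situation?",
            "type": "choice",
            "answerOption": [
                {"valueCoding": {"code": "stable", "display": "I have stable housing"}},
                {"valueCoding": {"code": "at-risk", "display": "I am worried about losing my housing"}},
            ],
        },
        {
            "linkId": "food-insecurity",
            "text": "In the past 12 months, did you worry that your food would run out?",
            "type": "boolean",
        },
    ],
}

# Printing (or POSTing to a FHIR server's /Questionnaire endpoint) exchanges the
# resource in the standard JSON representation other systems can consume.
print(json.dumps(sdoh_questionnaire, indent=2))
```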

At the roundtable, former Acting Assistant Secretary for Health Dr. Karen DeSalvo noted that “SDOH Data is bigger than a single sector and will require ongoing collaboration between the public and private sectors. It will also require the involvement of CBOs [community-based organizations] that have access to different, more granular data sources and have their own goals for data use.” She described principles that can guide the development of a better SDOH data culture, including building open ecosystems, engaging with partners strategically, and keeping the end in mind. New partnerships between healthcare providers and CBOs, collaborative efforts to provide healthcare to rural communities, and other initiatives are showing how stakeholders can work together to build a new, data-driven paradigm for socially focused healthcare.

Joel Gurin is President, and Paul Kuhne is Roundtables Program Manager, at the Center for Open Data Enterprise (CODE). CODE’s report is available online.

As the Cyber Reskilling Academy’s second cohort moves on, trainers reflect on the impact

Participants in the Federal Cyber Reskilling Academy’s second cohort haven’t all landed government cybersecurity jobs yet, but the contractor that handled the training is sharing how it benchmarked and tracked their progress.

Comtech Telecommunications rated applicants’ potential to succeed in a cyber career prior to their admission into the cohort and for the first time administered hands-on CYBRScore skills assessments throughout the eight-week instructional period. The CYBRScore assessment was tailored to the cyber defense analyst (CDA) work role defined in the National Initiative for Cybersecurity Education (NICE) Cybersecurity Workforce Framework.

But first, the pool of 600 federal employees who applied to the academy needed to be whittled down to 20 students across 10 civilian agencies and the Department of Defense. The Office of Management and Budget and the Federal CIO Council didn’t require an IT or cyber background to apply, so Comtech asked general questions measuring 44 characteristics in the realms of aptitude, attitude and career interest.

Aptitude questions consisted of basic science, technology, engineering and math problems, while attitude questions gauged an applicant’s penchant for repetitive tasks and preference for working alone or in groups.

“We try and get those discerning features characterized up front, so at the end of the day, the cohort would be the best fit for success going through the overall program,” Alan Gush, academy director of CYBRScore, told FedScoop.

For instance, the CDA work role required candidates be committed to accuracy when interpreting data, but not so much so that they couldn’t complete assignments. Those characteristics were weighted more heavily.
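
As a rough sketch of how weighted characteristic scoring like this can work, the example below combines a few hypothetical characteristic scores into a single fit score. The characteristic names and weights are invented for illustration; they are not Comtech’s actual 44-characteristic model.

```python
# Hypothetical weights for a handful of applicant characteristics (0-1 scores).
WEIGHTS = {
    "stem_aptitude": 0.4,        # basic science/technology/engineering/math problems
    "detail_orientation": 0.3,   # commitment to accuracy when interpreting data
    "tolerance_for_repetition": 0.2,
    "career_interest": 0.1,
}

def fit_score(responses: dict[str, float]) -> float:
    """Combine per-characteristic scores into one weighted fit score."""
    return sum(WEIGHTS[name] * responses.get(name, 0.0) for name in WEIGHTS)

applicant = {
    "stem_aptitude": 0.9,
    "detail_orientation": 0.7,
    "tolerance_for_repetition": 0.8,
    "career_interest": 1.0,
}
print(f"fit score: {fit_score(applicant):.2f}")  # 0.83 for this example
```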

Once the cohort was selected, subsequent CYBRScore assessments — conducted in a real-world virtual environment using open-source tools to do the data gathering — evaluated CDA knowledge, skills, abilities and tasks (KSATs) as laid out in the NICE framework. That meant no true-false, multiple-choice or fill-in-the-blank questions.

The first assessment baselined each student’s KSATs. Then CYBRScore divided the CDA work role into five functional areas for a total of five weekly assessments: protocol analysis, network defense analysis, network attack analysis, incident detection and incident handling.

“That’s a little bit nerve-racking, but it’s similar to…taking a certification exam,” said Erik Wallace, director of business development for CYBRScore.

An assessment might inform the student a hacker accessed the system and ask them to determine what happened leading up to the breach and identify the piece of data used in the intrusion, as well as the date it transpired. Or the participant might be asked to enumerate a network — identifying hosts by their internet protocols, noting their operating systems and which ports are potentially open.
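
For a sense of what the enumeration task involves, here is a minimal sketch of a TCP connect sweep that checks a few hosts for commonly open ports. The address range and port list are hypothetical, and the academy’s assessments run in a purpose-built virtual lab with open-source tooling rather than a script like this.

```python
import socket

# Hypothetical host range and well-known ports for illustration only.
HOSTS = [f"192.168.1.{i}" for i in range(1, 6)]
PORTS = [22, 80, 443, 3389]

def is_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # connection refused, unreachable, or timed out
        return False

for host in HOSTS:
    open_ports = [p for p in PORTS if is_open(host, p)]
    if open_ports:
        print(f"{host}: open ports {open_ports}")
```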

Hundreds of data points were scored in real-time throughout each assessment, with the scores provided to students immediately after for them to track their progress. Scores were also used to personalize learning plans, with participants directed to specific training sets or hands-on virtual labs in areas where they were found to have skills gaps.

On the whole, the 19 participants that completed the academy showed “amazing gains in proficiency” between their baseline and final CYBRScore assessments, Gush said. In total, 16 of the 19 academy graduates finished in a higher experience tier than when they began.

Graduates received certificates of completion Sept. 20, and since then 20% have been selected for cyber positions within their original agency or another, Gush said. Other graduates may have pursued developmental rotation assignments or taken on additional cyber responsibilities within their current roles.

If OMB and the CIO Council opt to do a third cohort of the Reskilling Academy, Comtech would have to bid to run the program like any other company. But cyber aptitude assessments come recommended.

OMB has access to all the applicant data generated by CYBRScore through its contracting partner, the General Services Administration, for future iterations. And Federal CIO Suzette Kent hinted at their expansion in September.

CYBRScore would be interested in supporting a third cohort for OMB or training for any other federal agency — and other agencies have expressed interest, Gush said. Wallace said he would expect “slight tweaks” to the weighting of future assessments.

Agencies can find the offering on GSA’s IT Schedule 70 contract.

“So it is available as a distinct program, if agencies are interested in building out that capacity,” Gush said.

Army NETCOM doles out $118M to GDIT for IT, cyber support

The Army’s IT service provider for network communications will receive cybersecurity support and specialized staff from General Dynamics Information Technology through a $118 million task order award announced Thursday.

Network Enterprise Technology Command, a subordinate command of Army Cyber Command, is based at Fort Huachuca in Sierra Vista, Arizona, and sought help with a number of enterprise IT issues.

GDIT won the single-award task order and will provide security compliance, real-time network monitoring, sensor grid monitoring, incident triage, and patching support for the Army’s portion of the Department of Defense Information Network.

The Army issued the task order via its Alliant 2 Cybersecurity and Network Operations Mission Support contract. There’s a five-month base period, four 12-month options and a six-month extension on the task order.

DHS announces winners of opioid detection challenge

The Department of Homeland Security’s Science and Technology Directorate announced the winners Thursday of its $1.5 million opioid detection challenge.

IDSS, an airport security scanning company based in Armonk, New York, won $500,000 for its solution, which combines a 3D X-ray scanner with “automated detection algorithms.”

The runner-up, One Resonance, won $250,000 for a solution that uses radio frequency to search for illicit substances.

“The influx of illicit drugs is one of the nation’s greatest threats,” William N. Bryan, DHS senior official performing the duties of undersecretary for science and technology, said in a statement. “Through this combined effort to address the trafficking of opioids, S&T, our federal partners, and the private sector have produced technology solutions that will better protect the American people from the effects of this devastating crisis.”

The opioid detection challenge launched in February as a partnership between DHS S&T, U.S. Customs and Border Protection (CBP), the United States Postal Inspection Service (USPIS) and the Office of National Drug Control Policy (ONDCP). The stated goal was to find “novel, automated, nonintrusive, user-friendly and well-developed” ideas for tools and technologies that can detect opioids in the mail and thus disrupt their flow.

The broader goal, of course, is to combat the ongoing deadly opioid epidemic, a public health crisis that claimed around 50,000 lives in 2017.

In June, the challenge organizers chose eight finalists and gave each $100,000. The challenge culminated Thursday with an event and a live test of the technologies at the DHS Transportation Security Laboratory (TSL) in Egg Harbor Township, New Jersey.

For Veterans Experience Office, sharing CX best practices is a ‘privilege and a duty’

It’s no secret that the Department of Veterans Affairs has become increasingly focused on improving the experience that veterans have while interacting with the various tentacles of this massive agency.

In January 2015, the VA launched its Veterans Experience Office (VEO). In March 2018, when the Trump administration debuted its President’s Management Agenda, it named the VA as a leader on a cross-agency priority goal to improve customer experience (CX). In May 2019, the VA added CX to its “Core Values and Characteristics” — the guidelines in the Code of Federal Regulations that define the priorities of VA employees as well as what the agency stands for.

Now, the VEO is looking to establish a Customer Experience Institute as a way to formalize and “sustain” CX collaboration across the federal government.

Development of the institute is still in a very early phase — the exact curriculum and organizational structure are yet to be determined. But VEO’s Barbara Morton says she sees it as a “privilege and a duty” to share what the VA has learned about CX with other agencies across the government.

People are “really, really hungry” to learn more about customer experience, Morton told FedScoop. But implementing some of these practices can be challenging.

“Nobody has to start from scratch,” Morton said. “If you are at ground zero in an agency and you know you need to establish a customer experience capability, you know you want to, I don’t want that person to be hanging out on an island by themselves not knowing where to start. There are resources, there are strategies, there are things that my team can share.”

Morton was careful to note that the VA’s approach to CX isn’t one-size-fits-all. That said, she hopes the institute will be able to share basic principles that are widely applicable. And the VA, having put in the effort over the years, is a good home base for such an institute.

“We have credibility in this space because we’ve actually been able to build and mature customer experience capabilities,” Morton said.

So what kind of message will the institute focus on bringing to agencies and agency employees across the government?

Essentially, Morton says, it’s time to start thinking about CX as more than just a “nice to have.”

“In government we have an important opportunity to expand the way we think about measures of success,” she said. Traditionally, government measures success via operational metrics rather than experience metrics. And while metrics like the number of appointments scheduled are certainly important, “if the experiences that we’re providing are horrible, are the operational metrics really that important? It’s a question for us all to kind of chew on.”

“It’s really exciting,” Morton said.

USPS teams with Google Cloud for call center relief

The U.S. Postal Service plans to reduce wait times on about 80 million customer calls fielded annually through a partnership with Google Cloud announced Thursday.

USPS awarded Carahsoft — Google Cloud’s authorized distributor for public sector clients — a cloud contract with a $50 million ceiling covering customer experience and mail delivery solutions.

Increased delivery competition from new carrier services like Amazon has the postal agency reinventing itself, and call center customer experience is “one of the bigger pain points,” Mike Daniels, vice president of global public sector at Google Cloud, told FedScoop.

“Their call wait times are very unacceptable. They’re limited in what they can do with respect to staffing; you can’t just throw more people at it,” Daniels said. “It’s super expensive and risk fraught to tear out the agent console.”

The world’s largest mail delivery operation, USPS has about 630,000 employees, but its call center can’t handle current call volume.

In November, the agency launched its first Google Cloud pilot aimed at answering customers’ questions about obtaining passports.

Google Cloud’s Dialogflow Enterprise Edition creates artificial intelligence virtual agents (AIVAs) that can respond to customers more quickly than USPS employees.

AIVAs use the same conversational technology as Google Assistant and will eventually be incorporated into the agency’s call center technical architecture. That means customers will be able to access AIVAs by calling the agency, visiting its website or even texting.

Dialogflow will also be used to establish conversational interfaces across the agency’s websites, mobile applications, messaging platforms and Internet of Things devices. Future iterations will let customers look up ZIP codes and nearby post offices and get instructions on shipping valuable items and tracking packages.
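
To show roughly what an AIVA exchange looks like behind the scenes, here is a minimal sketch that uses Google Cloud’s Dialogflow Python client to send a customer utterance to a virtual agent and read back the matched intent and reply. The project ID, session ID, and sample question are placeholders, not details of USPS’s actual integration.

```python
from google.cloud import dialogflow  # pip install google-cloud-dialogflow

def ask_virtual_agent(project_id: str, session_id: str, text: str) -> str:
    """Send one customer utterance to a Dialogflow agent and return its reply."""
    session_client = dialogflow.SessionsClient()
    session = session_client.session_path(project_id, session_id)

    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code="en-US")
    )
    response = session_client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    result = response.query_result
    print(f"matched intent: {result.intent.display_name}")
    return result.fulfillment_text

# Placeholder identifiers for illustration only.
print(ask_virtual_agent("my-gcp-project", "caller-1234", "How do I renew my passport?"))
```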

Forthcoming pilots will not be limited to customer experience solutions, Daniels said.

“Postal is really looking to transform everything they’re doing from a production, back-end standpoint,” he said. “From true production systems that deliver the mail to customer interaction systems.”