Agencies gain ‘momentum’ appointing Evidence Act leadership

The government has seen “momentum” around evidence-based policymaking, with the majority of agencies having placed senior officials in charge of advancing data-driven decision-making, according to the Evidence Team lead at the Office of Management and Budget.

All agencies submitted their interim learning agendas, first annual evaluation plans and interim capacity assessments in September, as mandated by OMB guidance stemming from the Foundations for Evidence-Based Policymaking Act, said Diana Epstein.

More than two years after the passage of the Evidence Act, relatively few agencies lack the leadership needed to implement its requirements.

“For the most part agencies have named their designated officials: the evaluation officers, the statistical officials and the chief data officers,” Epstein said during a Data Foundation event Thursday. “The councils for each of these officials have been meeting regularly, and we’ve had some great cross-council collaboration.”

The Evaluation Officer Council meets monthly and works regularly with the Federal CDO Council and Performance Improvement Council. Meanwhile, Epstein’s team and the Office of Evaluation Sciences within the General Services Administration hold a monthly Evaluation and Evidence Training Series for hundreds of federal employees. The Interagency Council on Evaluation Policy was also rebooted and expanded.

OMB provided detailed feedback on agencies’ draft documents, and some have already published their evaluation plans on their websites as required.

“The last thing we want is for this to be yet another compliance or reporting exercise where agencies just put in minimal effort, check the boxes and nothing really changes,” Epstein said.

Agencies are expected to submit their first full learning agendas — identifying priority questions about programs, policies and regulations that can be answered with data — and capacity assessments as part of their strategic plans next fall. That’s on top of their fiscal 2023 evaluation plans. 

The Biden administration recently reaffirmed government’s commitment to evidence-based policymaking with its Memo on Restoring Trust in Government, which will see OMB release additional guidance in the coming months.

“We still have a long way to go,” Epstein said. “But it’s very exciting to see all the progress that we’re making collectively.”

Final CMMC rule expected to be finished in about a month

The final Defense Federal Acquisition Regulation Supplement (DFARS) rule that will require all contractors to have third-party inspections of their networks prior to working with the Department of Defense will get its final tweaks within the next 30-40 days, the program’s lead official said Thursday.

The interim final rule for the Cybersecurity Maturity Model Certification (CMMC) that was published in September received many comments from industry that the DOD has been working to adjudicate, said Katie Arrington, the department’s chief information security officer for acquisition and sustainment. She said the team is working to make the rule “go final” in about a month.

“You shouldn’t be waiting to build [cybersecurity] costs in” to rates, Arrington said to contractors during a Deltek webinar.

The interim final rule put CMMC into effect in December but had an open comment period for industry to give feedback to the government. As the CMMC program management office works through feedback, it has been tweaking the rule.

Issuing an interim final rule is not the norm but was needed because of the importance of securing industrial base contractors, Arrington said. CMMC is the department’s latest attempt to shore up cybersecurity across the industrial base, which has been vulnerable to massive breaches of government information down the supply chain.

One of the biggest questions about the rule has been about reciprocity between CMMC and other federal cyber compliance programs. Arrington didn’t say what reciprocity may be coming but said that there will be guidance in CMMC Assessment Guides the DOD is working on.

Other parts of the CMMC DFARS rule will affect contractors before they are required to get an assessment. Under the rule, they now need to submit a self-assessment of their cyber compliance to the DOD. That process is separate from the CMMC assessment but could help companies prepare for their inspection by effectively giving themselves a test first.

“The only thing they need to wait for is for the assessor to be aligned with the [third party assessment organizations],” Arrington said. No organization has been fully cleared yet to give assessments.

MetTel becomes latest EIS vendor to receive managed security services authorization

The government gave MetTel permission to provide Trusted Internet Connections 3.0-compliant managed security services through the Enterprise Infrastructure Solutions contract, the telecommunications company announced Thursday.

Together, the General Services Administration and the Cybersecurity and Infrastructure Security Agency granted MetTel an authority to operate (ATO) for Managed Trusted Internet Protocol Service (MTIPS).

MTIPS secures agencies’ internet traffic by consolidating the number of external connections needed, which shrinks the .gov attack surface and makes traffic easier to monitor.

“GSA recognized the MetTel team’s capability to provide MTIPS security services with a modern and modular structure that will provide new benefits for agencies,” said Robert Dapkiewicz, senior vice president and general manager of MetTel Federal, in the announcement. “The constant attempt by cybercriminals to penetrate government websites is showing no sign of slowing down.”

MTIPS covers additional cyber services like continuous monitoring, which in MetTel’s case will be provided by its EIS security partner Raytheon. Other services include network intrusion detection, hosted Domain Name System sink holing, and email scanning and filtering — all of which send real-time data to a security operations center.

One of eight primes on the government’s $50 billion EIS contract for telecom and network modernization, MetTel is the only non-incumbent local exchange carrier to build its own MTIPS infrastructure.

Only three other EIS primes — AT&T, Lumen and Verizon — have completed the TIC Assessment and Authorization Process to receive their MTIPS ATOs. BT Federal, Core Technologies, Defined Technologies and Granite Telecommunications were awarded MTIPS through EIS but have yet to complete the process.

Military-wide data requirements document coming soon, Joint Chiefs’ Hyten says

The nation’s No. 2 general said Wednesday that by the end of spring the military will get a “strategic directive” defining data requirements that will lay the foundation for how the Department of Defense will use data at scale.

The new document will define several technical requirements for networks and data standardization that will be used to implement a common data architecture across the force, said Gen. John Hyten, vice chairman of the Joint Chiefs of Staff. Developed by the Joint Requirements Oversight Council (JROC) which Hyten leads, the document will define data requirements for all the services with the hope of enabling the type of rapid data-sharing and processing needed to field modern concepts of operations and artificial intelligence-enabled warfare.

“We have a chance to actually stay ahead of our adversary…to dominate data,” Hyten said during the 5G Tech Summit hosted by AFCEA DC.

The “Information Advantage Strategic Directive,” as Hyten dubbed it, will be one of many critical documents coming out of the Joint Staff in the coming months that relate to Joint All Domain Command and Control (JADC2) — the overarching concept of operations where a military Internet of Things is born out of the ability to fuse data across the domains of military operations. The goal is to increase lethality by converging operations and fielding force-multiplying technologies like AI that will speed up decision making based on real-time data from the field.

“This is an unbelievably challenging process,” Hyten said of creating the data standards and military-wide requirements needed to enable JADC2 operations.

On top of the technical challenges, Hyten said he hopes the guidance will help overcome cultural and security barriers to data sharing. Classification levels have stymied the military’s ability to widely share data. Stringent security practices have become muscle memory for some, even when working with less sensitive data that doesn’t need high levels of security.

The requirements will also take into account many of the technical challenges operators face in the field. Rural outposts with limited connectivity can’t send massive packets of data, a bandwidth constraint that Hyten said was top of mind when thinking about the requirements for enterprise networks.

“If we push these huge packages of data with 5G…then at the edge what data do we push?” he said, referring to how new 5G networks the military is experimenting with could allow for much more data to be transferred. “That is an unbelievably complicated problem.”

First TIC 3.0 use cases finalized

The Cybersecurity and Infrastructure Security Agency on Wednesday released the first finalized Trusted Internet Connections 3.0 use cases, which help agencies secure external connections to federal networks.

The Traditional TIC Use Case details the “castle-and-moat” security architecture that most major agencies have used for a decade, while the Branch Office Use Case outlines networking directly to the cloud or an external trust zone — rather than directing internet traffic through a TIC access point or headquarters first.

CISA released draft versions of the two use cases in December 2019, but the November presidential election delayed final approval by the Federal Chief Information Security Officer Council until 2021.

TIC Program Manager Sean Connelly said zero-trust and partner research and development use cases might also come in 2021. And CISA already plans to release infrastructure-as-a-service (IaaS), software-as-a-service (SaaS), platform-as-a-service (PaaS), and email use cases at some point.

The remaining guidance rounds out CISA’s effort to support multiple architectures for securing agency networks, as agencies increasingly move their data to the cloud and their users off premises during the COVID-19 pandemic.

A draft Remote User Use Case released in December replaced the Interim Telework Guidance that CISA issued in April 2020 in response to vendor requests for help supporting agencies through the pandemic surge in telework. A draft Volume 2 of the National Cybersecurity Protection System (NCPS) Cloud Interface Reference Architecture (NCIRA) was released at the same time, providing an index of common cloud telemetry reporting patterns and characteristics so agencies can send cloud-specific data to the NCPS cloud-based architecture.

Finalized versions of the initial TIC 3.0 core guidance — the Program Guidebook, Reference Architecture: Volume 1 and Security Capabilities Catalog — were released in July. The first two documents will remain fairly static, while the catalog is a living document that adds capabilities and controls to use cases as they’re announced.

Why federal cybersecurity teams are prioritizing asset management

Jake Munroe is a product marketing manager at Axonius and has held various roles across the security space in consulting, marketing, and sales. Prior to joining the private sector, Munroe served as a Navy Intelligence Analyst with an extensive background in counterterrorism, cyber threat intelligence, and open-source intelligence investigations.


Before the pandemic, there was significant urgency to improve network visibility. Accommodating remote workers added to that urgency by introducing more threat vectors. This prompted federal agency leaders to think about security risks in new ways and to prepare for when the workforce returns to the office.

There are many variables to consider — among them, loosened bring-your-own-device policies. While these policy shifts were necessary to accommodate an expanded remote workforce, they’ve left agency IT leaders grappling with significant security gaps. What’s more, as those previously remote devices return to physical offices, they’ll perpetuate the massive visibility challenges CIOs and CISOs already face.

IT’s innately complex landscape can make it hard to answer basic questions about asset management. A new approach aims to solve three challenges federal security teams are facing: understanding what assets an agency has, identifying the security gaps associated with those assets, and taking action to enforce security policies.

Getting a credible asset and user inventory

Asset management is foundational to compliance, and agencies are tasked with adhering to many key regulations and frameworks: the NIST Cybersecurity Framework, CIS 20, CDM and others. Even so, asset management remains a challenge for many agency IT teams, leaving them with basic questions about what assets they own, who is using them and whether they are secure. These are the questions agency IT teams want quick answers to, yet so often struggle to resolve.

Traditionally, agency security teams are intimately familiar with security tools, and IT teams familiar with asset management tools — but there’s limited crosstalk between these data sources. Additionally, traditional approaches to compiling an asset inventory are typically time consuming and error prone, often requiring manual input into spreadsheets listing the physical devices, software and licenses across various departments. As soon as an inventory is compiled, it quickly becomes obsolete.

CISA’s Continuous Diagnostics and Mitigation (CDM) program has been active for nearly a decade to help fortify cybersecurity of agency networks. Even so, many agencies are still struggling to enforce a holistic strategy around network visibility.

The cybersecurity asset management approach provides agencies with enhanced visibility into assets and issues, helping them achieve compliance with key regulations.

Axonius is a cybersecurity asset management platform that discovers all of the assets in an environment, and then helps agencies validate compliance and automate remediation. To do this, Axonius uses adapters, not agents or scanning, to connect to over 300 security and management tools, allowing users to collect and aggregate data from across the entire organization.
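To illustrate the adapter model in general terms, the sketch below shows what aggregation amounts to: pulling device records from each connected tool and correlating records that describe the same asset. This is a minimal, hypothetical example, not Axonius’s actual API; the tool names, record fields and correlation logic are illustrative assumptions.

```python
# Hypothetical sketch of adapter-based asset aggregation: each "adapter" pulls
# device records from one tool, and records describing the same device are
# merged into a single inventory entry keyed by a stable identifier.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class AssetRecord:
    hostname: str
    sources: set = field(default_factory=set)       # which tools reported this asset
    attributes: dict = field(default_factory=dict)  # merged details from all tools

def aggregate(adapter_results: dict[str, list[dict]]) -> dict[str, AssetRecord]:
    """Merge per-tool device lists into one deduplicated inventory."""
    inventory: dict[str, AssetRecord] = {}
    for tool_name, devices in adapter_results.items():
        for device in devices:
            key = device["hostname"].lower()          # simplistic correlation key
            record = inventory.setdefault(key, AssetRecord(hostname=key))
            record.sources.add(tool_name)
            record.attributes.update(device)          # later adapters enrich the record
    return inventory

# Hypothetical adapter output: each tool sees an overlapping slice of the environment.
adapter_results = {
    "edr_tool":     [{"hostname": "LAPTOP-01", "edr_agent": True}],
    "vuln_scanner": [{"hostname": "laptop-01", "last_scan": "2021-03-30"}],
    "mdm":          [{"hostname": "TABLET-07", "owner": "jdoe"}],
}
inventory = aggregate(adapter_results)  # LAPTOP-01 collapses into a single record
```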

Discovering security gaps

At Axonius, there are typically three things we recommend to agencies aiming to improve their cybersecurity posture:

  1. Start by compiling and assessing their asset inventory
  2. Discover coverage gaps with the data collected
  3. Enforce security policies

But taking action on those steps can be challenging when agencies use traditional, manual approaches to aggregate an asset inventory. It’s hard to get a full picture of both users and devices across the various tools agencies own, especially when those tools don’t communicate with each other.

A platform that provides data aggregation across all IT security and management offerings can help agencies build and maintain an active asset management system. From there, agencies can identify coverage gaps and take steps to automate and enforce security policies.
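Continuing the hypothetical inventory sketch above, a coverage-gap check over aggregated asset data reduces to a simple comparison: find every asset that a given tool never reported. The tool name below is an illustrative assumption.

```python
# Continues the hypothetical inventory sketch above.
def missing_from(inventory: dict, required_tool: str) -> list:
    """Return assets that the given tool never reported, e.g. devices with no
    endpoint agent installed or no recent vulnerability scan."""
    return [record for record in inventory.values()
            if required_tool not in record.sources]

# Devices the vulnerability scanner has never seen become candidates for a scan,
# an alert or an automated enforcement action.
unscanned = missing_from(inventory, "vuln_scanner")
```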

Federal security teams are seeing a range of benefits from cybersecurity asset management. Perhaps the most important is one that can’t be quantified: confidence. Visibility into assets and gaps gives security teams the tools they need to be more confident in their ability to comply with regulations and keep their agencies secure.

Take action to close security gaps

Because Axonius aggregates data from all available sources, it gives federal security and IT teams the ability to send alerts, run search queries, and enforce automated actions.

Agencies have put these capabilities to use in a range of ways. The suddenly remote workforce only added to an already rapidly changing operating environment for federal IT and security teams, and that complexity underscores the importance of combining asset management, endpoint security, vulnerability assessment and real-time enforcement in one view.

Learn more about why asset management matters for federal cybersecurity teams.

Government agencies lean into managed services to stay ahead of IT demands

Seismic shifts in technology solutions and a need for modern IT and security skills are prompting a sizeable portion of federal agencies to bring in managed service providers (MSPs) to advance their IT needs, according to a new survey of federal agency IT, business and program executives.

Two in three respondents in the survey, conducted by FedScoop and underwritten by GDIT, say their agency is currently using, planning to use or considering retaining an MSP to support their IT work.


Read the full report.

Though the reasons driving agencies to use MSPs vary, one key undercurrent is the pressing need for cost efficiencies, which leads agencies to rely on MSPs with experience deploying cloud, data and cybersecurity solutions. At the same time, the reasons to work with MSPs have grown more nuanced in recent years.

There are a number of factors motivating agencies to take a fresh look at MSPs. Reducing long-term IT costs ranks highest, cited by six in 10 respondents. But four in 10 respondents also cite the importance of focusing more attention on mission and less on operations as a driving force. A third of respondents indicate that the need to gain greater operational agility and automation, and to reduce risk, is also a big motivator.

Executives in the survey identify a variety of core areas where MSPs provide value. Most respondents (65%) point to staffing and support, but 54% cite security and 44% mention compliance as key areas where MSPs are filling critical support gaps. Another four in 10 respondents say the need for training and change management, and for supporting continuous improvement and innovation, are core areas where MSPs make a difference.

The types of specific managed IT services demanded by agencies are also in flux.

Asked which IT services agencies had handed over to an MSP over the past five years, help desk services ranks at the top of the list. Over the next five years, in contrast, agencies will look to MSPs most for help with cloud infrastructure services, followed by backup and recovery, cybersecurity, data analytics, networks/infrastructure and software development.

Growing importance of MSPs

Managed service providers have historically been viewed by federal agencies through the lens of how well they reduce total IT costs while still allowing agencies to control their IT outcomes. The findings suggest that MSPs are likely to play an increasingly important role in helping agencies manage their overall IT infrastructure, operations and security, much as commercial enterprises now rely on MSPs to keep their IT up to date.

Given the mission benefits of accelerating IT modernization and improving service delivery, the study findings show that agencies are coming to recognize that even if moving to an MSP is cost-neutral, or costs slightly more, it may be well worth it.

Based on the completed responses of 162 pre-qualified federal agency executives with mission, business or IT decision-making responsibilities, the study also explores which factors of success are most important in considering an MSP.

Technical expertise and experience, the ability to fully assess and support an agency’s specific needs, and predictable costs are all seen as key factors in successful MSPs. So are the transparency of service level agreements and the ability to adopt or integrate new and emerging technologies downstream once an agency contracts with an MSP.

Managed service models are also continuing to evolve. Rob Smallwood, vice president of digital modernization at General Dynamics Information Technology, a leading MSP in the federal market, suggests it’s important to look for MSPs with comprehensive expertise at every level of the IT stack. However, agencies should also look for MSPs that are flexible enough to provide selective, augmented or hybrid support.

“Managed services is about outcomes — what benefits to their enterprise the agency wants to achieve. By focusing on outcome-based managed services, the provider can enable the organization to ultimately achieve what they’re looking for, like cost savings, improved operations, increased value and freeing agency personnel to focus on their mission instead of the IT,” Smallwood said. “That’s also why flexible or hybrid models are preferred by agencies, because they’re working together with the provider to achieve those outcomes.”

The study also concludes that agencies should take steps to understand and fully identify many of the associated support costs that often aren’t transparent in their IT budgets before comparing proposals from MSPs.

Download the full report, “Managed Services: Powering Federal IT” for the detailed findings.

Join GDIT experts as they discuss the study findings and the journey to managed services in a virtual webinar on May 11: Guided Journey to Outcome-Based Managed Services

This article was produced by FedScoop and underwritten by GDIT.

Navy planning 4 major tests for network integration in 2021

The Navy plans to test its network integration capabilities — a key pillar of its modernization plan — at least four times this year, the chief of naval operations said Wednesday.

Adm. Mike Gilday described the tests as “big spirals” where siloed networks are combined into a “network-of-networks” operation to give the service an Internet of Things-like capability.

These tests — part of the wider military’s overarching Joint All Domain Command and Control (JADC2) concept — won’t be the first the Navy has undertaken. But Gilday said they will allow the Navy to put more data through its systems and unify more platforms with new applications at a greater scale.

The tests will “allow us to bring more networks into that network-of-networks construct,” Gilday said during a Center for a New American Security event. “So that’s testing more data on more networks and introducing more battle management aids to put the end-user in a position where they can see the battlespace better.”

Network integration is one of four core pillars in the Navy’s approach to JADC2, along with agile software development, common data standards and battle management applications. All of these efforts are hosted under Project Overmatch, the Navy’s JADC2 implementation program.

“The reason why this is so important is, first off, we need to maintain decision advantage over the adversary,” he said of the tests and overall JADC2 strategy.

Announcing the tests beforehand is new for the Navy, which has been tighter-lipped about its progress with JADC2 implementation than the other military services. Previous major tests that the Navy hosted within the service and in joint operations with other parts of the military have come to light after the fact.

Another recent change is that Project Overmatch got a new boss, with Rear Adm. Douglas Small leading a “robust” team of “technically savvy” civilians, Gilday said. Part of a reorganization, this gives Small added authorities and more central control of implementing the highly technical changes to network operations and data standards.

The team recently launched a battle management application on the USS Carl Vinson that Gilday praised as the first of many to help commanders interact and see data more clearly.

“The applications that we are applying now are much like the applications in your phone,” he said.

NIH’s COVID-19 data enclave continues to evolve with the virus

The National Institutes of Health is prototyping technology that links patient records across data sources while preserving their privacy, as researchers attempt to understand the evolving COVID-19 virus and its variants.

The National Center for Advancing Translational Sciences within NIH launched the largest COVID-19 dataset in the U.S., the National COVID Cohort Collaborative (N3C) Data Enclave, in April. And now NCATS wants to use privacy-preserving record linkage (PPRL) to link data from its enclave with medical images, omics tools, electronic health records (EHRs), and social determinants of health to answer researchers’ lingering questions like why COVID-19 symptoms linger in some patients.

PPRL finds and links records on the same patient across independently maintained data sources using a cryptographic hash value to protect their identity.
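As a rough illustration of how that kind of linkage can work, the sketch below derives a keyed hash token from normalized identifiers at each site, and only those tokens are compared. The actual tokenization used by N3C’s vendors is more sophisticated and proprietary; the key handling, identifier fields and normalization here are simplified assumptions.

```python
# Simplified illustration of privacy-preserving record linkage via keyed hashing.
# The real N3C/Datavant tokenization is more involved; this shows only the core idea:
# sites exchange derived tokens, never raw identifiers.
import hashlib
import hmac

SHARED_KEY = b"distributed-out-of-band-to-participating-sites"  # hypothetical key

def pprl_token(first: str, last: str, dob: str) -> str:
    """Normalize identifiers, then derive a keyed hash token used for matching."""
    normalized = f"{first.strip().lower()}|{last.strip().lower()}|{dob}"
    return hmac.new(SHARED_KEY, normalized.encode(), hashlib.sha256).hexdigest()

# Each site tokenizes its own records locally...
site_a = {pprl_token("Ada", "Lovelace", "1990-01-02"): "site_a_record_17"}
site_b = {pprl_token("ADA ", "lovelace", "1990-01-02"): "site_b_record_42"}

# ...and the neutral linkage environment matches tokens without ever seeing names
# or birth dates. A shared token means the same patient appears at both sites.
linked_tokens = set(site_a) & set(site_b)
```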

“Combining the EHR data with prospective studies and COVID clinics is going to be really important to be able to follow people over time, do specific interventions and try to tease out the differences in these diseases,” Dr. Ken Gersing, director of informatics at NCATS, told FedScoop. “What we’re now calling ‘long COVID’ is surely a syndrome of groups of many different illnesses, rather than one particular illness.”

Multimodal analytics being implemented now will give researchers the ability to look at patient images with their lab results, but some of the data sources NCATS wants to link to the N3C Enclave are maintained by other agencies like the Centers for Medicare & Medicaid Services.

PPRL respects data ownership by temporarily linking datasets in a neutral, high-performance computing area long enough for researchers to complete their work. Duplicate information is eliminated in the process.

NCATS still has hurdles to clear before PPRL goes live, ideally in two to five months, Gersing said. PPRL needs to be financed, legal barriers must be navigated and there’s a question of how to truly de-identify data from omics tools.

NIH announced funding for its institutes and centers (ICs) to research long COVID using PPRL in late January, going so far as to contract with two vendors. Datavant is handling the PPRL technology, while Regenstrief Group agreed to serve as the honest data broker for matching records.

“We, as the holders of the data, don’t want to also be the linkage group for the patients’ benefit, for the institutions’ benefit and for our benefit also — that there’s no conflict of interest and for preserving privacy,” Gersing said.

Appointing a data broker further allows researchers to ask COVID-19 patients to participate in potential studies. Researchers flag hashes of interest for the broker, which has the local institution where they originated decrypt them for the purpose of reaching out. That way patient identities remain with local institutions alone.

About 1,900 researchers from nearly 300 institutions were working in the N3C Data Enclave, which contained data from about 800,000 COVID-19 patients as of March. ICs like the National Heart, Lung, and Blood Institute and the National Institute of Child Health and Human Development; agencies like the Food and Drug Administration and the Agency for Healthcare Research and Quality; and companies like Pfizer and IBM all use the enclave.

While generally these institutions consider each other competitors, NIH agreed to harmonize their datasets and make them available to all with rules against reselling, re-identifying, downloading and using for non-COVID research.

The N3C Data Enclave is a Palantir analytics platform with three subsets — synthetic, de-identified and limited datasets — that a Data Access Committee of federal officials may or may not grant researchers access to upon request.

Only the limited dataset, the hardest to obtain access to, contains true dates and ZIP codes. Meanwhile the synthetic dataset, the easiest to access, is a pilot in itself.

“If we can prove that the computer-generated data, modeled off of the limited dataset, is truly equivalent scientifically and privacy-wise, then there’s no reason this data can’t be shared across the world,” Gersing said. “Just put it out there as a file.”

NCATS paid for all the technical infrastructure, which normally researchers have to spend a portion of their grant money on, so they could focus on answering questions like: What medications alleviate COVID-19 symptoms better depending on case severity? And what variables can doctors use to predict how sick a hospital patient will likely get for resource and treatment planning purposes?

The Johnson & Johnson, Moderna and Pfizer vaccines have special RxNorm numbers in EHRs that will help N3C researchers study their efficacy over time.

NCATS’s data enclave is a Federal Risk and Authorization Management Program-certified environment that also requires dual authentication to access. The center’s security office monitors the enclave and also has an outside federal group run penetration tests, though it hasn’t really run into nefarious actors to date, Gersing said.

“If this data ever got out of the enclave, it would shut down a very valuable resource,” he said. “I’m not saying it’s job one, but it sure is close.”

Biden’s GSA administrator pick Robin Carnahan boasts strong tech credentials

President Joe Biden intends to make one of 2017’s “Top Women in Tech” the head of the General Services Administration, the White House announced Tuesday.

Robin Carnahan founded and led the state and local government practice at 18F, GSA‘s tech consultancy, from 2016 to 2020, having previously been Missouri’s secretary of state.

Most recently, Carnahan co-founded the State Software Collaborative as a fellow at Georgetown University’s Beeck Center.

While at GSA, Carnahan helped state and local governments improve their digital services while cutting costs. Her practice taught non-technical officials about IT risk management, procurement and modernization projects.

As Missouri’s secretary of state, Carnahan modernized online services related to both elections and securities for hundreds of thousands of customers. A Democrat, she also ran for one of Missouri’s Senate seats in 2010 but lost to Republican Roy Blunt.

Carnahan regularly testifies before Congress on government innovation, but Biden‘s nominee will still have to endure a Senate confirmation hearing before assuming the role of GSA administrator, which Katy Kale has been filling in an acting capacity.