White House proposes even more for Technology Modernization Fund

In its fiscal 2022 discretionary funding request on Friday, the White House asked Congress for an additional $500 million for the Technology Modernization Fund, citing agencies’ need to update and secure antiquated information systems.

That’s on top of a $110 million increase, to $2.1 billion, requested for the Cybersecurity and Infrastructure Security Agency and $750 million requested as a reserve for agencies’ IT enhancements.

The American Rescue Plan Act saw a record $1 billion injected into the TMF and $650 million appropriated to CISA in March, but countering the cybersecurity threats posed by China and Russia remains a priority for the Biden administration.

The General Services Administration will play a central role in managing the TMF and other governmentwide funding for IT modernization.

“These funds will allow GSA to support the administration’s efforts to tackle the climate crisis, promote economic opportunity and strengthen federal cybersecurity,” Acting Administrator Katy Kale said in a statement. “These critical investments will enhance support to federal agencies and the public, while making our nation’s infrastructure more secure and sustainable.”

GSA's emphasis will be on mission-critical systems and citizen-facing digital services in light of the COVID-19 pandemic and the SolarWinds hack that compromised products agencies use, according to the request.

The proposed increase for CISA would go toward enhanced cyber tools, hiring and support services. Another $20 million would be for a new Cyber Response and Recovery Fund.

Within the Department of Commerce, the White House asked for $916 million — a $128 million increase — in funding for the National Institute of Standards and Technology. That money would expand NIST’s research into computing, cybersecurity, artificial intelligence and quantum information science (QIS). Another $39 million was requested for spectrum-sharing research benefiting broadband and 5G deployment.

Additionally, the White House proposed a $1.7 billion increase, to $10.2 billion, for the National Science Foundation to, in part, establish a new directorate for prioritizing practical applications of AI, high-performance computing, QIS, robotics, advanced communications, and cybersecurity research.

“NSF stands ready to maximize the impact of this increase in funding and tackle critical challenges to bolster the U.S. economy and our leadership in critical and emerging areas of research and technological advancements,” said an agency spokesperson in a statement.

The White House request also asks for $4.8 billion for the Department of Veterans Affairs' Office of Information Technology in support of cloud modernization and $2.7 billion for the department’s ongoing Electronic Health Record modernization.

Discretionary funding is but one part of the president’s overall proposed budget request, which the Office of Management and Budget intends to release in the coming month. The discretionary request doesn’t include mandatory proposals or tax reforms, and Congress will decide which of the president’s proposals to fund.

“This year’s appropriations process comes at a particularly important moment,” said one administration official on a call with reporters. “[Over] the past decade, due to overly restrictive budget caps, our country has underinvested in core public services, benefits and protections that are incredibly important to our success.”

Three recommendations to secure a hybrid workforce

Kurt Steege, chief technology officer at ThunderCat, and Peter Romness, cybersecurity principal at Cisco, together bring decades of experience advising IT leaders in the U.S. government.

The pandemic proved to agency leaders that they can offer a more flexible work arrangement for government workers. But securing a remote and hybrid work environment for today — and tomorrow — requires greater attention to a holistic security strategy.

Flexibility built into both policies and the underlying IT infrastructure is one way that CIOs and CISOs can accommodate a new way of working. Agency leaders should aim for a near-seamless and equitable work experience — whether employees connect from home or from the office.

The good news is that thanks to the investments many agencies made to use cloud infrastructure, IT leaders are now in a position to take advantage of more effective cloud security capabilities around data. That includes identity and access controls that can reduce agencies’ overall security risks in the years ahead.

Smart cloud decisions yesterday make today’s response possible

The immediate need during the pandemic was to adjust IT systems so that employees could work productively at home. Secondary to that, agency IT departments needed to make certain those systems were secure. Unfortunately, the traditional “checkbox approach” to securing systems is no longer enough to lessen the level of cyber risk agencies face today.

Building a holistic security strategy takes both time and money, both of which are in short supply for agencies.

The bright side is that the Cloud First and Cloud Smart policies set by the last two administrations have paid off in big ways. In fact, the most notable successes in facilitating the mission during the pandemic have come from organizations that have been leveraging their cloud investments.

The biggest change making a difference in security — more than any other security practice — is when organizations use cloud tools to implement dynamic, persona-based policies that control access to agency resources. That not only improves security. It also improves the user experience, by allowing people to view content in a way that helps them do their jobs — regardless of location — without jumping through a variety of security hoops.

To achieve those improvements, though, requires visibility across the network. From a data security standpoint, that means understanding where your data is, how it is being used and accessed, how the network behaves and what policies have been built.
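The dynamic, persona-based access policies described above can be sketched as a small decision function: instead of a single allow/deny checkbox, the decision weighs who is asking, from where, on what device and for what data. This is a minimal illustration under invented rules; the personas, attributes and outcomes are hypothetical, not an actual Cisco or agency policy engine.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    persona: str                # e.g. "analyst", "contractor" (invented personas)
    location: str               # "office" or "home"
    device_managed: bool        # is the device under agency management?
    resource_sensitivity: str   # "low", "moderate" or "high" (invented tiers)

def decide(req: AccessRequest) -> str:
    """Return 'allow', 'step-up' (require extra verification) or 'deny'."""
    # Sensitive data never flows to unmanaged devices.
    if req.resource_sensitivity == "high" and not req.device_managed:
        return "deny"
    # Off-site access to non-public data triggers step-up verification,
    # so the same content stays reachable without a blanket block.
    if req.location != "office" and req.resource_sensitivity != "low":
        return "step-up"
    return "allow"

print(decide(AccessRequest("analyst", "home", True, "moderate")))   # step-up
print(decide(AccessRequest("analyst", "office", True, "moderate"))) # allow
```

The point of the pattern is that the same user gets a smooth experience in the office and a proportionate extra check at home, rather than an outright denial.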

Investing in security for hybrid work environments

The future of work is poised to look very different across both the government and private sectors now that leaders and employees alike have experienced the benefits of a flexible work environment.

One of the discussions we have been a part of with some of our customers is a thoughtful transition to a “30-40-30” office-home work model: 30% of an organization’s staff may never return to the office; 40% may go back to the office a few days a week; and the remaining 30% would most likely work full-time at the office.

To secure this new work model, our first recommendation involves matching policies with existing use cases. Even before you look at the security tools you plan to use, weaving together policies regarding identity and data will make the whole system run more smoothly and securely.

Our next recommendation — and often a sticking point when managing data security — is understanding appropriate levels of security classification and sensitivity. For agencies that work in a more classified or sensitive area, it’s easy to just classify everything the same. But it’s also important to look at the long-term needs of users. The good news is, dynamic policies make it easy to adjust the data classification to be more variable, depending on the user and type of data.

That ties into our third recommendation, which is identifying what you have. A lot of organizations don’t know where to start in this endeavor. They don’t know what data they have or where it is; they often don’t even know all the devices in their environment or what those devices are doing. Having an accurate inventory really matters.

The value of working with strong partners

While on the surface these recommendations may seem simple, the complexity of agencies’ enterprise networks brings a lot of challenges. That is why we promote working with a strong integration partner to get the most from your existing security investments and lessen the burden of acquiring new tools.

The partnership between ThunderCat Technology and Cisco offers a great resource for agencies to integrate and automate Cisco’s security tools across agency networks because ThunderCat Technology has built a practice around Cisco’s suite of solutions.

Cisco brings a full range of tools that provide the strongest levels of visibility, flexibility and security. ThunderCat Technology, meanwhile, understands all the components operating across an agency’s systems and can serve as a knowledgeable advisor on how best to develop a holistic security strategy across multiple vendor partners so everything works together.

Learn more about how ThunderCat Technology and Cisco can help your organization integrate a holistic security strategy.

White House asks for $5B to fund VA IT in 2022

The White House released a discretionary budget outline Friday that asks Congress to appropriate $4.8 billion for the Department of Veterans Affairs' Office of Information and Technology for fiscal 2022.

The $4.8 billion top-line IT number is just shy of the enacted $4.9 billion given by Congress last year, which doesn’t include emergency funding made available to the VA to account for telemedicine and telework needs during the pandemic.

Separate from the OIT budget, the White House has also asked for $2.7 billion for the continued modernization of VA’s electronic health record, a 10-year project that could cost north of $16 billion before it’s all said and done.

“The funding request invests in the core foundations of our country’s strength and advances key U.S. Department of Veterans Affairs (VA) priorities, including addressing Veteran homelessness, suicide prevention, caregiver support, and modernizing information technology systems to enhance customer service experience and ensure Veterans receive world-class health care,” VA Secretary Denis McDonough said in a statement following the budget proposal release.

In total, the VA is requesting $113.1 billion in discretionary funding, an $8.5 billion or 8.2% increase from the fiscal 2021 enacted level, according to the White House.

Congress ended up giving the VA more than it asked for in fiscal 2021, and we’ll have to wait and see if appropriators will be as generous this time given the record spending the government has already undertaken in light of the pandemic.

Leaders on Capitol Hill and at VA have expressed concerns about cost overruns on the EHR program, which is funded separately from the IT budget. Secretary McDonough told Congress in March that he saw higher than anticipated staff needs during the initial launch of the program, which may result in higher costs than the $16 billion originally expected.

The VA received $2.6 billion in the enacted fiscal 2021 budget for the EHR program.

Spending on the new cloud-based medical records system was supposed to peak early in the rollout. The program is built on Cerner’s Millennium software system and will eventually be interoperable with a similar system being rolled out in military medical centers.

Agencies gain ‘momentum’ appointing Evidence Act leadership

Government has seen “momentum” around evidence-based policymaking, with the majority of agencies having placed senior officials in charge of advancing data-driven decision making, according to the Evidence Team lead at the Office of Management and Budget.

All agencies submitted their interim learning agendas, first annual evaluation plans and interim capacity assessments in September, as mandated by OMB guidance stemming from the Foundations for Evidence-Based Policymaking Act, said Diana Epstein.

More than two years after the passage of the Evidence Act, relatively few agencies lack the leadership needed to implement its requirements.

“For the most part agencies have named their designated officials: the evaluation officers, the statistical officials and the chief data officers,” Epstein said, during a Data Foundation event Thursday. “The councils for each of these officials have been meeting regularly, and we’ve had some great cross-council collaboration.”

The Evaluation Officer Council meets monthly and works regularly with the Federal CDO Council and Performance Improvement Council. Meanwhile, Epstein’s team and the Office of Evaluation Sciences within the General Services Administration hold a monthly Evaluation and Evidence Training Series for hundreds of federal employees. The Interagency Council on Evaluation Policy was also rebooted and expanded.

OMB provided detailed feedback on agencies’ draft documents, and some have already published their evaluation plans on their websites as required.

“The last thing we want is for this to be yet another compliance or reporting exercise where agencies just put in minimal effort, check the boxes and nothing really changes,” Epstein said.

Agencies are expected to submit their first full learning agendas — identifying priority questions about programs, policies and regulations that can be answered with data — and capacity assessments as part of their strategic plans next fall. That’s on top of their fiscal 2023 evaluation plans. 

The Biden administration recently reaffirmed government’s commitment to evidence-based policymaking with its Memo on Restoring Trust in Government, which will see OMB release additional guidance in the coming months.

“We still have a long way to go,” Epstein said. “But it’s very exciting to see all the progress that we’re making collectively.”

Final CMMC rule expected to be finished in about a month

The final Defense Federal Acquisition Regulation Supplement (DFARS) rule that will require all contractors to have third-party inspections of their networks prior to working with the Department of Defense will get its final tweaks within the next 30-40 days, the program’s lead official said Thursday.

The interim final rule for the Cybersecurity Maturity Model Certification (CMMC) that was published in September received many comments from industry that the DOD has been working to adjudicate, said Katie Arrington, the department’s chief information security officer for acquisition and sustainment. She said the team is working to make the rule “go final” in about a month.

“You shouldn’t be waiting to build [cybersecurity] costs in” to rates, Arrington said to contractors during a Deltek webinar.

The interim final rule put CMMC into effect in December but had an open comment period for industry to give feedback to the government. As the CMMC program management office works through feedback, it has been tweaking the rule.

Issuing an interim final rule is not the norm but was needed because of the importance of securing industrial base contractors, Arrington said. CMMC is the department’s latest attempt to secure the industrial base’s cybersecurity, which has been vulnerable to massive data breaches of government information down the supply chain.

One of the biggest questions about the rule has been about reciprocity between CMMC and other federal cyber compliance programs. Arrington didn’t say what reciprocity may be coming but said that there will be guidance in CMMC Assessment Guides the DOD is working on.

There are other parts of the CMMC DFARS rule that will impact contractors before they are required to get an assessment. They now need to submit a self-assessment of their cyber compliance to the DOD, according to the rule. That process is separate from the CMMC assessment but could help companies prepare for their inspection by giving themselves a test first.

“The only thing they need to wait for is for the assessor to be aligned with the [third party assessment organizations],” Arrington said. No organization has been fully cleared yet to give assessments.

MetTel becomes latest EIS vendor to receive managed security services authorization

The government gave MetTel permission to provide Trusted Internet Connections 3.0-compliant managed security services through the Enterprise Infrastructure Solutions contract, the telecommunications company announced Thursday.

Together, the General Services Administration and Cybersecurity and Infrastructure Security Agency granted MetTel the authority to operate (ATO) its Managed Trusted Internet Protocol Service (MTIPS).

MTIPS secures agencies’ internet traffic by reducing the number of connections needed, shrinking the .gov’s attack surface while making traffic easier to monitor.

“GSA recognized the MetTel team’s capability to provide MTIPS security services with a modern and modular structure that will provide new benefits for agencies,” said Robert Dapkiewicz, senior vice president and general manager of MetTel Federal, in the announcement. “The constant attempt by cybercriminals to penetrate government websites is showing no sign of slowing down.”

MTIPS covers additional cyber services like continuous monitoring, which in MetTel’s case will be provided by its EIS security partner Raytheon. Other services include network intrusion detection, hosted Domain Name System sink holing, and email scanning and filtering — all of which send real-time data to a security operations center.

One of eight primes on the government’s $50 billion EIS contract for telecom and network modernization, MetTel is the only non-incumbent local exchange carrier to build its own MTIPS infrastructure.

Only three other EIS primes — AT&T, Lumen and Verizon — have completed the TIC Assessment and Authorization Process to receive their MTIPS ATOs. BT Federal, Core Technologies, Defined Technologies and Granite Telecommunications were awarded MTIPS through EIS but have yet to complete the process.

Military-wide data requirements document coming soon, Joint Chiefs’ Hyten says

The nation’s No. 2 general said Wednesday that by the end of spring the military will get a “strategic directive” defining data requirements that will lay the foundation for how the Department of Defense will use data at scale.

The new document will define several technical requirements for networks and data standardization that will be used to implement a common data architecture across the force, said Gen. John Hyten, vice chairman of the Joint Chiefs of Staff. Developed by the Joint Requirements Oversight Council (JROC), which Hyten leads, the document will define data requirements for all the services with the hope of enabling the type of rapid data-sharing and processing needed to field modern concepts of operations and artificial intelligence-enabled warfare.

“We have a chance to actually stay ahead of our adversary…to dominate data,” Hyten said during the 5G Tech Summit hosted by AFCEA DC.

The “Information Advantage Strategic Directive,” as Hyten dubbed it, will be one of many critical documents coming out of the Joint Staff in the coming months that relate to Joint All Domain Command and Control (JADC2) — the overarching concept of operations where a military Internet of Things is born out of the ability to fuse data across the domains of military operations. The goal is to increase lethality by converging operations and fielding force-multiplying technologies like AI that will speed up decision making based on real-time data from the field.

“This is an unbelievably challenging process,” Hyten said of creating the data standards and military-wide requirements needed to enable JADC2 operations.

On top of the technical challenges, Hyten said he hopes the guidance will help overcome cultural and security barriers to data sharing. Classification levels have stymied the military’s ability to widely share data. Stringent security practices have become muscle memory for some, even when working with less sensitive data that doesn’t need high levels of security.

The requirements will also take into account many of the technical challenges operators face in the field. Rural outposts with limited connectivity can’t send massive packets of data. Hyten said those bandwidth challenges were top of mind when thinking about the requirements for enterprise networks.

“If we push these huge packages of data with 5G…then at the edge what data do we push?” he said, referring to how new 5G networks the military is experimenting with could allow for much more data to be transferred. “That is an unbelievably complicated problem.”

First TIC 3.0 use cases finalized

The first finalized Trusted Internet Connections 3.0 use cases helping agencies secure external connections to federal networks were released by the Cybersecurity and Infrastructure Security Agency on Wednesday.

The Traditional TIC Use Case details the “castle-and-moat” security architecture that most major agencies have used for a decade, while the Branch Office Use Case outlines networking directly to the cloud or an external trust zone — rather than directing internet traffic through a TIC access point or headquarters first.

CISA released draft versions of the two use cases in December 2019, but the November presidential election delayed final approval by the Federal Chief Information Security Officer Council until 2021.

TIC Program Manager Sean Connelly said zero-trust and partner research and development use cases might also come in 2021. And CISA already plans to release infrastructure-as-a-service (IaaS), software-as-a-service (SaaS), platform-as-a-service (PaaS), and email use cases at some point.

The remaining guidance rounds out CISA’s effort to support multiple architectures for securing agency networks as they increasingly move their data to the cloud and their users off premises during the COVID-19 pandemic.

A draft Remote User Use Case released in December replaced the Interim Telework Guidance that CISA issued in April 2020 in response to vendor requests for help aiding agencies with the pandemic surge in telework. A draft Volume 2 of the National Cybersecurity Protection System (NCPS) Cloud Interface Reference Architecture (NCIRA) was released at the same time, providing an index of common cloud telemetry reporting patterns and characteristics so agencies can send cloud-specific data to the NCPS cloud-based architecture.

Finalized versions of the initial TIC 3.0 core guidance — the Program Guidebook, Reference Architecture: Volume 1 and Security Capabilities Catalog — were released in July. The first two documents will be fairly static, while the catalog is a living document that adds capabilities and controls to use cases as they’re announced.

Why federal cybersecurity teams are prioritizing asset management

Jake Munroe is a product marketing manager at Axonius and has held various roles across the security space in consulting, marketing, and sales. Prior to joining the private sector, Munroe served as a Navy Intelligence Analyst with an extensive background in counterterrorism, cyber threat intelligence, and open-source intelligence investigations.

Before the pandemic, there was significant urgency to improve network visibility. Accommodating remote workers added to that urgency by introducing more threat vectors. This prompted federal agency leaders to think about security risks in new ways and to prepare for when the workforce returns to the office.

There are many variables to consider — among them, loosened bring-your-own-device policies. While these policy shifts were necessary to accommodate an expanded remote workforce, they’ve left agency IT leaders grappling with significant security gaps. What’s more, as those previously remote devices return to physical offices, they’ll perpetuate the massive visibility challenges CIOs and CISOs already face.

IT’s innately complex landscape can make it hard to answer basic questions about asset management. A new approach aims to solve three challenges federal security teams are facing: understanding what assets an agency has, identifying the security gaps associated with those assets, and taking action to enforce security policies.

Getting a credible asset and user inventory

Asset management is foundational to compliance, and there are many key regulations agencies are tasked with adhering to: the NIST Cybersecurity Framework, CIS 20, CDM and others. Even so, asset management is still a challenge for many agency IT teams, raising basic questions: What assets does the agency have? Where are they? Who is using them?

These are the questions agency IT teams want quick answers to, and yet so often struggle to resolve.

Traditionally, agency security teams are intimately familiar with security tools, and IT teams with asset management tools — but there’s limited crosstalk between these data sources. Additionally, traditional approaches to compiling an asset inventory are typically time-consuming and error-prone, often requiring manual input into spreadsheets listing the physical devices, software and licenses across various departments. As soon as an inventory is compiled, it quickly becomes obsolete.

CISA’s Continuous Diagnostics and Mitigation (CDM) program has been active for nearly a decade to help fortify cybersecurity of agency networks. Even so, many agencies are still struggling to enforce a holistic strategy around network visibility.

The cybersecurity asset management approach provides agencies with enhanced visibility into assets and issues, enabling them to achieve compliance with key regulations.

Axonius is a cybersecurity asset management platform that discovers all of the assets in an environment, and then helps agencies validate compliance and automate remediation. To do this, Axonius uses adapters, not agents or scanning, to connect to over 300 security and management tools, allowing users to collect and aggregate data from across the entire organization.

Discovering security gaps

At Axonius, there are typically three things we recommend to agencies aiming to improve their cybersecurity posture:

  1. Start by compiling and assessing their asset inventory
  2. Discover coverage gaps with the data collected
  3. Enforce security policies

But taking action on those steps can be challenging when agencies use traditional, manual approaches to aggregate an asset inventory. It’s hard to get a full picture of both users and devices across the various tools agencies own, especially when those tools don’t communicate with each other.

A platform that provides data aggregation across all IT security and management offerings can help agencies build and maintain an active asset management system. From there, agencies can identify coverage gaps and take steps to automate and enforce security policies.
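The aggregation idea can be sketched in a few lines: merge per-tool device records keyed on a shared identifier, then query the merged inventory for coverage gaps, such as managed devices with no endpoint agent reporting. The tool exports and field names below are invented for illustration; this is not the Axonius adapter API.

```python
# Hypothetical exports from three separate tools, keyed on a shared device ID.
mdm_export = [{"id": "dev-001", "host": "lt-alice"},
              {"id": "dev-002", "host": "lt-bob"}]
edr_export = [{"id": "dev-001", "agent_version": "7.2"}]
vuln_scan  = [{"id": "dev-002", "critical_findings": 3}]

def merge_inventories(*sources):
    """Fold every source's records into one inventory, merging on device ID."""
    inventory = {}
    for source in sources:
        for record in source:
            inventory.setdefault(record["id"], {}).update(record)
    return inventory

inventory = merge_inventories(mdm_export, edr_export, vuln_scan)

# Coverage gap: known devices with no EDR agent reporting.
missing_edr = [dev for dev, attrs in inventory.items()
               if "agent_version" not in attrs]
print(missing_edr)  # ['dev-002']
```

A real platform would also correlate conflicting identifiers and timestamps across sources; the simple key-based merge here stands in for that correlation step.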

Federal security teams are seeing a range of benefits from cybersecurity asset management.

Perhaps the most important benefit is one that can’t be quantified: Confidence. Visibility into assets and gaps gives security teams the tools they need to be more confident in their ability to comply with regulations and keep their agencies secure.

Take action to close security gaps

Because Axonius aggregates data from all available sources, it gives federal security and IT teams the ability to send alerts, perform search queries and enforce automated actions.

We’ve seen agencies use Axonius in a variety of ways.

The suddenly remote workforce only added to an already rapidly changing operating environment for federal IT and security teams. This complexity underscores the importance of combining asset management, endpoint security, vulnerability assessment and real-time enforcement in one view.

Learn more about why asset management matters for federal cybersecurity teams.

Government agencies lean into managed services to stay ahead of IT demands

Seismic shifts in technology solutions and a need for modern IT and security skills are prompting a sizeable portion of federal agencies to bring in managed service providers (MSPs) to advance their IT needs, according to a new survey of federal agency IT, business and program executives.

Two in three respondents in the survey, conducted by FedScoop and underwritten by GDIT, say their agency is currently using, planning to use or considering retaining an MSP to support their IT work.

Though the reasons driving agencies to use MSPs vary, one key undercurrent is the pressing need for cost efficiencies, which leads agencies to rely on MSPs with experience deploying cloud, data and cybersecurity solutions. At the same time, the reasons to work with MSPs have grown more nuanced in recent years.

There are a number of factors motivating agencies to take a fresh look at MSPs. Reducing long-term IT costs ranks highest, cited by six in 10 respondents. But four in 10 respondents also cite the importance of focusing more attention on mission and less on operations as a driving force. A third of respondents say the need to gain greater operational agility and automation, and to reduce risk, are also big motivators.

Executives in the survey identify a variety of core areas where MSPs provide value. Most respondents (65%) point to staffing and support, but 54% cite security and 44% mention compliance as key areas where MSPs are filling critical support gaps. Another four in 10 respondents say the need for training and change management, and for supporting continuous improvement and innovation, are core areas where MSPs make a difference.

The types of specific managed IT services demanded by agencies are also in flux.

Asked which IT services agencies had handed over to an MSP over the past five years, help desk services ranks at the top of the list. Over the next five years, in contrast, agencies will look to MSPs most for help with cloud infrastructure services, followed by backup and recovery, cybersecurity, data analytics, networks/infrastructure and software development.

Growing importance of MSPs

Managed service providers have historically been viewed by federal agencies through the lens of how well they reduce total IT costs while still allowing agencies to control their IT outcomes. The findings suggest that MSPs are likely to play an increasingly important role in helping agencies manage their overall IT infrastructure, operations and security, much as commercial enterprises now rely on MSPs to keep their IT up to date.

Given the mission benefits of accelerating IT modernization and improving service delivery, the study findings show that agencies are coming to recognize that even if moving to an MSP is cost-neutral, or costs slightly more, it may be well worth it.

Based on the completed responses of 162 pre-qualified federal agency executives with mission, business or IT decision-making responsibilities, the study also explores which factors of success are most important in considering an MSP.

Technical expertise and experience, the ability to fully assess and support an agency’s specific needs, and predictable costs are all seen as key factors in successful MSPs. So are the transparency of service level agreements and the ability to adopt or integrate new and emerging technologies downstream once an agency contracts with an MSP.

Managed service models are also continuing to evolve. Rob Smallwood, vice president of digital modernization at General Dynamics Information Technology, a leading MSP in the federal market, suggests it’s important to look for MSPs with comprehensive expertise at every level of the IT stack. However, agencies should also look for MSPs that are flexible enough to provide selective, augmented or hybrid support.

“Managed services is about outcomes — what benefits to their enterprise the agency wants to achieve. By focusing on outcome-based managed services, the provider can enable the organization to ultimately achieve what they’re looking for, like cost savings, improved operations, increased value and freeing agency personnel to focus on their mission instead of the IT,” Smallwood said. “That’s also why flexible or hybrid models are preferred by agencies, because they’re working together with the provider to achieve those outcomes.”

The study also concludes that agencies should take steps to understand and fully identify many of the associated support costs that often aren’t transparent in their IT budgets before comparing proposals from MSPs.

Download the full report, “Managed Services: Powering Federal IT” for the detailed findings.

Join GDIT experts as they discuss the study findings and the journey to managed services in a virtual webinar on May 11: Guided Journey to Outcome-Based Managed Services

This article was produced by FedScoop and underwritten by GDIT.