Why federal cybersecurity teams are prioritizing asset management

Jake Munroe is a product marketing manager at Axonius and has held various roles across the security space in consulting, marketing, and sales. Prior to joining the private sector, Munroe served as a Navy Intelligence Analyst with an extensive background in counterterrorism, cyber threat intelligence, and open-source intelligence investigations.

Jake Munroe, Product Marketing Manager, Axonius

Before the pandemic, there was significant urgency to improve network visibility. Accommodating remote workers added to that urgency by introducing more threat vectors. This prompted federal agency leaders to think about security risks in new ways and to prepare for when the workforce returns to the office.

There are many variables to consider — among them, loosened bring-your-own-device policies. While these policy shifts were necessary to accommodate an expanded remote workforce, they’ve left agency IT leaders grappling with significant security gaps. What’s more, as those previously remote devices return to physical offices, they’ll perpetuate the massive visibility challenges CIOs and CISOs already face.

IT’s innately complex landscape can make it hard to answer basic questions about asset management. A new approach aims to solve three challenges federal security teams are facing: understanding what assets an agency has, identifying the security gaps associated with those assets, and taking action to enforce security policies.

Getting a credible asset and user inventory

Asset management is foundational to compliance, and agencies are tasked with adhering to many key regulations: the NIST Cybersecurity Framework, the CIS 20 controls, CDM and others. Even so, asset management is still a challenge for many agency IT teams, raising basic questions about which assets exist, who uses them and how they're secured.

These are the questions agency IT teams want quick answers to, and yet so often are challenged to resolve.

Traditionally, agency security teams are intimately familiar with security tools, and IT teams with asset management tools, but there's limited crosstalk between these data sources. Additionally, traditional approaches to compiling an asset inventory are typically time-consuming and error-prone, often requiring manual input into spreadsheets listing the physical devices, software and licenses across various departments. As soon as an inventory is compiled, it quickly becomes obsolete.

CISA’s Continuous Diagnostics and Mitigation (CDM) program has been active for nearly a decade to help fortify cybersecurity of agency networks. Even so, many agencies are still struggling to enforce a holistic strategy around network visibility.

The cybersecurity asset management approach gives agencies enhanced visibility into assets and issues, enabling them to achieve compliance with key regulations.

Axonius is a cybersecurity asset management platform that discovers all of the assets in an environment, and then helps agencies validate compliance and automate remediation. To do this, Axonius uses adapters, not agents or scanning, to connect to over 300 security and management tools, allowing users to collect and aggregate data from across the entire organization.

Discovering security gaps

At Axonius, there are typically three things we recommend to agencies aiming to improve their cybersecurity posture:

  1. Start by compiling and assessing their asset inventory
  2. Discover coverage gaps with the data collected
  3. Enforce security policies

But taking action on those steps can be challenging when agencies use traditional, manual approaches to aggregate an asset inventory. It's hard to get a full picture of both users and devices across the various tools agencies own, especially when those tools don't communicate with each other.

A platform that provides data aggregation across all IT security and management offerings can help agencies build and maintain an active asset management system. From there, agencies can identify coverage gaps and take steps to automate and enforce security policies.
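At its simplest, this kind of coverage-gap discovery is set arithmetic over the inventories each tool reports. The sketch below is purely illustrative; the tool categories, device names and counts are invented and do not reflect any vendor's actual API. It aggregates every source into one unified inventory, then diffs each tool's view against it.

```python
# Hypothetical inventories exported from three tools that don't talk to
# each other. Device names are invented for illustration.
ad_assets = {"LAPTOP-01", "LAPTOP-02", "SERVER-01"}          # directory service
edr_assets = {"LAPTOP-01", "SERVER-01"}                      # endpoint agent
vuln_scan_assets = {"LAPTOP-02", "SERVER-01", "PRINTER-07"}  # network scanner

# Step 1: aggregate a unified inventory across every data source.
all_assets = ad_assets | edr_assets | vuln_scan_assets

# Step 2: coverage gaps fall out as set differences against the
# unified view -- each one is a policy violation to remediate.
missing_edr = sorted(all_assets - edr_assets)          # no endpoint agent
never_scanned = sorted(all_assets - vuln_scan_assets)  # never vuln-scanned
unknown_to_ad = sorted(all_assets - ad_assets)         # not in the directory

print("Missing EDR agent:", missing_edr)
print("Never scanned:", never_scanned)
print("Unknown to directory:", unknown_to_ad)
```

In practice, a platform must also correlate records that refer to the same device under different names (hostname versus MAC address versus serial number), which is where most of the real complexity lives.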

Federal security teams are seeing a range of benefits from cybersecurity asset management.

Perhaps the most important benefit is one that can’t be quantified: Confidence. Visibility into assets and gaps gives security teams the tools they need to be more confident in their ability to comply with regulations and keep their agencies secure.

Take action to close security gaps

Because Axonius aggregates data from all available sources, it gives federal security and IT teams the ability to send alerts, run search queries and enforce automated actions.

We've seen agencies put Axonius to use in a variety of ways, from alerting on unmanaged devices to automating policy enforcement.

The suddenly remote workforce only added to an already rapidly changing operating environment for federal IT and security teams. This complexity underscores the importance of combining asset management, endpoint security, vulnerability assessment and real-time enforcement in one view.

Learn more about why asset management matters for federal cybersecurity teams.

Government agencies lean into managed services to stay ahead of IT demands

Seismic shifts in technology solutions and a need for modern IT and security skills are prompting a sizeable portion of federal agencies to bring in managed service providers (MSPs) to help meet their IT needs, according to a new survey of federal agency IT, business and program executives.

Two in three respondents in the survey, conducted by FedScoop and underwritten by GDIT, say their agency is currently using, planning to use or considering retaining an MSP to support their IT work.


Read the full report.

Though the reasons driving agencies to use MSPs vary, one key undercurrent is the pressing need for cost efficiencies, which leads agencies to rely on MSPs with experience deploying cloud, data and cybersecurity solutions. At the same time, the reasons to work with MSPs have grown more nuanced in recent years.

There are a number of factors motivating agencies to take a fresh look at MSPs. Reducing long-term IT costs ranks highest, cited by six in 10 respondents. But four in 10 respondents also cite the importance of focusing more attention on mission and less on operations as a driving force. A third of respondents indicate that the need to gain greater operational agility and automation, and to reduce risks, is also a big motivator.

Executives in the survey identify a variety of core areas where MSPs provide value. Most respondents (65%) point to staffing and support, but 54% cite security and 44% mention compliance as key areas where MSPs are filling critical support gaps. Another four in 10 respondents say the need for training and change management, and for supporting continuous improvement and innovation, are core areas where MSPs make a difference.

The types of specific managed IT services demanded by agencies are also in flux.

Asked which IT services agencies had handed over to an MSP over the past five years, help desk services ranks at the top of the list. Over the next five years, in contrast, agencies will look to MSPs most for help with cloud infrastructure services, followed by backup and recovery, cybersecurity, data analytics, networks/infrastructure and software development.

Growing importance of MSPs

Managed service providers have historically been viewed by federal agencies through the lens of how well they reduce total IT costs, while still allowing agencies to control their IT outcomes. The findings suggest that MSPs are likely to play an increasingly important role in helping agencies manage their overall IT infrastructure, operations and security, much as commercial enterprises now rely on MSPs to keep their IT up to date.

Given the mission benefits of accelerating IT modernization and improving service delivery, the study findings show that agencies are coming to recognize that even if moving to an MSP is cost-neutral, or costs slightly more, it may be well worth it.

Based on the completed responses of 162 pre-qualified federal agency executives with mission, business or IT decision-making responsibilities, the study also explores which factors of success are most important in considering an MSP.

Technical expertise and experience, the ability to fully assess and support an agency’s specific needs and predictable costs are all seen as key factors in successful MSPs. So are the transparency of service level agreements and the ability to adopt or integrate new and emerging technologies downstream once an agency contracts with an MSP.

Managed service models are also continuing to evolve. Rob Smallwood, Vice President of Digital Modernization at General Dynamics Information Technology, a leading MSP in the federal market, suggests it's important to look for MSPs with comprehensive expertise at every level of the IT stack. However, agencies should also look for MSPs that are flexible enough to provide selective, augmented or hybrid support.

“Managed services is about outcomes — what benefits to their enterprise the agency wants to achieve. By focusing on outcome-based managed services, the provider can enable the organization to ultimately achieve what they’re looking for, like cost savings, improved operations, increased value and freeing agency personnel to focus on their mission instead of the IT,” Smallwood said. “That’s also why flexible or hybrid models are preferred by agencies, because they’re working together with the provider to achieve those outcomes.”

The study also concludes that agencies should take steps to understand and fully identify many of the associated support costs that often aren’t transparent in their IT budgets before comparing proposals from MSPs.

Download the full report, “Managed Services: Powering Federal IT” for the detailed findings.

Join GDIT experts as they discuss the study findings and the journey to managed services in a virtual webinar on May 11: Guided Journey to Outcome-Based Managed Services

This article was produced by FedScoop and underwritten by GDIT.

Navy planning 4 major tests for network integration in 2021

The Navy plans to test its network integration capabilities — a key pillar of its modernization plan — at least four times this year, the chief of naval operations said Wednesday.

Adm. Mike Gilday described the tests as “big spirals” where siloed networks are combined into a “network-of-networks” operation to give the service an Internet of Things-like capability.

These tests — part of the wider military’s overarching Joint All Domain Command and Control (JADC2) concept — won’t be the first the Navy has undertaken. But Gilday said they will allow the Navy to put more data through its systems and unify more platforms with new applications at a greater scale.

The tests will “allow us to bring more networks into that network-of-networks construct,” Gilday said during a Center for a New American Security event. “So that’s testing more data on more networks and introducing more battle management aids to put the end-user in a position where they can see the battlespace better.”

Network integration is one of four core pillars in the Navy’s approach to JADC2, along with agile software development, common data standards and battle management applications. All of these efforts are hosted under Project Overmatch, the Navy’s JADC2 implementation program.

“The reason why this is so important is, first off, we need to maintain decision advantage over the adversary,” he said of the tests and overall JADC2 strategy.

Announcing the tests beforehand is new for the Navy, which has been tighter-lipped about its progress with JADC2 implementation than the other military services. Previous major tests that the Navy hosted within the service and in joint operations with other parts of the military have come to light after the fact.

Another recent change is that Project Overmatch got a new boss, with Rear Adm. Douglas Small leading a “robust” team of “technically savvy” civilians, Gilday said. Part of a reorganization, this gives Small added authorities and more central control of implementing the highly technical changes to network operations and data standards.

The team recently launched a battle management application on the USS Carl Vinson that Gilday praised as the first of many to help commanders interact and see data more clearly.

“The applications that we are applying now are much like the applications in your phone,” he said.

NIH’s COVID-19 data enclave continues to evolve with the virus

The National Institutes of Health is prototyping technology that links patient records across data sources while preserving patient privacy, as researchers attempt to understand the evolving COVID-19 virus and its variants.

The National Center for Advancing Translational Sciences within NIH launched the largest COVID-19 dataset in the U.S., the National COVID Cohort Collaborative (N3C) Data Enclave, in April. And now NCATS wants to use privacy-preserving record linkage (PPRL) to link data from its enclave with medical images, omics tools, electronic health records (EHRs) and social determinants of health to answer researchers' open questions, like why COVID-19 symptoms linger in some patients.

PPRL finds and links records on the same patient across independently maintained data sources using a cryptographic hash value to protect their identity.
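As a rough illustration of the idea, and not NCATS's or its vendors' actual implementation, a keyed hash over normalized identifiers lets two institutions derive matching tokens for the same patient without ever exchanging names. The key, identifiers and record contents below are all invented; production PPRL systems add safeguards such as Bloom-filter encodings and fuzzy matching for typos.

```python
import hashlib
import hmac

# Shared secret key distributed to participating institutions
# (hypothetical value for illustration only).
LINKAGE_KEY = b"example-shared-secret"

def linkage_token(first: str, last: str, dob: str) -> str:
    """Derive a privacy-preserving token from normalized identifiers.

    HMAC-SHA256 can't be reversed to reveal the identifiers, yet the
    same patient yields the same token at every institution holding
    the key -- even when capitalization or whitespace differ.
    """
    normalized = f"{first.strip().lower()}|{last.strip().lower()}|{dob}"
    return hmac.new(LINKAGE_KEY, normalized.encode(), hashlib.sha256).hexdigest()

# Two independently maintained sources submit only tokens plus payloads.
ehr_records = {linkage_token("Ada", "Lovelace", "1815-12-10"): {"dx": "long COVID"}}
imaging_records = {linkage_token("ada", "lovelace ", "1815-12-10"): {"scan": "chest CT"}}

# An honest broker links records whose tokens match, never seeing names.
linked = {
    token: {**ehr_records[token], **imaging_records[token]}
    for token in ehr_records.keys() & imaging_records.keys()
}
```

A plain, unkeyed hash would be vulnerable to dictionary attacks on known names and birthdates, which is why a secret key, held away from the data holders, is essential to the scheme.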

“Combining the EHR data with prospective studies and COVID clinics is going to be really important to be able to follow people over time, do specific interventions and try to tease out the differences in these diseases,” Dr. Ken Gersing, director of informatics at NCATS, told FedScoop. “What we’re now calling ‘long COVID’ is surely a syndrome of groups of many different illnesses, rather than one particular illness.”

Multimodal analytics being implemented now will give researchers the ability to look at patient images with their lab results, but some of the data sources NCATS wants to link to the N3C Enclave are maintained by other agencies like the Centers for Medicare & Medicaid Services.

PPRL respects data ownership by temporarily linking datasets in a neutral, high-performance computing area long enough for researchers to complete their work. Duplicate information is eliminated in the process.

NCATS still has hurdles to clear before PPRL goes live, ideally in two to five months, Gersing said. PPRL needs to be financed, legal barriers must be navigated and there’s a question of how to truly de-identify data from omics tools.

NIH announced funding for its institutes and centers (ICs) to research long COVID using PPRL in late January, going so far as to contract with two vendors. Datavant is handling the PPRL technology, while Regenstrief Group agreed to serve as the honest data broker for matching records.

“We, as the holders of the data, don’t want to also be the linkage group for the patients’ benefit, for the institutions’ benefit and for our benefit also — that there’s no conflict of interest and for preserving privacy,” Gersing said.

Appointing a data broker further allows researchers to ask COVID-19 patients to participate in potential studies. Researchers flag hashes of interest for the broker, which has the local institution where they originated decrypt them for the purpose of reaching out. That way, patient identities remain with local institutions alone.

About 1,900 researchers from nearly 300 institutions were working in the N3C Data Enclave, which contained data from about 800,000 COVID-19 patients as of March. ICs like the National Heart, Lung, and Blood Institute and the National Institute of Child Health and Human Development; agencies like the Food and Drug Administration and the Agency for Healthcare Research and Quality; and companies like Pfizer and IBM all use the enclave.

While generally these institutions consider each other competitors, NIH agreed to harmonize their datasets and make them available to all with rules against reselling, re-identifying, downloading and using for non-COVID research.

The N3C Data Enclave is a Palantir analytics platform with three subsets — synthetic, de-identified and limited datasets — that a Data Access Committee of federal officials may or may not grant researchers access to upon request.

Only the limited dataset, the hardest to obtain access to, contains true dates and ZIP codes. Meanwhile the synthetic dataset, the easiest to access, is a pilot in itself.

“If we can prove that the computer-generated data, modeled off of the limited dataset, is truly equivalent scientifically and privacy-wise, then there’s no reason this data can’t be shared across the world,” Gersing said. “Just put it out there as a file.”

NCATS paid for all the technical infrastructure, which normally researchers have to spend a portion of their grant money on, so they could focus on answering questions like: What medications alleviate COVID-19 symptoms better depending on case severity? And what variables can doctors use to predict how sick a hospital patient will likely get for resource and treatment planning purposes?

The Johnson & Johnson, Moderna and Pfizer vaccines have special RxNorm numbers in EHRs that will help N3C researchers study their efficacy over time.

NCATS's data enclave is a Federal Risk and Authorization Management Program-certified environment that also requires dual authentication to access. The center's security office monitors the enclave and also has an outside federal group run penetration tests, though it hasn't really run into nefarious actors to date, Gersing said.

“If this data ever got out of the enclave, it would shut down a very valuable resource,” he said. “I’m not saying it’s job one, but it sure is close.”

Biden’s GSA administrator pick Robin Carnahan boasts strong tech credentials

President Joe Biden intends to make one of 2017’s “Top Women in Tech” the head of the General Services Administration, the White House announced Tuesday.

Robin Carnahan founded and led the state and local government practice at 18F, GSA‘s tech consultancy, from 2016 to 2020, having previously been Missouri’s secretary of state.

Most recently, Carnahan co-founded the State Software Collaborative as a fellow at Georgetown University’s Beeck Center.

While at GSA, Carnahan helped state and local governments improve their digital services while cutting costs. Her practice taught non-technical officials about IT risk management, procurement and modernization projects.

As Missouri's secretary of state, Carnahan modernized online services for hundreds of thousands of customers related to both elections and securities. A Democrat, she also ran for one of Missouri's Senate seats in 2010 but lost to Republican Roy Blunt.

Carnahan regularly testifies before Congress on government innovation, but Biden‘s nominee will still have to endure a Senate confirmation hearing before assuming the role of GSA administrator, which Katy Kale has been filling in an acting capacity.

Top Air Force IT leader has ‘mixed feelings’ about CMMC

The Air Force’s chief information officer has concerns about how the Department of Defense’s new cyber standards for contractors could harm small businesses trying to enter the defense market.

Lauren Knausenberger worries that the strictness of the Cybersecurity Maturity Model Certification, a program that requires third-party verification to a range of security controls, will limit small innovative companies from working with DOD. While she supports the need for better cybersecurity standards for DOD’s IT supply chain, CMMC may not be the best way to do it, she said.

“I have mixed feelings on it personally,” she said during an America’s Future Series webinar. “I think if we lock it down so that we are not going to do business with certain people because they don’t meet [CMMC], I think that limits our options.”

CMMC is a five-tiered system to increase cybersecurity controls that is being phased into contracts over the next five years. Contractors will be required to hire an accredited assessor to verify they meet one of the five levels, a process that remains in development as assessors are being trained and overseen by an independent accreditation body.

Knausenberger is not directly involved in the CMMC program, which falls under the undersecretary of defense for acquisition and sustainment’s authority. But her job as the top IT official in the Air Force gives her significant insight into the department’s technology needs and the potential impacts of barring some companies from its supply chain. She also was an investor and entrepreneur in the private sector before joining government, giving her insight into the challenges that may arise for tech companies.

For small companies hoping to work with the military, the costs of CMMC consultants, meeting the model’s security requirements and the fee for an assessor could be prohibitive. And if they fail to meet the CMMC level defined in a contract, the door to that opportunity is then shut.

“I would rather just say, ‘Hey let’s just give you some endpoint requirements,'” Knausenberger said.

While CMMC is all about the maturity of networks, Knausenberger said having some end-point security requirements and virtual means to connect into the department’s secure networks would likely cover necessary security needs.

“I don’t really care a whole lot about the other pieces” of the maturity model, she said.

Accreditation Body makes new industry council

Also Tuesday, the CMMC Accreditation Body, the organization in charge of accrediting assessors and managing implementation of CMMC, announced a new industry advisory council. A group of a dozen industry executives will provide a "crucible for industry dialogue" on how CMMC will impact them, the group said in a news release.

Most of the members come from large defense contractors, like BAE Systems, Amazon Web Services and Accenture. One member, Nicole Dean, is a former board member.

“[J]ust like the volunteer professionals in the AB, the IAC volunteers have chosen to serve a higher cause,” CMMC-AB Board Chair Karlton Johnson said in a statement. “Their leadership, skill, and professional expertise will greatly contribute to the overall success of the CMMC program.”

The council mirrors previous groups the AB had during its initial creation. The volunteer board members led “working groups” of other volunteers from industry who worked on specific parts of CMMC implementation. The AB is looking for more volunteer members for the council to fill a diversity of perspectives, it said.

Joint Base San Antonio to focus on 5G for telemedicine

The Department of Defense is expanding its 5G technology experimentation to focus on medical capabilities, recent contracting documents show.

Joint Base San Antonio was selected last year as the Department of Defense's test site for medical advancements powered by 5G. The National Spectrum Consortium recently released a statement of work shedding light on the new capabilities the DOD hopes to achieve, such as real-time virtual medical support, enabling remote forces to connect medical devices and ensuring the security of new 5G networks carrying medical data.

“There is a lot of opportunity to drive efficiency,” Randy Clark, vice chair of the National Spectrum Consortium, told FedScoop. The DOD awarded the National Spectrum Consortium a $2.5 billion contract in December to facilitate the military’s 5G pilots with the consortium’s member companies.

The tests are a part of an overall strategy from the DOD to offer its military sites as real-world places to test the new tech that could help modernize both commercial 5G development and DOD operations. The government has invested hundreds of millions of dollars in the program as 5G has become a technology critical in the competition against China.

“5G is going to play a critical role in the third offset,” a military term for the third generation of advanced technologies that will provide military dominance to whichever country fields them first, Clark said. “This is a part of a much larger initiative.”

Other military bases around the country have similar arrangements with private companies offering new tech in looser regulatory environments. But most of those focus on logistics and general connectivity.

5G could benefit medical providers by connecting hospitals in real time over high-speed, ultra-wideband networks. With 5G, surgeons could get advice, even robotic assistance, from medical experts anywhere in the world, a capability currently limited by existing network capacity.

For the military, that could mean everything from more connected battlefield medicine and augmented reality training to digital twins of medical devices, Clark said.

“All of that wouldn’t necessarily take place without the investment,” Clark said of the hundreds of millions of dollars DOD is putting into its 5G testbeds.

Some of the specific enabling tech that the military wants to pilot in San Antonio includes artificial intelligence that can precisely modulate between radio wave frequencies, cybersecurity frameworks and alternative energy sources to keep powering the networks through blackouts. The testbeds are also generating new datasets for machine learning to extract information on when to set up networks.

“That is going to be disruptive in its own right,” Clark said of the power of combining AI with 5G.

Cybersecurity is a critical area of 5G research, Clark added. The new networks will require new practices to secure sensitive communications, especially if the military's medical information is being transmitted. He stressed the important role zero trust will play in securing 5G.

National Nuclear Security Administration awards $89.9M deal to Palantir for safety platform

The agency that maintains the U.S. nuclear weapons stockpile wants to allocate its employees and finances with safety in mind using a new data platform developed by Palantir.

The National Nuclear Security Administration awarded a five-year, $89.9 million contract to the Silicon Valley-based software company for a platform capable of measuring the health of its safety programs, Palantir announced Monday.

The platform will support NNSA’s Safety Analytics, Forecasting, and Evaluation Reporting (SAFER) project run out of its Office of Safety, Infrastructure, and Operations.

“Our work with NNSA illustrates Palantir’s mission to provide software to the world’s most important institutions in support of their most critical work,” said Akash Jain, president of Palantir USG. “We are excited to expand our work within the U.S. government and provide the NNSA with a high-tech solution to make the best possible use of its resources in support of the nation’s nuclear security missions.”

Palantir’s platform will integrate data across NNSA sites irrespective of the data or system type and will give the agency granular insight into safety metrics complete with visualizations.

The contract is Palantir’s first with NNSA.

While many of Palantir's recent federal contracts have been tied to COVID-19 pandemic response systems, namely HHS Protect and Tiberius, the company started in the defense and intelligence space. One of the first tech startups explicit in its desire to aid national security agencies, Palantir landed its first Space Force contract almost a year ago and an Army network modernization contract in November.

DIU’s Mike Brown is Biden’s pick to head DOD acquisition

Mike Brown, the director of the Defense Innovation Unit, is set to be the Biden administration’s pick to head the Department of Defense’s acquisition and sustainment enterprise.

The White House on Friday indicated President Joe Biden’s intent to nominate Brown as undersecretary of defense for acquisition and sustainment.

Brown comes from a long career leading technology companies in Silicon Valley before he was tapped to bridge the gap between the DOD and his old tech community at DIU in 2018.

His expected nomination was one of three the White House announced for the Pentagon Friday, including picks of Michael McCord to be DOD’s comptroller and Ronald Moultrie to be undersecretary of defense for intelligence and security. Secretary of Defense Lloyd Austin gave a strong recommendation for the three.

“Each of these individuals is talented, experienced and highly qualified for the critical national security roles they will, if confirmed, undertake on behalf of the Department,” Austin said. “Their deep experience in national security will prove essential in guiding our efforts to defend this nation and secure our interests around the world.”

It’s unclear who will replace Brown at DIU. Once officially nominated, he will need to receive Senate confirmation.

Brown’s job at DIU focused on rapid prototyping and acquisition, handling a couple billion dollars a year. But his new job would focus on acquisition programs at a much larger scale worth hundreds of billions of dollars and that are often the opposite of rapid. Brown’s nomination represents a potential sea change for the department by putting a former technology official at the helm of acquisition. Brown’s predecessor Ellen Lord was a former defense industry executive when she took on the role in 2017.

The former CEO of cybersecurity company Symantec, Brown would also oversee critical cybersecurity programs to secure the defense industrial base, like the Cybersecurity Maturity Model Certification (CMMC). The program is currently under an internal review.

Brown has also been influential in his thoughts on U.S. technology competition with China. He has frequently spoken on the think tank circuit about Chinese tech development and co-authored an influential paper about economic competition and civil-military fusion in tech.

“National security follows economic security and prosperity,” Brown once said.

With a new CEO, CMMC AB board will boost focus on strategy, chairman says

It’s a busy time to be in supply chain cybersecurity, especially for the board chairman of the Cybersecurity Maturity Model Certification (CMMC) Accreditation Body, Karlton Johnson.

At a time when the federal government is still reeling from the recent widespread SolarWinds hack, Johnson leads the volunteer organization charged with implementing the Department of Defense’s new CMMC standards for all defense contractors that many hope will stop the next pilferer of DOD data.

Now, Johnson’s leadership of the AB board is reaching a pivotal point: He is focused on hiring professional staff and transitioning what was a board of directors intimately involved in the day-to-day operations into one that can strategically guide a scaled organization.

In his first extended interview with FedScoop, Johnson said the board he leads will move from a body of “director do-ers” to become a “governing board.”

That means new faces on the board, new hires at the staff level and new ethics policies.

“I haven’t really seen the work changing significantly; actually I’d say it’s become more laser-focused,” Johnson said. “Especially bringing on the CEO.”

The board recently made one of its most important hires, bringing on Matthew Travis to be CEO of the AB. Johnson spoke highly of Travis, describing him as “sharp” and bringing necessary skillsets to the job. Travis is just the first major hire of many the AB wants to make in the coming weeks and months, filling out staff positions to carry out the massive undertaking before the organization, Johnson said.

“We are pretty excited because it’s a significant milestone,” he said of hiring Travis, who started last week. The most important part of the accreditation body’s developing role “is that professional staff we are bringing on,” Johnson said.

Johnson said Travis will take on some of the roles the chairman and other board directors currently fill, like managing the relationship with the CMMC Program Management Office and leading the daily operations of the organization.

The road ahead

The program the AB is implementing is DOD's latest attempt at securing its manifold IT supply chain from hackers. The CMMC model has five levels of cybersecurity strictness — with level one being the most basic and level five including hundreds of complex controls — that all contractors will need to be certified against or risk losing access to DOD contracts.

Raising the army of assessors needed to inspect all the networks of the 300,000 defense contractors will be the AB’s responsibility. Beyond just credentialing assessors and assessment companies, the AB will also license training and testing providers, give stamps of approval to consultants willing to pay and generally oversee the quality of the complex CMMC ecosystem.

“I am focused on delivering that capability; I am focused on taking it to the next level,” he said.

To deliver the CMMC “capability,” more work remains for the board and the new staff alike. While consultants abound, contractors still await fully licensed assessors and Certified Third-Party Assessment Organizations (C3PAOs) who will be able to actually certify a company. Although full implementation of CMMC requirements will be phased in slowly through fiscal 2026, there is concern in industry over a demand crunch where assessments take more time than anticipated and there aren’t enough assessors to fan out across the defense industrial base.

Johnson says he is confident in the AB’s ability to meet demand. The AB has trained about 100 provisional assessors and cleared roughly the same number of assessment organizations through its initial application screening. But much remains to be done to turn them into fully credentialed assessors, like DOD completing its own assessment of assessors through the Defense Industrial Base Cybersecurity Assessment Center (DIBCAC).

“We remain on target,” Johnson assured.

Johnson was reluctant to disclose current timelines or estimates the AB is using to determine what that target is, or how it will meet it. But he did commit to engaging with industry and the media more regularly when the AB makes those decisions.

“Today, [based] on what we were asked to do, we are able to meet that demand,” he said.

New faces, same concerns

The daunting task of making CMMC work has come with its share of controversy and consternation from those it will impact. One of the most consistent criticisms has been a lack of communication and questions over conflicts of interest with the volunteer board members.

Johnson partially attributes the latter to "malicious influencers" spreading falsehoods or context-less information about the volunteer board. Regardless, he said the board will continue to increase its public engagements and work directly with industry to answer questions.

He also hinted at adding new ethics policies.

“From day one we have had conflict of interest policies in place. Those policies not only continue to be in place, but we are strengthening those as we go,” he said.