Matthew Travis hired as CMMC Accreditation Body CEO
The third-party board implementing the Department of Defense’s new cybersecurity standards for contractors finally has a CEO after months of searching.
Matthew Travis, a former deputy director of the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency, has been tapped to lead the Cybersecurity Maturity Model Certification Accreditation Body (CMMC-AB) — the organization tasked with overseeing the ecosystem of assessors who will inspect the IT networks of the 300,000 companies in the defense industrial base.
Travis will lead day-to-day operations of the CMMC-AB, a job that has largely been filled by the AB’s board of directors in the nearly 15 months since it was incorporated.
“We are extremely thrilled to have someone as respected and accomplished as Mr. Travis lead the Accreditation Body,” Board Chair Karlton Johnson said in a statement released Monday. “His organizational development skills, as well as his in-depth understanding of security and the federal government, will enable us to continue to quickly ramp up AB operations and execute against our mission in service of the nation’s defense.”
Travis joined DHS in 2018, supporting what was then the National Protection and Programs Directorate — CISA’s precursor. His work to stand up CISA within DHS was attractive experience for the AB, which is also a rapidly growing organization steeped in government work. Travis resigned from CISA in November 2020 after then-Director Christopher Krebs was fired.
A former naval officer, Travis also served as a White House liaison from the Office of the Secretary of the Navy in the late ’90s. His public resume shows work experience focused primarily on homeland security and counter-terrorism technology.
The AB had been searching for a CEO since the summer of 2020.
“Joining and leading the CMMC-AB is a tremendous opportunity. I look forward to using my collective experiences of running a security company start-up as well as my time at CISA, where I focused on supply chain risk, to ensure we mitigate risks as they relate to both the DoD and the contractor community,” Travis said in a statement. “There is no more important cyber mission right now than building a trusted, verified, and resilient cybersecurity ecosystem within the Defense Industrial Base.”
CMMC is the new requirement the DOD is phasing into contracts to certify companies’ cybersecurity and shore up its supply chain. The model is a tiered system in which contractors must pay for an assessment from a CMMC-AB-certified assessor, who will inspect the company’s networks and assign a score of 1 to 5 based on its ability to meet the security controls laid out in the CMMC model.
Since CMMC’s initial introduction in 2019, supply chain security has become an increasingly important topic in defense and government contracting, particularly following the SolarWinds supply chain breach that impacted a multitude of government networks.
“When we look at where true cyber risk currently resides, the CMMC mission is a critical component of the safety and security of our nation and its citizens,” Travis said.
DOD officials have spoken previously about their hopes that DHS, Travis’s former employer, will adopt the CMMC model or something similar to it for the supply chain of civilian agencies.
‘Small number’ of DHS email accounts accessed during SolarWinds breach
Hackers accessed a “small number” of Department of Homeland Security employees’ email accounts during the SolarWinds breach, according to an agency spokesperson.
The suspected Russian operatives successfully targeted then-acting Secretary Chad Wolf and cybersecurity threat hunters, forcing top DHS officials to get new phones with the encrypted messaging application Signal on them, the Associated Press first reported.
Still, the new details on the breach aren’t a good look for the department that’s home to the Cybersecurity and Infrastructure Security Agency, which just received a $650 million appropriation in the American Rescue Plan Act, in part, for SolarWinds recovery.
“The department no longer sees indicators of compromise on our networks and remains focused on further securing our networks against future attacks, integrating lessons learned from this incident,” the spokesperson said. “However, this widespread intrusion campaign has again shown that our strategic adversaries are sophisticated, persistent and have increasing capabilities.”
DHS activated response teams from CISA and the private sector to respond to the SolarWinds breach, which had compromised at least nine agencies by the time it was discovered in December. The department remains in contact with employees affected by the breach to provide guidance and services as the response continues.
CISA shared with the White House the SolarWinds Orion eviction guidance it sent affected agencies, as well as detection and forensics tools, as the administration considers how to modernize cybersecurity.
The Biden administration has yet to issue a promised executive order on cyber modernization, or name a national cyber director, and continues to mull its options for retaliating against Russia.
Meanwhile, CISA has shifted its focus to developing capabilities for monitoring the insides of networks for anomalous activity to defend against future supply chain attacks like the SolarWinds hack. And the tech industry called on the Office of Management and Budget and General Services Administration to proactively fund urgent cybersecurity projects with Technology Modernization Fund dollars.
AWS’ U.S. federal director talks about rapid response at agencies using the cloud
Brett McMillen has been at the forefront of the federal government’s move to cloud computing for the last decade. An engineer by training, McMillen has devoted most of his career to helping government agencies harness innovative IT solutions, working for a number of leading telecommunications companies before joining AWS in 2011, where he now serves as director of U.S. Federal for Amazon Web Services.

McMillen has been recognized for his work helping the federal government make important information, such as the 1000 Genomes Project, available as public datasets on AWS, and for working with federal officials to obtain FedRAMP authorization for AWS.
FedScoop caught up with McMillen recently to get his current take on how agencies are moving forward with the cloud.
FedScoop: Government agencies are constantly being asked to do more with less. In many ways, the pandemic revealed that agencies were able to do more than many thought possible. Where did you see some great examples of that among federal agencies?
Brett McMillen: If you had asked me where I thought we would spend most of our time and effort during the pandemic, I would have guessed places like FDA, CDC and NIH. And while we have done a lot of work to support those agencies, there have been other areas we didn’t expect. One example is the U.S. Small Business Administration, which was tasked with administering the Paycheck Protection Program under the CARES Act. We stepped in to help, and with the support of our partners, SBA was able to offer an additional portal that allowed more lenders to submit loans to the agency through its E-Tran system for the federal guarantee. Those loans provided financial support that helped small businesses keep workers on payroll and stay open for business.
Another example was the U.S. Census Bureau. While they’ve been conducting the census since 1790, it’s really been done with pen and paper. 2020 was the first year that people could participate in the decennial count online. While AWS started working with them years ago to bring the decennial census online, the pandemic made the modernization even more critical. At a time when the Census Bureau has scaled back some of its in-person data collection, the ability to collect responses online enabled Census operations to continue safely and efficiently.
FedScoop: In some cases, the economic shutdown that followed the pandemic resulted in a big fall off in fees for some governmental agencies. Where did you see technology helping agencies innovate or optimize operations in response?
McMillen: While some organizations had to scale up, others were experiencing less demand and needed to scale down. One example is U.S. Citizenship and Immigration Services. Since fewer people were applying for visas and other immigration benefits during the pandemic, the agency saw a large drop in the fees it was receiving and needed to scale down. Scaling down allowed USCIS to serve the people who needed it at a smaller expense. We saw that at multiple agencies.
The ability to scale both up and down is one of the benefits agencies see by moving to the cloud. Agencies can’t be sure when the next crisis will occur or what resources they’ll need. Having every system ready to either scale up or scale down is essential, so that they can meet those changing needs.
FedScoop: Where else did you and your support teams at AWS see the innovative or progressive use of cloud services?
McMillen: When the federal government first introduced its “Cloud First” policy and “25 Point Implementation Plan,” most agencies would choose a specific workload to try moving to the cloud. They saw success with these workloads, and began moving more. As agencies transitioned more workloads to the cloud, they saw a need to move into enterprise-wide engagements.
Now, rather than looking at each workload separately, agencies are asking, “How would you solve this mission problem with the latest and greatest technologies?” With AWS, agencies have access to our technologies and services, as well as technologies available from our partners that build on top of AWS. These technologies can help build new systems or improve existing applications.
For example, the Centers for Medicare & Medicaid Services helps distribute money to states to help people in need through programs like the Children’s Health Insurance Program. However, CMS didn’t have a good feedback mechanism to determine who was receiving those benefits; it wasn’t able to get really granular with the data and understand it. Now, we work with CMS to build applications that answer questions like, “In this zip code, are mothers with below-poverty incomes getting the prenatal care they need?” With these improvements, CMS is better able to understand the funding that goes out and how it’s serving citizens.
FedScoop: What in your estimation made the difference in the way agencies were able to mobilize so quickly and productively through the pandemic?
McMillen: The agencies that were able to pivot quickly had already made some fundamental IT improvements before the pandemic began. When a pandemic or any unforeseen event happens, you don’t know where you’re going to have to scale up and scale down. Making those investments and improvements in advance helped position them to move quickly to respond.
One example of an improvement that can be made in advance is setting up an acquisition vehicle for general operations, which allows anyone within that department to quickly and easily access modern cloud technologies. Agencies can also set up standard operating procedures and an authority to operate that any organization within the agency can leverage.
Having some of those fundamentals in place makes it easier to respond to unforeseen events and stand up a new system in the cloud. For example, AWS is working with agencies to build what we call landing zones — compliant environments that meet an agency’s security and acquisition needs.
It’s also helpful if government agencies continue to focus on workforce modernization and train employees on modern technologies on an ongoing basis. Then, when unforeseen events occur, their employees are already trained to quickly pivot and respond.
FedScoop: What lessons from the pandemic suggest ways agencies can still accelerate their modernization efforts even with limited budgets?
McMillen: Public service can be unpredictable, and transitions in leadership and administrations can bring about frequent change. So agencies should look at each of their systems and ask whether they are scalable and responsive to change. Unfortunately, too many of these systems were built a long time ago and don’t have that scaling capability.
However, IT modernization doesn’t have to mean boiling the ocean. Agencies can inject modern technologies into existing systems. Over time, this results in continuous modernization or continuous improvement of those systems. Agencies that adopt that approach are the ones that are able to pivot very quickly.
As we’ve already seen, this pandemic isn’t going to end overnight. It’s still going to be a long process. Agencies should use this process to modernize and make long-term improvements to their systems, which will improve the delivery of services to the American people.
VA secretary worried by productivity issues, rising costs of $16 billion EHR rollout
Department of Veterans Affairs Secretary Denis McDonough told lawmakers that he has concerns with the productivity of the department’s project to deliver a modernized electronic health record and that it could end up costing more than anticipated.
McDonough’s fears come from the VA’s struggles at a Pacific Northwest VA medical center that was the first across the department to roll out the new EHR. The medical center’s issues lie in how it is delivering care after clinicians and other medical staff are trained on the new cloud-based EHR system, the secretary said during a House hearing last week.
McDonough, new to the VA secretary role, recently issued a statement saying he would review the system and its rollout at the Mann-Grandstaff Medical Center in Spokane, Washington.
“At the end of the day, this is about service provision and outcome for [veterans],” he told the House Committee on Veterans’ Affairs in his first testimony as secretary. “That’s got to be what drives this.”
The program is designed to migrate billions of VA medical records to a Cerner-built cloud and deliver all-new EHR interfaces for medical workers to replace the legacy Veterans Health Information Systems and Technology Architecture. The system will eventually link into the Department of Defense’s Military Health System GENESIS modernization program to achieve interoperability between the two agencies to ease the transition for retiring service members.
McDonough said he’s also concerned by the department’s rate of spending on the modernization project, the number of people required to deploy the system and the scope of what contractor Cerner is being asked to do on the 10-year, $16 billion contract.
The VA’s plan for spending on the EHR is that it will be “front-loaded” and peak this year — but McDonough said he has concerns that the overall program will end up costing more than anticipated as more people will likely be required at deployment.
“It does appear to be requiring a lot more people on the target in Spokane,” he said.
McDonough’s recent attention on the EHR comes after lawmakers and other watchdogs raised issues with the program. In February, the Government Accountability Office recommended VA pause the project’s rollout while systems are tested to ensure they do not fail. And earlier this month, Rep. Cathy McMorris Rodgers, R-Wash., penned a letter detailing veterans from her district who experienced “dangerous” delays in medical care after the rollout.
McDonough’s early disappointments may be assuaged in the coming weeks as the strategic review gets underway. He showed openness to keeping much of the program the same but possibly tweaking some of the schedule and plans for the rollout.
“I don’t think that we fundamentally change the program in Spokane,” he said when asked what the outcome could be.
While on the topic of digital modernization, McDonough also told lawmakers during the hearing that the VA plans to increase its reliance on timely data and build out a data strategy to support new efforts.
“We will rely on data more heavily in the future than we ever have before,” he said.
The VA has some artificial intelligence-based projects underway that heavily rely on data, like one to identify veterans who might be at risk for suicide. McDonough told lawmakers he wants to increase the timeliness of suicide data, which currently lags two years behind.
CBP revisits use of body-worn cameras at the border
Customs and Border Protection is the latest Department of Homeland Security agency to once again kick the tires on body-worn cameras.
The agency wants details on available cameras, video management and redaction software, and cloud storage that can support the incident-driven video recording system it’s already deploying, according to a request for information (RFI) issued Friday.
That system is being piloted at select border crossings to record CBP agents’ interactions with the public where there are no fixed cameras.
“CBP anticipates storing most footage in the cloud while maintaining government ownership of the data indefinitely,” reads the RFI. “Footage is considered law enforcement-sensitive data and therefore must be stored in accordance with federal laws, regulations, and requirements.”
Information CBP wants from vendors includes whether their body-worn cameras are Federal Risk and Authorization Management Program (FedRAMP) authorized, as well as their ability to redact video, automatically activate, link to a mobile application, expedite cloud uploads, and secure video.
CBP also wants cost estimates for between 5,000 and 40,000 units with submissions due April 2.
The Secret Service, also within DHS, put out an RFI in February on body-worn cameras in advance of a contract for 2,000 units it hopes to put up for bid.
While CBP has looked into using body-worn cameras previously, governmentwide interest in the technology has increased following the Department of Justice’s decision to permit them on federal task forces in October.
Army building new All-Domain Operations Centers
The Army is creating its first “All-Domain Operation Centers” designed specifically to enable a more connected way to fight, Army strategists told reporters Friday.
ADOC construction is being housed within the service’s Multi-Domain Task Forces — units stood up at specific bases to implement modernized technologies and the concepts of operations that use them. The overall goal of the task forces and new operations centers is to enhance data-sharing, linking new weapons together to increase their range and precision and speeding the decisions commanders make about what to deploy.
The All-Domain Operation Centers are part of the Army’s work in support of the Joint All Domain Command and Control concept of operations that will link all domains and all the services together through a military Internet of Things.
“The nascent ADOCs…will provide combatant commanders the ability to make decisions faster from the onset,” Col. Jason Charland, military deputy to the Department of the Army’s Management Office-Strategy, Plans and Policy, said on a call with reporters Friday.
The ADOCs are important as they will create new central locations for commanders to see data from the field and provide faster analysis with the sensors linked together. The hope is to build them as “joint from inception,” to enable other services, not just Army commanders, to oversee operations, Charland added.
The operations centers would serve as endpoints overseeing a massive undertaking of data curation and synthesis. The backend of the system is the focus of Lt. Gen. Dennis Crall, CIO for the Joint Chiefs of Staff, and others in the DOD data community trying to link many types of data together.
“One of the biggest hurdles is really the common architecture and the data models,” Charland said. “To be able to communicate at speed and scale is a wicked problem.”
The Multi-Domain Task Forces that will house the ADOCs were recently stood up at Joint Base Lewis-McChord in Washington state. Two more are expected to be named in Europe and the Pacific — two regions of high importance for the Army’s focus on great-power competition.
Air Force develops maturity model for zero trust across the department
The Air Force is developing a maturity model to help broaden its implementation of zero-trust principles in the foundation of its network architecture, its top IT official said Thursday.
The Air Force has found success with initial zero-trust projects, like Platform One, the service’s DevSecOps initiative, whose architecture grants no implicit trust or broad access to any user, whether known to the network or not. Now, the Air Force is trying to move beyond individual projects to implement zero-trust principles at the enterprise level, Lauren Knausenberger, Air Force CIO, said Thursday during a Dcode event.
“The vision is for the future to be completely zero trust…where we are able to collaborate seamlessly with all of our allies,” she said.
The maturity model will help network administrators and IT professionals across the Air Force bring their architectures in line with zero trust. The model highlights critical elements of the process like ensuring proper data tagging and access management. The Air Force is also working on an enterprise identity, credentialing and access management (ICAM) certification to be able to more securely recognize users.
“We have these little pockets of zero trust, but we are also doing some basics right now,” Knausenberger said.
The maturity model will serve as part of the Air Force’s “road map” to zero trust. It’s unclear how long the journey will take, but tech leaders in the department have been talking about zero trust for months, especially during the pandemic.
“We have a road map there,” Knausenberger said, adding “that we have to do a better job of funding the road map.”
Knausenberger also made some news about Platform One, which is building a secure environment for companies to use once they have received Small Business Innovation Research contracts to work with the Air Force. This will allow contractors to work on more sensitive projects without having to invest in their own government-approved, secure systems.
OMB restores data-driven goal setting requirements that Trump nixed
The Office of Management and Budget restored data-driven goal setting requirements for agencies, with an increased emphasis on addressing health and economic challenges caused by the COVID-19 pandemic.
An OMB memo issued Wednesday reinstated the government’s performance improvement and service delivery framework, which expects senior leaders at agencies to set ambitious goals, hold regular progress reviews and publicly report the results. The memo also brings back customer experience tools.
The Trump administration eliminated the accountability measures in December, citing a lack of public interest in the thousands of pages of performance data posted to Performance.gov annually. But many officials saw the move as an attempt to undermine the initial success of the incoming Biden administration.
“Agencies were clear, and unanimous, in their desire to have the earlier framework reinstated,” wrote Pam Coleman, associate director of performance and personnel management at OMB, in a blog on the changes. “Government performance priorities across multiple administrations have shown significant improvements when the related agency priority goals have received sustained leadership attention with clear definitions of success, collaboration across organizational boundaries and support from the Congress.”
The departments of Veterans Affairs and Housing and Urban Development set a joint agency priority goal to decrease veteran homelessness and did so by 47% between 2010 and 2016. Similarly, the Treasury Department reduced paper transactions for benefits by about 90% in 10 years, and the Department of the Interior increased water conservation and reclamation by 10% out west in two years of setting agency priority goals.
Public interest aside, Government Performance and Results Modernization Act management practices increase data-driven decision making within agencies, Coleman wrote.
Performance.gov will once again be updated quarterly with progress reports, regardless of whether agencies fall short of their objectives.
“The removal of Part 6 from Circular No. A-11 in December 2020 threatened to disrupt strategic and performance planning across federal departments and agencies,” reads the memo. “These activities are critical to clearly defining the outcomes the Federal Government aims to achieve, using feedback from our customers to improve service delivery, and being transparent about agency results.”
Pandemic Analytics Center of Excellence will help IGs investigate fraud
The committee of federal watchdogs tasked by Congress to oversee emergency pandemic spending is developing a center of excellence that uses data analytics to combat COVID-19 relief fraud.
The Pandemic Analytics Center of Excellence, created by the Pandemic Response Accountability Committee (PRAC), will provide the federal inspectors general (IG) community with fraud-fighting tools allowing them to share data analytics and practices to assist with their audit and investigative work.
So far, IGs have managed to return or seize only $2.5 billion of the $84 billion in potential fraud committed across the Small Business Administration’s Paycheck Protection Program (PPP) and Economic Injury Disaster Loan (EIDL) programs, according to a memo released by the House Select Subcommittee on the Coronavirus Crisis on Thursday.
“In order to fulfill the PRAC’s mission we need better technological tools for IGs and our oversight partners, including the use of advanced data analytics,” said PRAC Chair Michael Horowitz, IG of the Department of Justice, during a subcommittee hearing the same day.
The Coronavirus Aid, Relief, and Economic Security (CARES) Act created the PRAC, which comprises 22 IGs from across government charged with ensuring that federal funds, like those in the CARES Act and the American Rescue Plan Act, are spent properly. The committee is working with the Office of Management and Budget and other agencies to address fraud data gaps.
Despite the $84 billion potentially lost to fraud, Republicans on the House subcommittee touted the fact that SBA used the PPP and EIDL programs to quickly deliver $910 billion in COVID-19 relief to small businesses.
“[T]his subcommittee is focused on attempting to tear down a bipartisan program that kept the economy afloat during the early and toughest days of the pandemic,” said Rep. Jim Jordan, R-Ohio, the ranking member. “We all agree fraud is bad. But we should all agree that a 99% success record is unprecedented, and we have President Trump to thank for that.”
But $84 billion in fraud would be more than 9% of the money SBA distributed, and the agency’s IG found the Trump administration ignored fraud flags, awarded loans with little to no vetting and abandoned a rule that two employees approve applications in the case of EIDL. Proper controls were added to SBA’s electronic loan application system, E-Tran, too late in the case of PPP.
Then-Treasury Secretary Steve Mnuchin cited the need for speedy loan delivery as the reason for the problems that inevitably arose at the time.
“Let me be clear: That is a false choice,” said Rep. Jim Clyburn, D-S.C., chair of the subcommittee. “Americans should not have to and did not have to choose between quickly getting aid during a crisis and preventing the theft or waste of billions of tax dollars.”
SBA has yet to conduct a formal fraud risk assessment for PPP or EIDL.
COVID-19 relief fraud investigations will continue for a decade because that’s how long the loans will be in SBA’s portfolio, said Mike Ware, IG at SBA.
“As we continue to address our processing backlog, we will employ data analytics to further triage and guide these efforts,” Ware said. “Data analytics have made a difference in our office’s ability to keep our stakeholders currently and fully informed in a timely manner.”
The SBA Office of Inspector General overlaid its data with the Treasury Department’s Do Not Pay list and found “quite a bit of money” went to people who should never have been paid, Ware said. Analytics also helped catch duplicate PPP payments.
A problem with lists is that the fraudsters on them quickly learn to steal other people’s identities in order to continue their work. And identity theft was prevalent in EIDL fraud cases and reared its head with PPP loans as well, Ware said.
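The overlay Ware describes amounts to joining loan records against a deny list and checking for repeat payees. A minimal sketch of that style of analytic, in Python — illustrative only, with hypothetical record fields and function names, not SBA OIG’s actual tooling:

```python
# Illustrative sketch of deny-list matching and duplicate-payment detection.
# The record layout ('applicant_id', 'amount') is a hypothetical simplification.
from collections import Counter

def flag_applications(applications, do_not_pay_ids):
    """Return (denied, duplicates) for a batch of loan application records.

    denied:     applications whose applicant appears on the deny list
    duplicates: applications from any applicant who submitted more than once
    """
    denied = [a for a in applications if a["applicant_id"] in do_not_pay_ids]
    counts = Counter(a["applicant_id"] for a in applications)
    duplicates = [a for a in applications if counts[a["applicant_id"]] > 1]
    return denied, duplicates

apps = [
    {"applicant_id": "A1", "amount": 10_000},
    {"applicant_id": "A2", "amount": 25_000},
    {"applicant_id": "A2", "amount": 25_000},  # duplicate submission
    {"applicant_id": "A3", "amount": 5_000},
]
denied, dupes = flag_applications(apps, do_not_pay_ids={"A3"})
```

In practice such matching runs on messier identity attributes rather than clean IDs, which is why, as Ware notes, stolen identities let fraudsters slip past list-based screening.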
Some in the tech industry have advocated for automated screening as a fraud deterrent, in lieu of lengthy, inefficient investigations after the fact.
But the first three supplemental appropriations SBA OIG received to improve COVID-19 relief oversight were put toward recruiting auditors, analysts and criminal investigators; EIDL fraud investigative staff and data analytics; and increasing investigative capacity, Ware said.
A total of $142 million was allocated to the oversight community in the American Rescue Plan Act passed earlier this month.
“The Biden administration and Congress have also worked together to ensure that critical oversight bodies like the PRAC, [Government Accountability Office] and IG community have the resources and tools they need to do their jobs,” Clyburn said.