CMMC board gets first permanent chair since September

The accreditation body overseeing the rollout of the Department of Defense’s new Cybersecurity Maturity Model Certification standards has a permanent chair for the first time in months.

Karlton Johnson, the former vice chair who has been acting chair since September, was elected to lead the Cybersecurity Maturity Model Certification Accreditation Body (CMMC-AB) as it oversees the DOD’s contractor cybersecurity verification program. Since its incorporation a year ago, the board has weathered ousters and turnover that have stymied some of its progress.

The board plans to announce Johnson’s confirmation Tuesday night during a town hall meeting, giving some AB members optimism for its future. The margin of the vote to confirm Johnson’s leadership is unclear.

Johnson, a former Air Force colonel with years of experience leading the operation of networks in hostile combat zones, was elevated to acting chair when former chairman Ty Schieber was ousted in September following the creation of a sponsorship program that some saw as a pay-to-play scheme.

During the months Johnson served as acting chair, several members publicly and privately expressed trust and confidence in his leadership of the AB. In public appearances, DOD officials have also expressed support for him and described a solid working relationship.

The AB exists as a separate entity from the DOD program office that houses CMMC but works closely with the department through a no-cost contract. The board’s primary duty is to oversee the ecosystem of assessors, trainers, educators and consultants that will verify contractors are meeting their cybersecurity requirements. The new CMMC requirements span five levels of cybersecurity maturity that all future contracts will be required to meet, depending on the sensitivity of the information involved. The scale begins at level one for non-sensitive information and culminates at level five for the most sensitive controlled unclassified information a contractor might handle on its network.

The AB needs to accredit enough assessors to efficiently vet the 300,000 contractors that will eventually need CMMC certification. Without a sufficient supply of assessors, demand could push the cost of an assessment out of reach for small businesses, which also may need to spend more to improve their cybersecurity hygiene to meet new standards.

The AB plans to outline its recent success setting up that marketplace in the town hall Tuesday night. Representatives from the DOD will also detail what initial contracts will have CMMC requirements. All DOD contracts will have a CMMC requirement come 2025, with the rollout ramping up over time.

Gen. Murray makes pitch to stay the course on Army modernization

The Army’s modernization priorities remain in place with a new administration taking power and Department of Defense budgets unlikely to grow in the coming years, according to a top Army general.

Army Futures Command, which was stood up during the Trump administration to focus on the service’s modernization efforts, has big plans for its network modernization initiatives this coming year, its commander, Gen. John Murray, said Monday during a Center for Strategic and International Studies event. And though the Army wants to keep pouring resources into the effort over a multi-decade cycle, bitter battles for money are likely to play out in the coming years, Murray said.

“We just cannot afford to pass up this opportunity,” Murray said of the work started by the command.

Futures Command has six stated modernization priorities: “long-range precision fires, next-generation combat vehicles, future vertical lift, network, air and missile defense, and soldier lethality.” Murray said those won’t change, and he will keep fighting for his slice of what is likely to be a zero-sum budget battle so the command can continue large-scale experiments. For now, it’s unclear how those efforts will mesh with priorities from the top, as new Secretary of Defense Lloyd Austin, a retired general, has yet to say what his technology and modernization priorities will be.

Futures Command’s network modernization effort centers on working more effectively with data and getting more compute to the tactical edge. This year the network modernization team will work to bring its U.S.-based network architecture to the tactical edge, Murray said, part of the “unified network operations” concept the command is striving toward.

The long-term goal of all the military services is to get every sensor linked to every shooter — or at least the right sensors to the right shooters, instead of the fragmented and sluggish tactical networks currently in use. The Army’s contribution to this is Project Convergence, which held its first large-scale test in September 2020. Future tests will continue, along with work on one of the key takeaways from the first event: getting more coders into the field.

“We are going to have to have the ability to write code at the edge,” Murray said.

As a means to reach that goal, the command recently stood up a software factory. The factory was announced in the summer of 2020 and Murray said its first coders have been put into cohorts in recent weeks. The command is also working on getting soldiers through an AI master’s degree program co-sponsored by Carnegie Mellon University, where the Army AI Task Force is located.

HHS seeks formal approval for emergency COVID-19 portal

The Department of Health and Human Services is seeking formal approval for the portal it created on an emergency basis to collect daily COVID-19 data from about 5,500 hospitals.

HHS posted an information collection request (ICR) Friday in the Federal Register for its Teletracking COVID-19 Portal, which in July replaced the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN) as the system for collecting hospital coronavirus data.

The portal allows hospitals to directly report coronavirus data requested by HHS on patients tested, bed capacity and supply requirements. The data then informs the government’s understanding of the spread and helps craft prevention and control policies.

“We acknowledge the burden placed on many hospitals, including resource constraints, and have allowed for some flexibilities, such as back-submissions or submitting every business day, with the understanding that respondents may not have sufficient staff working over the weekend,” reads the request. “It is our belief that collection of this information daily is the most effective way to detect outbreaks and needs for federal assistance over time, by hospital and geographical area, and to alert the appropriate officials for action.”

The ICR estimates each hospital’s daily reporting burden at 1.5 hours — more than 3 million hours among all contributors in a year.
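For readers checking the math, a minimal back-of-the-envelope sketch follows; it assumes roughly 5,500 reporting hospitals, 1.5 hours per hospital per day and daily reporting year-round, which are approximations rather than the ICR’s exact figures.

# Rough check of the reporting-burden estimate. Inputs are approximations
# (about 5,500 hospitals, 1.5 hours each per day, 365 reporting days),
# not the ICR's exact assumptions.
HOSPITALS = 5_500
HOURS_PER_HOSPITAL_PER_DAY = 1.5
DAYS_PER_YEAR = 365

annual_hours = HOSPITALS * HOURS_PER_HOSPITAL_PER_DAY * DAYS_PER_YEAR
print(f"Estimated annual burden: {annual_hours:,.0f} hours")
# Prints roughly 3,011,250 hours, consistent with "more than 3 million."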

Required data elements were last updated in Jan. 12 guidance, at the request of the White House Coronavirus Task Force. HHS acting Chief Information Officer Perryn Ashmore and the CDC are working with the U.S. Digital Service and with states and localities to harmonize them.

Refining the collected hospital data remains critical to ongoing federal coronavirus response efforts, with the Teletracking COVID-19 Portal being used to determine allocations of limited personal protective equipment (PPE) and medicine.

In at least one case, HHS has tried to reduce the reporting burden. The department has ceased sporadic requests for data from hospitals aiding its distribution of Remdesivir, which limits the effects of COVID-19. But overall, consolidated daily reporting is still needed for accurate distribution of other items, according to the ICR.

People have until Feb. 22 to comment on the necessity of the portal, the estimated reporting burden and ways to improve collection, such as automation.

Generally, agencies must obtain ICR approval from the Office of Management and Budget before collecting information from 10 or more members of the public, but the urgency of the pandemic allowed the portal to be created without it. If OMB does not approve the ICR, data collection would have to cease.

Air Force builds digital twin for weapons with ABMS

Worlds have collided — the Air Force is using its Advanced Battle Management System (ABMS) to build digital twins of its weapons systems, using one futuristic initiative to achieve the other.

ABMS is the Air Force’s stab at building an Internet of Things for war: a network of networks with data flowing from sensors to artificial intelligence-enabled systems for quick decision-making. Now, some of that data is feeding into a related key initiative: building digital twins.

Digital twins provide software-defined replicas of physical objects, taking into account the physical toll of operations and giving users a heads up on malfunctioning parts or needed repairs.

In the case of this project, called weaponONE, as data generated from weapons systems flows across ABMS’s architecture, it can feed those digital twin models, improving their accuracy and usefulness. The project is named in line with other parts of the ABMS family, like Cloud One, Platform One and so on.

“The Digital Twin Lab represents the ultimate expression of digital engineering, acting as a force multiplier, giving us tremendous flexibility and adaptability to our weapons systems,” said Col. Garry Haase, director of the Air Force Research Lab’s Munitions Directorate.

Digital twins are a critical part of digital engineering, in which the 1s and 0s of code are tested in digital environments that mirror real-world conditions before physical prototypes are made. Both ABMS and digital engineering were top priorities for Will Roper, the recently departed head of acquisition, technology and logistics for the Air Force. He would seemingly levitate with excitement when speaking on both topics, saying that digital engineering was the closest thing to “magic” he had ever seen.

“I want to build everything this way,” he had said.

The weaponONE project has multiple capabilities beyond feeding data to digital twins, according to the Air Force Research Lab. They include a software factory/DevSecOps pipeline for developing simulations, on-board flight software and a government-owned, cloud-based tech stack.

Having a government-owned tech stack — all of the technology systems from cloud storage to coding platforms — has been one of the sticking points for perfecting digital engineering. Roper said he wanted private companies to use a government tech stack that uses Cloud One and Platform One offerings like weaponONE does.

The Air Force’s recent breakthrough with weaponONE also offers some hope for ABMS’s uncertain future. Congress has looked at the program skeptically, asking for more strategy and detail on the acquisition process. It’s unclear how the Biden administration will handle the program, which is now housed in the service’s Rapid Capabilities Office.

“This demonstration is just the first of many that will come as we move the weapons enterprise into an era of digital engineering, accelerating weapons development and improvement at the speed of relevance,” said Craig Ewing, senior scientist for AFRL’s weapons modeling and simulation directorate.

GDIT’s global CIO talks about keys to ensuring cloud rollouts meet your agency’s mission

As GDIT’s global chief information officer and vice president for supply chain, Kristie Grinnell leads the company’s enterprise IT strategy and initiatives. Having helped move GDIT to the cloud, Grinnell now works with federal agencies to support their enterprise IT transformation initiatives. Grinnell began her career as a manufacturing engineer at GM and turned her analytic skills into a series of high-profile positions: chief of staff and global IT strategy executive for PricewaterhouseCoopers, later director of planning and governance and director of client delivery enablement for CSC, and now CIO for GDIT.

In this FedScoop interview, Grinnell highlights some of the critical considerations agencies should keep in mind as they move more of their IT operations into a hybrid cloud environment.

FedScoop: We hear people talk often about hybrid cloud environments.  How would you describe what makes up a hybrid cloud environment? And why is it important to government agencies?


Kristie Grinnell, Global Chief Information Officer and VP, Supply Chain, GDIT

Grinnell: Hybrid cloud actually means different things to different people as cloud has evolved over time. First and foremost, I think about it in terms of what as-a-service are you getting. Hybrid cloud could be infrastructure-as-a-service, software-as-a-service, platform-as-a-service, database-as-a-service, you-name-it-as-a-service. You could get a little bit of software-as-a-service from here and infrastructure-as-a-service from there.

The other way you can consider hybrid is it could include on-prem — something I do in my own data center — and something I get from a cloud service provider. It could be a service within a public cloud or a private cloud, like the community clouds that we see in the government’s secret clouds.  And then it could be services from multiple cloud providers — AWS, Azure, Google, Oracle, IBM — and the services they have.

But hybrid environments are important to our government agencies, because the cloud is not one-size-fits-all. Depending on your data requirements, your security requirements, where your end-user is coming from as they access whatever workloads you’re putting into the cloud, and the interoperability you need with your other systems, agencies need to be able to choose the right [mixture of] environments for the cloud, or for the job. It is rare that you would find one cloud is the best for all of your data and workloads. And government agencies need to be able to manage and govern these multiple environments as one hybrid enterprise, seamlessly and efficiently.
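To make that composition concrete, here is a minimal sketch of how an agency might describe a hybrid estate; every workload name, provider and placement below is hypothetical, offered only to illustrate mixing service models, on-premises infrastructure and multiple providers.

# Hypothetical inventory of a hybrid cloud estate. Names, providers and
# service models are illustrative only, not a real agency architecture.
hybrid_estate = {
    "case-management-app": {"provider": "AWS", "model": "IaaS"},
    "email-and-collaboration": {"provider": "Azure", "model": "SaaS"},
    "analytics-platform": {"provider": "Google", "model": "PaaS"},
    "legacy-records-system": {"provider": "on-prem data center", "model": "self-hosted"},
}

# Managing these as "one hybrid enterprise" means governing them together,
# for example by listing which workloads each provider is responsible for.
by_provider = {}
for workload, placement in hybrid_estate.items():
    by_provider.setdefault(placement["provider"], []).append(workload)

for provider, workloads in sorted(by_provider.items()):
    print(f"{provider}: {', '.join(workloads)}")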

FedScoop: Technology and services available via the cloud keep evolving at blazing speed. How can government agencies ensure they’re taking proper advantage of these services to further their mission, while ensuring appropriate security and contractual support?

Grinnell: You hit the nail on the head. Speed and agility are obviously compelling reasons to move to the cloud. However, the type of data, and how and by whom it can be accessed, should be your first consideration. Once you’ve ensured the security of the environment, then you can start thinking about how you can take advantage of all of the other technologies in the cloud with speed and agility. An enterprise architecture — a secure, standardized way of getting to whichever hybrid cloud environment you want — allows you to build on the speed of the cloud. Standards around how you operate in the cloud, and leveraging containerization so you have portability, let you plug and play within cloud capabilities and change out services as you need them.

Because cloud capabilities and technologies are changing so quickly — Cloud Service Providers (CSPs) are introducing new capabilities in their cloud environments with every release that they do — you need to be prepared to use them or say that you don’t want to. And a good, sound enterprise architecture helps you make this decision.

Then as you define the service catalog that you need within each of the clouds, you can really create that ability to move at the speed of the cloud, which is why we’re all there, right? We want to have the agility and the speed of moving with the technology and capabilities that are available for us in the cloud. This is where a contractor and the right partner comes in: You need a contractor that understands each CSP’s capabilities; a contractor that can secure your cloud environment; and a contractor that helps to build the right technology and process standards to unleash the power of the cloud.

Having that deep mission understanding, you can start to connect the dots: For this customer mission, with these needs, I would use this cloud in order to get you there because it provides you this level of security or this level of agility or this level of capability that enables the customers’ mission. And GDIT does all of that being a partner to all of the major cloud service providers.

FedScoop: Large corporations and government agencies, with hundreds of thousands or even millions of users, have many diverse needs as well as sub-entities moving at different paces. How do you align a large constituency like that around the cloud? And how does that affect your technology solution?

Grinnell: This is every CIO’s challenge, right? Number one, we’re stuck with this legacy technology stack that has grown over time — sometimes through acquisition, sometimes trying to do things on the cheap to get the job done quickly — or something that has just evolved without a real strategy. So you have all this technical debt. And you must think through, “What is that going to look like and what is my roadmap to get there?” Then you have all of these customer needs that are very different — and different viewpoints of what the cloud can offer you and which cloud you should go to.

But we’ve learned there are a couple things you have to do. The first is, by putting in place that enterprise architecture we talked about, you have some standards of how you’re going to bring people to the cloud, use the cloud and which clouds you should use in order to meet that customer mission.

Secondly, you have to realize that you have to meet all of those needs in a different way. And this is where it’s not about technology. It’s really about organizational change. And how do you bring the hearts and minds along to help your customer feel comfortable and understand that the cloud is secure, because we have built a sound enterprise architecture with the right cyber controls to ensure it is; that you’re not going to lose complete control, because we have chosen the right cloud experience for you based on your needs; that this isn’t going to break your budget, because we are buying what we need, possibly shutting down applications and releasing data center real estate costs we no longer need; and the more exciting part, which is that we are going to keep pace with technology and move at the speed of the cloud service provider rather than be stuck with our technical debt. We can now leverage advanced technologies for you in the cloud. And that you don’t have to do it on your own.

All of those things have to be brought together so that your customer, as you’re moving them to the cloud, feels that their needs are going to be met. And focus on the mission. Focus on the warfighter. Focus on the citizen. Understand what those needs are and match them up to the right cloud experience. Put that strategy and roadmap in place to drive it, and the organizational change construct to get there, and you can start to move faster every time.

FedScoop: Government agencies have highly sensitive workloads. How can agencies maintain control of their data and ensure security that does not risk unexpected disclosure?

Grinnell: This is where you have to really understand your data first. Data is what we’re securing at the end of the day, and your access to that data. Then you choose which cloud is applicable for the security needs and user experience of that data, and the type of security controls that you need to put in place. Do I need it to be FedRAMP certified? Do I need it to be IL (Impact Level) 5 or IL6? Once we get that, then you can really think about that enterprise architecture again.

Most agencies — not all — are going with the concept of zero trust, where we are thinking about having to validate user access before you give them access to your data, which gives you that sense of control … before they come in through those security controls.

And then, thinking about your experts managing all of these services and the mind-shift that has to take place, from what on-prem security would look and feel like versus when you go to the cloud. You want to make sure you’re not thinking physical security and how you would work in your own on-premise data center — but switch your thinking to virtual security. You have to secure how you connect to the cloud. You have to secure your application in the cloud. You have to secure access to data within the cloud and applications that connect to what is in the cloud. This is the enterprise architecture and standards you build to secure, integrate and operate within the cloud.

FedScoop: The cloud’s elasticity — and the AI and machine learning services available from cloud service providers (CSPs) — offer never-before-seen opportunities to understand massive amounts of data. What are some of the challenges and solutions to realizing the benefits of agency data in the cloud?

Grinnell: That’s right. This is where GDIT’s expertise really comes into play, because with the amount of cloud instances we have and cloud professionals we have, we know how to take advantage of those opportunities. Many of our customers value the fact that we are able to do this in an agnostic way. We’re in a position to give them an unbiased view of what’s best for them at a given point of time and help them navigate between CSPs.

That’s important for several reasons, but in particular, to avoid a degree of vendor lock-in and making sure they get a rich suite of services. GDIT is not only a technical resource, but we’re mission-savvy. We understand our customers’ business and can help advise them on ways in which they can augment their rules and workforce to be able to optimize the cloud with a common set of tools.

Security is always key. Before you can use any of those opportunities, you have to think about the security that’s involved with them and about getting the required level of access.

Then once you have all of this data in the cloud, you have all of this scalable, agile storage and compute power with out-of-the-box capabilities that come with each different service provider. AWS provides something different than Azure which provides something different than Oracle — and each one of those capabilities should be thought through as to how you want to use that data and take advantage of the power of the cloud. This could simply mean more storage for your security logs in the cloud or it could mean more calculations that can run faster to analyze those logs. Or maybe both.

So the sky is really the limit. And we’re doing some really cool things with different customers in the cloud, based on the need that they have and the data they have. If you tried to do that in your own on-prem data center, you’d be waiting years to get enough storage and compute power, because you’d have to procure the hardware, set it up with all the cables and wires, install and configure the right software and test it with the security and all of those things. So it really gives you a great opportunity.

Learn more about how GDIT is helping agencies design systems that deliver increased speed, savings and security.

This article was produced by FedScoop, and underwritten by GDIT.

State Department names new CIO

Keith Jones has been appointed the new chief information officer at the State Department, with the rank of an assistant secretary of state, according to a department spokesperson.

Jones has a deep background in federal IT, including a job as deputy chief information officer for U.S. Citizenship and Immigration Services from 2012 until 2018, according to his personal LinkedIn profile.

After leaving the federal government in 2018, Jones founded The Edgewater Group DC, an IT consulting firm. He also served as a principal consultant at Deep Water Point.

The State Department is the first of the seven agencies with politically appointed CIOs to name a permanent one in the new presidential administration. Jones’ predecessor, Stuart McGuigan, was known for supporting the department’s shift to telework during the coronavirus pandemic, as well as other IT modernization efforts.

Acting CIOs have been named at the Department of Veterans Affairs, Department of Defense, Department of Homeland Security, and Department of Transportation.

Biden’s COVID-19 strategy promises data-driven response, improved IT interoperability

President Biden promised data personnel, technology and other resources will be devoted to making IT systems for tracking COVID-19 and future outbreaks interoperable at all levels of government as part of his national response strategy released Thursday.

The word data appears 172 times in the 198-page document, and the strategy’s third goal of seven explicitly calls for mitigating the spread of the coronavirus by expanding data, among other things.

Biden’s data-driven response plan contrasts starkly with former President Trump’s lack of one, the latter eschewing a centralized, science-based, equitable approach to combating the pandemic.

“The honest truth is we are still in a dark winter of this pandemic. It will get worse before it gets better,” Biden writes in a letter introducing the strategy. “Progress will take time to measure as people getting infected today don’t show up in case counts for weeks, and those who perish from the disease die weeks after exposure.”

To better evaluate progress, the strategy proposes establishing the White House COVID-19 Response Office to coordinate federal agencies, as well as the creation of publicly accessible performance dashboards for tracking metrics like cases, testing, vaccinations and hospital admissions.

Biden already issued an executive order to enhance agencies’ collection, production, sharing and analysis of data. And the Centers for Disease Control and Prevention will maintain a public dashboard of county-level cases, according to the strategy.

In addition to providing state, local, tribal and territorial governments with resources for developing more transparent, interoperable IT systems, the Biden administration is promising to send data support teams to agencies that need extra help providing timely, reliable data.

“The scale of the COVID-19 pandemic in the United States requires improved data systems that can manage large volumes of data, and connect with legacy and new data systems created in response to the current crisis,” reads the strategy.

States and localities still struggling with antiquated systems will be the focus, particularly those with problems connecting to testing laboratories and public health agencies for case investigation, contact tracing and isolating potentially exposed people. The federal government will provide surge personnel to help with manual processes in the short term and develop data technologies to automate those processes long term, according to the strategy.

A successful vaccination campaign needs better data systems as well. The CDC will track vaccine distribution working with states and localities, while the Food and Drug Administration is expected to make data on vaccine safety and efficacy public and improve its systems for real-time safety monitoring.

Vaccination progress will be measured by tracking states against their targets, assessing vaccination sites’ reach and comparing administered doses to those allocated. Among those who are vaccinated, the percentage that receive the initial dose and full regimen will be monitored, as will the timeliness of completion, reasons for dropping out of or refusing vaccinations, and issues with handling like temperature abuse and stock-outs.
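As an illustration of the kinds of calculations those metrics imply, here is a minimal sketch; every field name and figure below is hypothetical and not drawn from the strategy or any real reporting system.

# Hypothetical state-level vaccination figures; all numbers are made up
# to show the ratio-style metrics the strategy describes.
state = {
    "doses_allocated": 500_000,
    "doses_administered": 380_000,
    "target_administered": 400_000,
    "people_first_dose": 300_000,
    "people_fully_vaccinated": 80_000,
}

utilization = state["doses_administered"] / state["doses_allocated"]
progress_vs_target = state["doses_administered"] / state["target_administered"]
full_regimen_rate = state["people_fully_vaccinated"] / state["people_first_dose"]

print(f"Doses administered vs. allocated: {utilization:.0%}")          # 76%
print(f"Progress against state target: {progress_vs_target:.0%}")      # 95%
print(f"First-dose recipients fully vaccinated: {full_regimen_rate:.0%}")  # 27%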

Software systems critical to collecting this data will be integrated and streamlined by a cross-functional team of experts working with the Department of Health and Human Services and CDC, as well as states and localities.

Health workforce mobilization and vaccination appointment scheduling may require new technology solutions the federal government is expected to develop. And improving the resilience of pandemic-related supply chains for personal protective equipment like masks will require improved surveillance and data systems, according to the strategy.

Longer term, the strategy proposes establishing a National Center for Epidemic Forecasting and Outbreak Analytics to modernize data systems and build global early warning and trigger systems for detecting, preventing and responding to emerging biological threats. The center would develop sustainable infrastructure for data integration and IT systems at all levels of government.

Improving data quality

It is not only IT systems that need improvement, but also the data they manage.

The CDC needs up-to-date national and state data to issue the best possible guidance on social distancing, testing, contact tracing, school and business re-openings, and masking, according to the strategy.

Agencies are expected to release non-personally identifiable data in machine-readable form, and nowhere is the gap larger than when it comes to reporting on high-risk groups.

For that reason, Biden’s executive order instructs agencies to increase data collection by race, ethnicity, geography and disability while affirming the privacy of high-risk populations. HHS will optimize data collection from public and private entities when feasible.

A COVID-19 Health Equity Task Force will make recommendations for speeding up data collection on communities of color and other high-risk groups, identifying data sources and addressing challenges like data intersectionality.

The Centers for Medicare and Medicaid Services is expected to report COVID data where it can, and HHS will work with insurers, pharmacies, labs and state immunization offices to improve access to their data on high-risk populations.

The strategy assures that data won’t be shared with federal or state law enforcement or Immigration and Customs Enforcement. Rather, an executive order is expected soon directing the Bureau of Prisons and ICE to release their data on COVID spread at their facilities.

How SOCOM tapped college students to work on new tech

When the U.S. military’s special forces needed backup, they called on some college kids.

During the summer of 2020, the U.S. Special Operations Command (SOCOM) partnered with Coding it Forward, an organization with a mission to get young, civic-minded STEM students into government, on a specialized internship program. The students worked directly on projects related to predictive maintenance, automation and video indexing, getting their hands on mission-critical code and real data.

SOCOM saw such “tremendous progress” with the internship program that it plans to keep using internships as a new way to get more technology talent in the door.

“Until we are able to fix the talent pipeline … we need more,” Nicole Nemmers, a SOCOM official who helped oversee the program, told FedScoop in an interview.

Hiring young interns, training them on the job and offering them future employment has been a common practice in finding talent in the private sector. Facing the eternal challenge of attracting tech talent to the government and retaining it, SOCOM sees internships as a possible solution.

It was also new territory for Coding it Forward, which for years has placed young data scientists and software engineers in federal agencies like the Census Bureau through its Civic Digital Fellowship. SOCOM was the first military or national security-focused organization the program has worked with, and it is a partnership the group hopes to build on.

For the Pentagon, recruiting more technology talent has become a higher priority as it faces the reality that the future of warfare will be technology-driven. Leaders all the way up to the secretary level have said that harnessing artificial intelligence and other emerging technologies will be paramount to victory in the coming years, and engineers who understand the complicated tech need to be in the building.

Top cover

The partnership with SOCOM started at the top. Coding it Forward co-founder Chris Kuang met with SOCOM Commander Gen. Richard Clarke and then-Chief Data Officer Dave Spirk (now CDO of the DOD) when they visited Harvard in the fall of 2019 to discuss artificial intelligence and machine learning. Kuang, then a student, pitched the pool of STEM fellows he and co-founder Rachel Dodell had organized as a way to fill the talent gaps Clarke and Spirk faced – an idea that both longtime military officers embraced.

“We had that conversation and realized we could make that fit,” Kuang said.

After their initial meeting, Coding it Forward selected candidates from about 1,000 applicants. Six were placed with SOCOM, some in the Washington, D.C. area, and some at the SOCOM headquarters in Tampa, Fla.

The work included an AI-based predictive maintenance project for CV-22 Osprey aircraft, highly automated video indexing (HAVI) for the visual media SOCOM captures, and building business process improvement applications. While the pandemic forced some of the work and extracurricular mentoring activities to be done remotely, every intern did some in-person work because of SOCOM’s stringent security requirements.

Three of the six interns have gone on to work in the national security sector, either directly for SOCOM or with contractors, Dodell said. The others had to go back to classes in the fall.

Nemmers, who’s officially SOCOM’s chief of mission management for its Command Data Office, said many of her colleagues were excited to see young faces virtually and in-office.

“Everyone was pretty excited,” she said. “Everyone recognizes the talent we have in the space.”

The diversity of the team was an added benefit, Nemmers said, pointing to the cohort’s mix of gender and racial backgrounds as an important part of building teams, especially when it comes to working on things like AI.

Coding it Forward plans to keep sending interns to SOCOM and other parts of the federal government. SOCOM said it is already looking for more groups to partner with.

“There is a severe lack of talent,” Nemmers said. “We absolutely look to use internships as a way to hire.”

In public appearances, Gen. Clarke has also endorsed the program as a key part of the “AI-ready workforce.” The general also attended the final presentation this summer for the Coding it Forward cohort, a move those involved with the program welcomed as an indication of his support.

“We are excited by continued support from leadership,” Dodell said.

In confirmation hearing, Austin gives away little on how he would handle DOD tech as secretary

The soon-to-be confirmed nominee for secretary of Defense, Lloyd Austin, made no major pronouncements about technology policy during his confirmation hearing Tuesday, but he committed to working on overarching innovation and technology goals within the Department of Defense.

Austin, a decorated battlefield commander who ended a decades-long career in the Army as a four-star general, does not have deep experience in tech, but he showed an awareness of the DOD’s recent emphasis on those kinds of programs. He told the Senate Armed Services Committee that China was the No. 1 threat to U.S. strategic security and said he will work to ensure the DOD maintains and broadens its competitive advantage.

“We will have to employ the use of space-based platforms, we have to use [artificial intelligence],” he said. “This is not a choice, in my view.”

The expectation among military tech experts is that Austin will surround himself with the kind of tech leaders who can meet the challenges at hand, including working on new concepts like Joint All-Domain Command and Control (JADC2), where all battlefield networks will be linked and share data seamlessly with the help of artificial intelligence.

Even with good advisers and managers, though, the secretary would need to dedicate significant time and resources to thinking about technology, said Chris Brose, a former Senate Armed Services Committee staff director and current head of strategy at Anduril Industries.

“A level of technical understanding is required to call balls and strikes,” Brose told FedScoop in an interview. (Brose said he was aware that Austin has read his book “The Kill Chain” on concepts like JADC2 and the future of technology-driven warfare.)

Others shared Brose’s concerns about the need for those at the top of the Pentagon to dedicate significant time to technology modernization.

“I am anxious to see how the staff fills out in the coming months. Every nook and cranny of DOD has a stake in modernization,” Lindsey Sheppard, a fellow at the Center for Strategic and International Studies who focuses on emerging technology, told FedScoop.

Sheppard added she was not surprised by the tack of some of the questions and answers — and non-answers — during the hearing.

“Ret. Gen. Austin gave the answers I would expect given that, as he said, much of this requires working with the services and will require tough calls on competing priorities,” she added. “I would not expect him to box himself into any commitments in this particular setting at this time.”

Many project that the DOD’s overall budget will not grow, leaving a zero-sum fight for dollars between the services and forcing the secretary to make calls on how budget requests prioritize spending. The fight for money could complicate risky, ongoing modernization initiatives, like the Army’s contribution to JADC2, Project Convergence, or the Air Force’s Advanced Battle Management System, which will be the technical backbone of the larger concept.

Austin committed to working with senators on technology issues, such as acquisition reform. He agreed with some senators that the DOD needs to move away from the approach where the department sends out hundreds of pages of requirements to industry and instead opt for a more innovative, solution-driven method without the rigid requirements.

There was one thing he was clear on: China is the U.S.’s preeminent security threat and global competitor, with much of that competition taking place on the technology front.

“I think that gap has closed significantly,” he said of the difference between the U.S. and China’s military might. “Our goal will be to expand that gap.”

He referenced needing to ensure the right operational concepts are in place to meet future threats. While Austin did not mention JADC2 specifically, he did say that connected battlefield networks would be a priority and that AI was a necessity. AI will play a key role in JADC2 by helping sift through data coming from sensors across the battlefield connected in the new operating concept.

Brose and Sheppard both agreed on one move by the Biden administration that signals technology modernization action on the horizon: nominating Kathleen Hicks as deputy secretary. She brings years of experience in defense policy and keen insights on modernization.

“She is the right person,” Brose said. Sheppard and many others have echoed praise for Hicks’ experience and ability to help lead the department, especially on business practices and technology modernization.

The Senate Armed Services Committee approved Austin’s nomination Thursday. His full confirmation by the Senate requires a waiver of the rule that former officers must be out of uniform for seven years before serving as secretary of defense. That waiver must pass both the House and Senate. The House passed the waiver Thursday afternoon 326-78, with the Senate expected to vote later in the day.

CMMC requirements open door to modernize security

Andy Stewart is a Senior Federal Strategist at Cisco and advises federal partners on strategies to implement innovative cybersecurity and AI/ML solutions.

At the end of 2020, the Department of Defense finalized security requirements that the defense industrial base (DIB) must meet as part of contractual obligations to handle controlled unclassified information with defense agencies.


Andy Stewart, Senior Federal Strategist, Cisco

The trust-but-verify model, known as Cybersecurity Maturity Model Certification (CMMC), assesses contractors on a five-level scale of cybersecurity maturity — a major change from the DIB’s prior cybersecurity requirements, under which companies could self-attest their compliance.

This couldn’t come at a more critical time. The aftermath of the SolarWinds hack is a sharp reminder of the importance of securing supply chains from cybersecurity vulnerabilities. The DIB has long been targeted by malicious cyber actors, and supply chain vulnerabilities loom particularly large in a community of roughly 300,000 organizations that support the warfighter and contribute to DOD systems.

Leaders at these organizations cannot afford to view cybersecurity as an area for just the IT department to focus on. Rather, to stay ahead of threats, these organizations need to commit to well-established cybersecurity practices as a core strategy across all lines of business — and implement them from the top down.

Better protect the government’s information

CMMC constitutes an important, practical checkpoint for every organization wanting to do business with the Defense Department. It is a prescriptive, standards-based approach that requires organizations to be assessed and certified by an accredited third-party assessment organization (C3PAO). The goal is not just to ensure the security of individual suppliers, but to drive broader change across the DIB community to safeguard the security of technology and information that supports military operations.

Security threats are growing in size and complexity, and organizations are not going to be able to buy their way out of security challenges, hire enough cybersecurity professionals or purchase every vendor solution on the market to meet the growing number of threats.

At the end of the day, CMMC is about delivering trust throughout the greater supply chain security landscape — by ensuring products are trustworthy. It extends trust down to the device and software. And it gives organizations the means to verify that their products are free of counterfeit parts or that the software running on them hasn’t been corrupted.

Using a platform-based approach to security can give organizations deep visibility into their enterprise and enable them to implement the required security controls and processes. This approach embeds tools into the enterprise network, at scale, that are designed to work together to support the streamlined implementation of CMMC capabilities and processes.

A security partner to ensure compliance and greater visibility

Organizations can sometimes fall into the trap of meeting compliance guidelines without getting the full benefit of their security tools. Therefore, as leaders look to meet CMMC requirements, it’s important to understand that it’s not just about securing technology, but rather how the technology supports and enables the implementation of mature processes.

Cisco recently partnered with the Cyentia Institute to conduct a study which surveyed nearly 5,000 IT, security and privacy leaders from across 25 countries to gain a clearer picture about where organizations are in their security journey.

The study examined what organizations are doing to meet security objectives, drawing from security standards such as the NIST Cybersecurity Framework. According to the results, organizations wanting to maximize the overall success of their security programs ideally start with a modern, well-integrated technology stack.

The survey looked at 275 possible combinations of security practices and their resulting outcomes — 45 percent showed a significant correlation. In all, the results indicated seven practices that best contribute to key security outcomes: a proactive technology refresh, well-integrated technology, timely incident response, prompt disaster recovery, accurate threat detection, program performance metrics and the use of effective automation.

As the industry leader in zero trust, and from the breadth and depth of customers we serve, Cisco is uniquely positioned to help organizations excel in these practices, and implement an open, standards-based platform to achieve visibility and security controls that can meet or exceed CMMC requirements.

We work rigorously with leading technology partners to help ensure and attest to the security of products throughout the supply chain. And because our products are so prevalent throughout the DOD and global infrastructure, we implement very rigorous value chain security on our products to ensure that they are genuine. That’s why we hold the No. 1 position on Gartner’s Supply Chain Top 25 for 2020.

Add that to how our solutions work together — providing an open, standards-based platform that easily integrates with existing capabilities in a corporate enterprise — and you get an unequaled approach to providing products with embedded trustworthy technologies that help ensure security throughout the entire supply chain.

Mitigating cyber risk proactively

The study also makes it clear that in addition to gaining greater visibility of the supply chain, enterprises must also devote constant attention to detecting and remediating threats as they’re happening.

Cisco understands the threat environment better than anybody. We contribute to that understanding by sharing threat information with the entire community via Cisco Talos — our threat intelligence and vulnerability research organization at the center of our security portfolio.

Because the Cisco Security system covers email, networks, endpoints and everything in between, Cisco Talos provides more visibility than any other security vendor in the world. Cisco Talos understands the threat landscape — providing powerful intelligence to the DIB.

Meeting the requirements laid out under CMMC may prove challenging for many organizations in the DIB. Taking advantage of Cisco’s unique expertise, however, can help not only reduce those challenges, but also help underpin the security of the whole supply chain as well.

Learn more about how Cisco is helping the defense industrial base meet DOD security requirements.