DOD clears path for first assessor to enter CMMC market
The Department of Defense’s cyber inspectors approved the first company to become a certified assessor for the department’s new contractor cybersecurity standards, clearing a critical hurdle in the process.
The DOD’s Defense Industrial Base Cybersecurity Assessment Center (DIBCAC) approved the first company, which was not named, to move forward in the Cybersecurity Maturity Model Certification (CMMC) process, a spokesperson told FedScoop. Now, it is up to the CMMC Accreditation Body (CMMC-AB) to grant the company Certified Third Party Assessment Organization (C3PAO) status, meaning that it can officially assess the maturity of defense contractors’ cybersecurity in compliance with new CMMC requirements.
“[W]e can say the first C3PAO has been certified by the agency. Keep in mind, the certification process is multi-tiered and [Defense Contract Management Agency’s] role is to verify and validate the ability of a C3PAO to protect the data that will be entrusted to them,” Matthew Montgomery, spokesperson for the DCMA, the agency that houses DIBCAC, told FedScoop.
The initial approval of the unnamed company is a critical milestone for the CMMC program, as many have worried that there won’t be enough accredited C3PAOs to conduct CMMC assessments fast enough to meet DOD’s target of auditing all 300,000 companies in the defense industrial base over the next several years. Come fiscal 2026, the DOD will have CMMC requirements in all contracts.
The CMMC model is a tiered system with five levels of cybersecurity maturity that all defense contractors will be tested against once every three years. The DOD has said that most contractors will only need a level one assessment, but many expect level three, which is equivalent to the current standard for handling controlled unclassified information, to be more common than the department anticipates.
Under CMMC, companies can no longer self-attest to meeting cyber requirements. Accredited assessors will need to evaluate and test their systems and policies against the new CMMC standards.
“If you do the math on that…how is that feasible?” Johann Dettweiler, director of operations for TalaTek, a prospective C3PAO, said in an interview in April. He added: “There is…a little bit of a log jam.”
At least for now, part of that log jam appears to be lifting, but many more assessment organizations are awaiting their initial assessment from the DIBCAC.
But many cybersecurity companies have found the rules on policy documentation during the initial assessments to be too strict. That could spell trouble in the future if cybersecurity experts have trouble meeting the standards. And if it took the DIBCAC months to clear the first company, assessments for companies with less mature network defenses could take even longer.
“You have to be able to show that you have the policies and that you have been living the policies, and that last part is really tricky,” said Jim Goepel, a former CMMC Accreditation Body member and the CEO of Fathom Cyber.
Air Force working on an App Store for IT
The Air Force says it’s making huge leaps and bounds in acquiring enterprise IT services that could help move missions forward — but not everyone who could be using the tech knows about it.
To market the Air Force’s new IT services better, Chief Information Officer Lauren Knausenberger said she is working to build a one-stop shop like Apple’s App Store or Amazon.com to list products and services that have an authority to operate (ATO) across the service’s enterprise.
Products like Tableau’s data visualization software, cloud offerings and others will be presented on a website where offices across the force can click and buy.
“There are actually some really great services [available] today, but it requires many, many meetings and phone calls and in-person interactions to help people understand what those services are,” Knausenberger said at Cisco’s FedFWD 2021 Summit produced by FedScoop. “At this point, they are mature enough that I should be able to go to a website, click on it and buy it.”
The current state requires airmen to sift through contracts, contact contracting officers and even go through another round of the ATO process, which can take weeks, even if a service is already cleared. It’s a frustrating process both for those seeking to use tech and those, like Knausenberger, who help buy it.
She said that with a “single storefront” for tech, the department could save time and money.
“We are very excited about this and I think it will streamline a lot for us,” Knausenberger said.
The initiative is a part of a broader effort by the Air Force to share more of its successes and market the products and services available to airmen. The storefront will also share some of the internal technology the Air Force has built with its Platform One, Cloud One and Kessel Run teams.
“Sometimes even when you solve problems in the digital transformation realm, it is really hard to tell people that you have solved them,” Knausenberger said. “We are not marketing organizations.”
The Air Force now has cloud capabilities through Cloud One that reach the secret level of security. Some legacy systems have even started migrating to the secret cloud, the CIO said. Similarly, its Platform One team created a DevSecOps platform that has a continuous ATO that allows teams of airmen to craft secure code that will be authorized for use from the get-go.
Steve Harris leaving Dell after 23 years to join ed-tech firm Ellucian
After more than two decades with Dell Technologies, Steve Harris is leaving the tech giant for a C-level leadership role at a well-known higher education IT firm.
Dell announced internally Thursday that Harris will be replaced by Jim Kelly, vice president of Dell’s Department of Defense and Intelligence business.
Harris will join Ellucian on May 21 as chief revenue officer. Ellucian provides cloud software to power higher education institutions’ digital applications and platforms across campus. Best known for its enterprise resource planning and student information systems platforms, Ellucian has more than 2,700 higher education customers in more than 50 countries. The company’s software serves more than 26 million students.
“Having served the education sector for most of my career, I am acutely aware of the challenges facing many institutions, as well as the opportunities that result from implementing a modern technology strategy,” Harris said. “I look forward to working with the entire Ellucian team to lead higher education’s digital transformation in support of improved user experiences and student outcomes.”
“With more than two decades of leadership at Dell in public sector and education in the U.S. and global markets, Steve brings a strong background in digital transformation from edge to the core and cloud ideally suited for Ellucian’s rapid growth and the goals of our customers,” said Laura Ipsen, president and CEO of Ellucian. “Steve’s personal passion for higher education, expertise in cultivating deep customer and partner relationships, and a reputation for creating high-performing, diverse sales cultures made him the ideal leader for Ellucian.”
Since 1998, Harris has taken on a number of leadership roles with Dell, mostly focused on building the firm’s business with public sector organizations, including educational institutions. Most recently, he’s served as senior vice president and general manager of public sector.
“It was an incredible honor to work at Dell for 23 years and a privilege to serve and support our most valued customers, the U.S. public sector,” Harris told FedScoop.
During that time, Harris has overseen Dell’s shift from being known as a purely hardware-focused business to serving as a partner to public sector organizations, providing end-to-end, cloud-based digital transformation solutions. For instance, Dell was part of the team that won the Department of Defense’s massive Defense Enterprise Office Solutions (DEOS) cloud contract in 2019. He also worked through the merger of the Dell and EMC federal government businesses into Dell Technologies.
Harris is a winner of multiple FedScoop 50 awards and has been recognized on FedScoop’s Best Bosses in Federal IT list.
Biden cyber executive order reignites push to cloud, zero trust
Zero trust security is no longer just an option for federal agencies.
The Biden administration issued a long-awaited cybersecurity executive order Wednesday that, among other things, requires federal agencies to develop an implementation plan for a zero-trust architecture for security.
This mandate falls under a larger push to modernize federal cybersecurity in the wake of the recent cyberattacks that have compromised federal agencies through the exploitation of software made by contractor SolarWinds and flaws in Microsoft’s Exchange software.
“The Executive Order helps move the Federal government to secure cloud services and a zero-trust architecture, and mandates deployment of multifactor authentication and encryption within a specific time period,” reads a fact sheet about the order. “Outdated security models and unencrypted data have led to compromises of systems in the public and private sectors. The Federal government must lead the way and increase its adoption of security best practices, including by employing a zero-trust security model, accelerating movement to secure cloud services, and consistently deploying foundational security tools such as multifactor authentication and encryption.”
Within 60 days, agency heads must update their existing plans “to prioritize resources for the adoption and use of cloud technology” and issue a new plan on moving to zero trust, in line with National Institute of Standards and Technology (NIST) guidance.
On top of that, the Office of Management and Budget will work over the next 90 days with the Department of Homeland Security and General Services Administration to develop and issue a federal cloud-security strategy and guidance.
And, within 180 days, civilian agencies will need to “adopt multi-factor authentication and encryption for data at rest and in transit, to the maximum extent consistent with Federal records laws and other applicable laws.”
Modernizing federal cybersecurity is just one element of the larger EO. It also calls for increased sharing of threat information between the government and private sector, and for the development of baseline software supply chain security standards for any software sold to the federal government.
“The current market development of build, sell and maybe patch later means we routinely install software with significant vulnerabilities into some of our most critical systems and infrastructure,” a senior Biden administration official told reporters. “The cost of the continuing status quo is simply unacceptable.”
Additionally, the order calls for the creation of a national Cybersecurity Safety Review Board, akin to the National Transportation Safety Board, and the creation of a playbook for responding to cybersecurity incidents. With that, the administration orders agencies to “employ all appropriate resources and authorities to maximize the early detection of cybersecurity vulnerabilities and incidents on its networks” through improved endpoint detection and response measures.
Democrats on the House Homeland Security Committee applauded Biden’s executive order.
“Cybersecurity is a national security issue, and we commend the Administration for prioritizing it that way. From the SolarWinds supply chain attack that gave Russian actors access to Federal networks to the Colonial Pipeline ransomware attack that temporarily shut down 5,500 miles of gas pipeline, cyber attacks jeopardize our national and economic security,” said Reps. Bennie G. Thompson, D-Miss., and Yvette D. Clarke, D-N.Y. “If nothing else, the cyber incidents that have occurred over the past six months have demonstrated that bold action is required to defend our networks today and in the future. The Executive Order signed by the President today is just that.”
NSF wants to predict and prevent the next pandemic with AI
The National Science Foundation wants researchers to develop artificial intelligence systems capable of forecasting pandemics like the ongoing COVID-19 pandemic.
Under the program, dubbed Predictive Intelligence for Pandemic Prevention (PIPP), NSF has already funded and held four workshops since February scoping the research that still needs to be done to create the necessary algorithms.
PIPP’s goal is to detect disease outbreaks early enough to limit transmission and prevent epidemics and pandemics like the current one from ever occurring.
“It is clear that this will not be the last pandemic humanity will face,” Katharina Dittmar, program director of the Environmental Biology Division at NSF, said during a media briefing Wednesday. “We all recognize from the past year, from our experience that our approach to this continued challenge must really evolve away from crisis response during discrete outbreaks toward a sustained and integrated cycle of intelligent prediction, preparation, response and recovery.”
Intelligent prediction requires sustained investment, multidisciplinary teams, and cross-agency and international partnerships to continuously collect the necessary data. But it also requires new AI models for identifying the biological and sociological drivers of pathogen emergence, Dittmar said.
No “canned” algorithm exists that can do that, said Mitra Basu, program director of the Computing and Communication Foundations Division at NSF.
“New research has to be done in this space before one can confidently say, ‘Yes, this AI algorithm is predicting the right thing and providing us with explanations why it is doing that, instead of just a black box,'” Basu said.
That’s why PIPP convened workshops not only for computer scientists but also for medical researchers, biologists, and social and behavioral scientists. Predictive analytics are already underway in those spaces, so it’s not as if NSF is starting from scratch modeling and applying AI to pandemics, said Arthur Lupia, the agency’s assistant director of social, behavioral and economic sciences.
Still, NSF officials didn’t provide a timeline for when PIPP algorithms capable of forecasting pandemics would be operational.
“It’s often very difficult to pinpoint the day on which you’re going to make the key discovery,” Lupia said. “But the one thing that only NSF can do is accelerate those timelines so that we get the results that we really need, the insights we need sooner rather than later.”
Congress plans to keep a close eye on bloat in Space Force
The Space Force should stay as lean as possible while ensuring it onboards tech talent, key members of Congress warned in the past week.
While the White House has yet to issue a full defense budget request, the Space Force is already getting warnings to not add any bureaucratic bloat to its spending, Rep. Anthony Brown, D-Md., who sits on the House Armed Services Committee, said Wednesday.
“What we don’t want to see in the Space Force is a burgeoning headquarters and a Fourth Estate,” he said during a virtual Center for Strategic and International Studies event.
The Fourth Estate is the Department of Defense’s support agencies that are not part of the military services, like the Defense Information Systems Agency and Defense Contract Management Agency. Critics often point to these agencies as an easy target of defense budget cuts, and Brown gave the Space Force a preemptive warning not to start growing its own.
Brown did, however, extend his general support for the Space Force, given the advancement in space technology and the potential for conflicts in the domain.
Brown’s comments follow those from the top Democrat on the House Appropriations Defense Subcommittee, who told Space and Air Force leaders Friday they need to hurry up in filling top acquisition and tech roles. The Department of the Air Force houses the Space Force.
“[W]hile progress has been made on the operations side, progress in addressing long-standing acquisitions issues has been disappointing so far,” Rep. Betty McCollum, D-Minn., said during a subcommittee hearing. “Too often over the past two decades, the space acquisitions programs have been delivered late, over budget, and sometimes billions of dollars over budget.”
That disappointment extends to issues before the Space Force started. The Government Accountability Office found space-based missile warning satellites running nine years behind schedule and $15 billion over budget. It’s an example of what McCollum wants to avoid in the future, she said.
“GAO also found in March 2019 that key software-intensive space programs often did not effectively engage users to understand requirements and obtain feedback,” McCollum said.
McCollum described the efforts she has seen so far as only “minor tweaks around the edges” and not the wholesale, ground-up reform the Space Force has promised. She added that senior civilian leadership focused on space acquisition is a must, which Chief of Space Operations Gen. John Raymond and acting Secretary of the Air Force John Roth both agreed with.
“We have got to go faster in modernizing our space capabilities and delivering capabilities and putting them in the hands of the warfighter,” said Raymond during the hearing.
The force recently published a strategy to become a “digital service,” where it would leverage technology in all of its operations. The strategy even pitched the idea of allowing guardians, as Space Force service members are called, to be “digital nomads” working remotely instead of being chained to a desk.
Army looking for tech to help segment data
There are two types of data in the Army: Data that needs to move at the speed of milliseconds and data that doesn’t.
Figuring out what data fits under which category and how to best partition a network to meet the speed demands of that data is a challenge the Army is looking to private industry for help with. The No. 2 for the Joint Staff’s J-6 command, control and communications office said the Army wants to find technologies that can help it segment different types of data across networks to save precious bandwidth in conflict zones.
“We are also concerned about overwhelming a limited network, particularly in a denied, degraded, intermittent or limited environment,” Army Brig. Gen. Rob Parker, deputy director of Joint Staff J-6, said during an AFCEA DC virtual event. “We are looking to industry to help us find technological solutions to work through that.”
An example of the type of data that needs to move fast is information relating to targeting. The military has been trying to find tech to help ensure bombs and other weapons hit their targets.
The Army is working to build a “data fabric,” a system of systems that data can flow across. It’s a multi-pronged challenge that the service hopes will enable faster, multi-domain operations by having data from machines in the air talk directly with machines on the ground, sea, space or cyberspace.
The Army’s tech backbone of its multi-domain operation strategy is Project Convergence, the connect-everything-to-everything, sensor-to-shooter networking project that itself is a part of the larger military-wide Joint All Domain Command and Control (JADC2) concept of operations. It’s a nesting doll of technical systems and acronyms, but the essential challenge is to find ways to connect and use more data in operations.
The specific challenge Parker spoke about is ensuring that once platforms are connected through a data fabric, bandwidth and other limited resources are properly used.
“We need to figure out how we are sending the right data and only the necessary data,” Parker said.
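One common way to frame that problem is priority-class scheduling over a constrained link: tag each message with a latency class and fill each transmission window with the most time-critical traffic first. The sketch below is purely illustrative; the class names, sizes and bandwidth budget are invented and not drawn from any Army design.

```python
# Illustrative sketch: fit the most time-critical messages into a
# limited transmission window, deferring lower-priority traffic.

# Hypothetical latency classes: lower number = more time-critical.
PRIORITY = {"targeting": 0, "command": 1, "logistics": 2, "telemetry": 3}

def schedule(messages, budget_bytes):
    """messages: list of (kind, size_bytes, payload) tuples.

    Returns (sent, deferred): payloads that fit in this window,
    in priority order, and payloads pushed to the next window.
    """
    queue = sorted(messages, key=lambda m: PRIORITY[m[0]])
    sent, deferred, used = [], [], 0
    for kind, size, payload in queue:
        if used + size <= budget_bytes:
            sent.append(payload)
            used += size
        else:
            deferred.append(payload)  # wait for the next transmission window
    return sent, deferred
```

In a denied or degraded environment, the same idea applies per link: the budget shrinks, and only the traffic that genuinely needs millisecond delivery makes the cut.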
What the military itself is doing to help answer that question is drawing up new cloud architectures and data management policies. The Army’s “nirvana” for cloud is a “poly-cloud environment” that provides continuous enterprise support and tactical-edge capabilities.
It’s a part of the unified network plan to integrate cloud and networking capabilities championed by the Army’s top uniformed IT official Lt. Gen. John Morrison, deputy chief of staff for the G-6. The architecture design of that network will be coming in the summer, Morrison said.
DOD Deputy Secretary Kathleen Hicks also recently released new “Data Decrees” that aim to implement the data strategy. Parker said those decrees and future policies from the deputy secretary and chief data officer will inform future standards and technical directives the J-6 and other Joint Staff offices will be putting out.
“In those, we will see reflections of those data decrees,” he said.
The pandemic accelerated digital transformation in Washington — what’s next?
For federal agencies, the COVID-19 pandemic served as a springboard opportunity to jump-start digital transformation — either scaling the transformative efforts they already had underway or rapidly pivoting during the crisis to innovate, catch up and maintain operations.
Now, as the world returns to some semblance of normalcy and agencies look to operate beyond the pandemic, the key will be to sustain that innovation and transformation at scale by creating a culture that fully embraces the shifts that occurred over the past year, said Carl De Groote, area vice president of U.S. federal for Cisco.
Ahead of Cisco’s FedFWD Summit on Thursday, De Groote spoke with FedScoop about what federal agencies should be focusing on in the post-pandemic time ahead and how they can double down and emerge from this tumultuous period as truly digital organizations.
Cisco is a big believer in the power of the platform to drive that transformation.
“Whether it be collaboration to bring expertise together with a citizen that has questions or bring the military together to collaborate around their high-value problem sets to protect our nation, whether it’s helping an analyst and national security find that needle in a haystack, platforms allow technologists and CIOs to extend those capabilities into workflows and processes, to take advantage of quicker access to data, converting it to information,” De Groote said.
Platforms are all about being “extensible, rapidly deployable, rapidly usable, and able to produce rapid results as it relates to work,” he said.
Cisco’s Webex is one of those platforms that came to the rescue during the pandemic and will likely play a key role across the federal government as agencies continue to work in a hybrid format split between in-office activity and personnel working remotely.
For Congress, De Groote said, Webex allowed the House and Senate to continue conducting hearings and writing legislation when it became unsafe to do so in-person.
“It extended the capabilities for the green and red light to be able to pass time back and forth between colleagues, to bring together all of our lawmakers, congressmen and women and senators to be able to be present, to take on topics, to learn, to vote, being able to poll and conduct things that they used to in a physical state rely on, to bring that experience to an online platform and mechanism,” De Groote said.
That’s key for the military as well, he said. “If you think about command and control in a secure environment, how do they do that successfully where they can rely on the fact that collaboration is secure? That the content they’re sharing remains protected? So, now it’s really looking at the different problem sets, the different processes and keep on extending and driving that innovation.”
And the security of the platform can’t be thought about after the fact, De Groote said. “It’s got to be integrated, it’s got to be built into the design so that as our customers experience our platforms and conduct their work, they have the confidence that it’s going to be secure all the time.”
De Groote hopes that attendees of Thursday’s FedFWD Summit will come away with “the art of what’s possible,” he said. “How can technology serve the government and its mission, and understand that it’s not only an opportunity to deploy technology and platforms, it’s also the opportunity to reshape policy so that we can administer and deploy new capabilities and embrace change.”
DOD takes automation a step further with machine learning
Automating tasks has long been a goal of large workforces, and none is larger than the Department of Defense.
With financial management systems that process a more than $700 billion budget annually, getting a helping (digital) hand can reduce wasted labor hours and costly mistakes. But simple automation sometimes is not enough to help solve more complex challenges, like pairing unmatched transactions in databases.
That’s why the Defense Innovation Unit teamed up with the Joint Artificial Intelligence Center to inject a boost of machine learning so that robotic process automation (RPA) can approach more complex tasks, like finding mislabeled and unmatched transactions.
“The DOD has been using RPAs for several years to help fix [unmatched transactions], but RPAs are based on simple ‘if-then-else’ cases where most of the [unmatched transactions] require more sophisticated analysis, which up to now have required manual intervention,” said Eric Dorsey, a DIU program manager in the AI portfolio.
The specific problem of unmatched funds wastes millions of labor hours each year, according to the DOD. When dollars are misclassified or unmatched, humans need to go into the spreadsheets to correct the record, often a roughly two-hour job per unmatched data point. With the Army’s two million unmatched transactions per year, that’s millions of hours spent on tasks DIU is proving robots can easily do.
Most RPA bots are limited to small, repetitive tasks where operations are explicitly written into software, limiting the program’s flexibility in uncertain scenarios. But using the massive amount of financial data from the DOD’s systems, machine learning is now being used to help advance RPA’s flexibility to solve problems, saving time and money, according to the department.
“After the machine learning front end creates [a] candidate correction, it is routed to an RPA for the final database update,” Dorsey said in an email to FedScoop.
Machine learning finds knowledge within data, learning from massive data sets to find patterns and turn them into predictable algorithms. The data generated through financial systems provide ample information for machine learning to work through and improve RPA’s ability to close the loop on certain tasks, like matching unmatched funds.
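The department hasn’t published its model details, but the general pattern described above — a learned scorer proposing candidate matches and handing only high-confidence corrections to a bot — can be sketched simply. Everything below (the field names, scoring weights and 0.8 threshold) is hypothetical, not DOD’s actual logic.

```python
# Illustrative sketch: score candidate matches for an unmatched
# transaction and route high-confidence ones to automated correction.
from dataclasses import dataclass
from datetime import date

@dataclass
class Transaction:
    txn_id: str
    amount: float
    posted: date
    vendor: str

def match_score(unmatched, candidate):
    # Crude hand-tuned similarity: exact amount, same vendor, close dates.
    score = 0.0
    if abs(unmatched.amount - candidate.amount) < 0.01:
        score += 0.6
    if unmatched.vendor.lower() == candidate.vendor.lower():
        score += 0.3
    score += max(0.0, 0.1 - 0.01 * abs((unmatched.posted - candidate.posted).days))
    return score

def propose_correction(unmatched, ledger, threshold=0.8):
    """Return (candidate_id, score); candidate_id is None below threshold."""
    best = max(ledger, key=lambda c: match_score(unmatched, c))
    s = match_score(unmatched, best)
    # High-confidence candidates go to the RPA bot; the rest stay
    # in a queue for human review, as with today's manual intervention.
    return (best.txn_id, s) if s >= threshold else (None, s)
```

The threshold is the key design choice: anything the model isn’t sure about falls back to the existing manual process rather than risking a wrong automated database update.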
DIU paired contractors DataRobot with the Army and Summit2Sea with DOD’s Office of the Comptroller. Both are developing their machine learning platforms within Advana, the government’s secure data management cloud, according to DIU.
Time and money saved
A solicitation for the problem was initially posted in May 2020 with the request to find a machine learning platform that “will identify and suggest corrections to business processes that are not limited to previously well-defined business logic methods.”
The two pilots do not yet cover 100 percent of the unmatched transactions, but they currently handle a “high volume” of use cases.
“We are currently achieving a high level of accuracy with these use cases,” Dorsey said.
It’s also happening quickly. Because DIU is able to work with commercial and non-traditional vendors by navigating around the traditional contracting morass the department usually goes through, there haven’t been the typical lengthy requests for proposals filled with stringent requirements and government legalese — just problem statements and requests for solutions.
Working with the JAIC, DIU helped save time and effort in building out the machine learning algorithms.
“The collective team identified the unmatched transaction use case to be highly suitable for automation while delivering significant mission impact,” Bryan Lane, chief of business and health transformation with the JAIC, said.
GSA leads rise in automation projects governmentwide
The General Services Administration has saved about 50,000 labor hours in 2021 alone by automating work.
On top of that, a dozen machine learning and artificial intelligence projects are in the pilot or developmental phase, while four more are fully operational, according to an agency spokesperson.
The projects are part of GSA’s “eliminate, optimize or automate” effort over the last two years, one that’s only speeding up over time, the spokesperson said.
“We expect that the velocity of AI/ML adoption will accelerate similar to our [robotic process automation] program over the next few years,” the spokesperson said. “The various pilots and projects are on different deployment timeframes but cover all our primary mission areas including Public Buildings Service, Federal Acquisition Service, finance, IT and HR.”
One such project is the Solicitation Review, which uses supervised ML to predict whether federal IT solicitations posted to beta.SAM.gov are compliant with Section 508 of the Rehabilitation Act, which lays out IT accessibility requirements. The tool helps GSA employees more efficiently review solicitations and reduces the risk of noncompliance.
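GSA hasn’t said which model Solicitation Review uses, but the supervised-learning setup it describes — predicting a compliance label from solicitation text — can be sketched with a tiny Naive Bayes classifier. The labels, training snippets and whitespace tokenizer below are all hypothetical stand-ins.

```python
# Illustrative sketch: a minimal Naive Bayes text classifier of the
# kind that could flag solicitations as 508-compliant or not.
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def train(labeled_docs):
    """labeled_docs: list of (text, label) pairs."""
    counts = {"compliant": Counter(), "noncompliant": Counter()}
    totals = Counter()
    for text, label in labeled_docs:
        counts[label].update(tokenize(text))
        totals[label] += 1
    return counts, totals

def predict(text, counts, totals):
    vocab = set(counts["compliant"]) | set(counts["noncompliant"])
    best, best_lp = None, float("-inf")
    for label in counts:
        lp = math.log(totals[label] / sum(totals.values()))  # prior
        n = sum(counts[label].values())
        for tok in tokenize(text):
            # Laplace smoothing so unseen tokens don't zero out a class.
            lp += math.log((counts[label][tok] + 1) / (n + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

A production version would of course use richer features and far more labeled solicitations, but the workflow is the same: humans label past solicitations once, and the model then triages new postings so reviewers focus on the likely violations.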
A second automation project is a virtual assistant that provides employees with IT self-help capabilities.
GSA doesn’t rely on one procurement method for AI services, instead using a number of contracts to encourage competition and equity among small and disadvantaged businesses. Such contracts are made available to other agencies as well.
Automation spans a number of technologies including ML, natural language processing, chatbots and RPA — the last of which is often the lowest-hanging fruit for agencies. The State Department, Social Security Administration, U.S. Patent and Trademark Office, Department of Labor, Army, Air Force, and Navy are among the agencies that have RPA programs.
A big reason automation projects are on the rise governmentwide is the Federal RPA Community of Practice (CoP) and its voluntary leadership team within GSA, said Jim Walker, chief technology officer at UiPath, during a recent ACT-IAC event.
The government-only user group launched in 2019 and has grown to 69 member agencies and about 1,200 attendees on monthly calls.
In November, the CoP issued a State of Federal RPA report — the first detailed review of the technology across government. The report found a 110% increase in deployed automations between fiscal 2019 and 2020.
Additionally, the report found a 195% increase in capacity hours created.
The CoP created a maturity model for agencies to gauge their RPA progress and saw a 70% increase in Level 4 projects, which went from zero to five between fiscal 2019 and 2020.
While only 23 agencies participated in the first report, that number should grow with enthusiasm for RPA.
GSA Chief Financial Officer Gerard Badorrek recently oversaw a 100-day, industry-wide challenge to create RPA solutions that improved the experience for agencies submitting budget justifications. A total of 10 RPA solutions came out of the event, and 12 employees were trained in the technology.