EIS team developing network-as-a-service offering

The Enterprise Infrastructure Solutions program team is developing a network-as-a-service (NaaS) offering to free agencies from the “vicious lifecycle” of replacing outdated technologies, said Allen Hill, the deputy assistant commissioner of category management at the General Services Administration.

NaaS will move agencies to a cloud business model allowing for continuous modernization of network infrastructure, similar to what’s been done for email and collaboration tools, Hill said during an ACT-IAC event Monday.

Agencies will be able to adopt emerging technologies faster from startups and build out and manage their networks without worrying about integrating new security components and software.

“There are some agencies that are going to be challenged,” Hill said. “They have a large amount of legacy inventory, and we’ve provided a number of options for them to consider to ensure there’s no break in services.”

The federal government is in a “better place” with its transition to the $50 billion network and telecommunications modernization contract, EIS, than it’s been previously, he added.

Out of 212 forecasted EIS task orders, 164 have been released to industry. A total of 93 task orders have been awarded with 55 completed and 48 more awards expected soon.

Of the 17 large federal agencies, nine have awarded all task orders, and 11 of 25 midsize agencies awarded all theirs.

The EIS team has begun its transition closeout project focused on limited and authorized users of the Networx, Washington Interagency Telecommunications System (WITS) 3, and Local Service Agreement (LSA) contracts that expire May 31, 2023.

Agencies’ EIS transitions consist of two parts: transitioning off the legacy contracts and moving all network services onto appropriate EIS task orders.

About 40% of the National Oceanic and Atmospheric Administration’s spend is already on commercial contracts.

“Once we get through the legacy contracts, we’ll be moving those commercial contracts over as appropriate onto EIS task orders,” said Jeff Flick, deputy director of the Service Delivery Division within NOAA’s Office of the Chief Information Officer.

NOAA is “pushing hard” to meet GSA’s 18-month deadline for transitioning but needs risk-mitigation contracts in place as stopgaps for any services still lagging behind, Flick added.

While agencies structure their own task orders and requirements, GSA is willing to help however it can when vendors fail to deliver on time.

“The one thing at GSA, we don’t necessarily have insight into what those specific task order requirements an agency has,” Hill said. “The agencies can certainly reach out to us, and we’ll work with them and help them to facilitate any type of challenges they may be having with the vendors.”

New bottleneck emerges in DOD’s contractor cybersecurity program, concerning assessors

Companies in line to become certified assessors for the Department of Defense’s supply chain cybersecurity program are facing a new roadblock: getting and passing an assessment of their own.

There’s a bottleneck in licensing assessors under the DOD’s Cybersecurity Maturity Model Certification (CMMC), according to multiple organizations waiting to go through the process. It not only frustrates these companies that are waiting to enter a potentially lucrative market but also threatens to complicate the timeline for implementing a critical DOD cybersecurity program.

The CMMC program requires every contractor in the defense industrial base to hire a licensed assessor to inspect its networks, something that cannot be done if there are no fully licensed assessors to hire.

“There is … a little bit of a logjam,” Johann Dettweiler, director of operations for TalaTek, a prospective Certified Third-Party Assessor Organization (C3PAO), said in an interview.

TalaTek is slated to get its required assessment from the DOD’s Defense Industrial Base Cybersecurity Assessment Center (DIBCAC) this spring if all goes well with those in line before it. But Dettweiler was concerned that might not be the case after learning on a call with other C3PAOs that they were having difficulty meeting CMMC level three, the mid-tier level of security required in the DIBCAC assessment.

Four people familiar with the matter who asked not to be named said they also were told initial audits were difficult and taking longer than expected; one source directly familiar with the matter pointed to the maturity documentation associated with level three as what was tripping some up.

“You have to be able to show that you have the policies and that you have been living the policies, and that last part is really tricky,” Jim Goepel, a former CMMC Accreditation Body member and the CEO of Fathom Cyber, said in an interview.

The CMMC Accreditation Body, the organization that issues licenses to C3PAOs and oversees other parts of the CMMC ecosystem, announced in March that one assessment had been completed but did not share the results or name the company. More than 100 companies are cleared to get their assessment and hundreds of others are awaiting their initial background check and training from the AB.

The AB had little comment on the bottleneck besides expressing its steadfast support for the current requirements and saying the “CMMC-AB is on target for projections” to meet demand. DOD and DIBCAC did not return multiple requests for comment.

In public comments, Katie Arrington, the chief information security officer for acquisition and sustainment, has defended the need for the level three requirement for assessors, saying it’s a security imperative.

“Why? Because they’re going to be the ones processing your company’s information and inputting that into the only place that your company’s information will be stored,” in DOD’s own database called eMASS, Arrington said last week at an Amazon Web Services summit.

Cascades of bottlenecks

The C3PAO bottleneck is more than an inconvenience for would-be assessors: accrediting them is a critical step in the multi-step process on which the success of the CMMC program depends. The end goal of having third-party verification of the cyber standards of the roughly 300,000 contractors in the defense industrial base relies on having those third parties available to do the verification. Without enough assessors and C3PAOs, the entire ecosystem could fall short of its stated goals.

The DOD has given the program five years to get its feet under it — after that, CMMC will be a requirement in all defense contracts. The AB, which is largely responsible for rolling out the program, says it remains on track to meet the DOD’s timeline. But Dettweiler and others have concerns about the potential downstream effects from the current pace of C3PAO accreditation.

“If you do the math on that … how is that feasible?” Dettweiler said of getting the number of companies certified by C3PAOs on time.

Matt Titcombe, CEO of Peak InfoSec and chief information security officer of its parent company Gigit, added a strong “no” on whether he thinks the program is on track to meet the eventual demand for CMMC assessments.

“I don’t know if we are even going to get one done this year,” he said, adding that he thinks the current timeline is based on a “perfect world” scenario.

Many offered potential fixes, such as making DIBCAC’s initial assessment of C3PAOs less stringent by allowing assessors to submit plans for achieving compliance instead of demanding mature compliance upfront. Others have urged DOD and the CMMC AB to issue clearer written policy on the scope of assessments.

“You’ve got to make this upfront huge investment without having any potential business from the DOD at all,” Max Aulakh, CEO of Ignyte Assurance Platform, said of the current state of security requirements.

Small assessors, bigger concerns

For small assessors, the concerns are even more acute. With only five people on his team, Steven Senz, CEO of Ascertis Solutions, knows he would have to hire more people just to meet the level three security maturity requirements. He said in an interview he is hoping to subcontract with larger C3PAOs to be able to do assessments in the future.

Senz said he wished there were more regular, official communication from DOD and the AB about the program’s requirements and policies. Had he known more when the initial application payment, a $1,000 fee, was due, he might have chosen a different path for his new company.

“Pay $1,000 in, then another $3,000 to show you can be CMMC level three, but you never really put a disclaimer in that unless your company is of a certain size, don’t bother to apply,” he said of the information provided by the AB when he initially applied. “You took my money and I’m not certain under the criteria now you are imposing on C3PAOs I am actually going to be able to get through all the gates.”

Sustaining telehealth services with data intelligence

The use of electronic telehealth platforms by public and private health care organizations has become essential to delivering care remotely during the pandemic. They’ve also demonstrated a promising way to deliver health care at scale — especially as telehealth adoption gains wider acceptance by patients and doctors.

However, the underlying technology required to make telehealth a sustainable, long-term solution depends on both data intelligence tools and security controls being properly deployed, according to Ann Mehra, strategic healthcare programs leader at Splunk and a former associate director at Massachusetts General Hospital. Telehealth goes well beyond videoconferencing. It requires assembling real-time data — at scale — to facilitate end-to-end communications and workflow.


Read the full report.

“Health care presents a special challenge in gathering information, both because of compliance regulations as well as the stakes involved in making clinical decisions — all of which demand that IT systems and security controls be properly deployed, up to date and monitored continuously,” she says in a new FedScoop report, underwritten by Splunk.

Mehra describes how health care organizations initially raced to scale up virtual appointments. The result, however, was a swift realization that their systems weren’t configured to handle the added volume — in some cases, services ground to a halt.

The report cites an example of the new volume demands from the U.S. Department of Veterans Affairs, which had already launched a live-video consultation platform in the summer of 2017. In fiscal year 2019, the VA conducted a little over 2 million “episodes” of telehealth care. Weekly telehealth visits then grew from roughly 10,000 in February 2020 to 120,000 by that May, according to VA figures cited in the report.

This level of behind-the-scenes technical support demonstrates the need for a modern and secure platform to manage capacity, interoperability, security and risk, the report argues.

Mehra shares how Splunk is working with its health care partners to more fully assess the security and reliability of their IT ecosystems.

Through Splunk IT Service Intelligence, the company gives organizations of all sizes the ability to “attain end-to-end visibility of the services operating on their networks in real time; identify abnormalities and perform root cause analysis; and perform automated remedial actions to streamline incident resolution,” she says.

The report describes Splunk’s “data-to-everything” platform, which allows organizations to ingest machine data, non-machine data, and structured and unstructured data and turn it into actionable insights. “As we ingest more of this historical data, we then can start to apply AI and machine learning… and predict over time how the system will perform,” says Mehra.
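The idea of learning a system’s normal behavior from historical data and flagging deviations can be sketched in a few lines. This is an illustrative toy, not Splunk’s implementation; the data, window size, and threshold are all invented for the example.

```python
# Illustrative sketch only: a toy version of the idea Mehra describes --
# learning a baseline from historical metrics, then flagging points that
# deviate sharply from it. Window size and threshold are invented.
from statistics import mean, stdev

def flag_anomalies(series, window=5, z_threshold=3.0):
    """Flag indices whose value deviates sharply from the trailing-window baseline."""
    anomalies = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and abs(series[i] - mu) / sigma > z_threshold:
            anomalies.append(i)
    return anomalies

# Hourly call-failure counts (synthetic); the spike at the end stands in
# for the kind of capacity problem the report describes.
failures = [12, 14, 11, 13, 12, 15, 13, 12, 14, 190]
print(flag_anomalies(failures))  # only the final spike is flagged
```

A production system would replace the trailing-window z-score with a learned model, but the workflow — ingest history, establish a baseline, surface deviations — is the same.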

The platform ultimately helped one of Splunk’s customers decrease teleconference call failures from thousands per day to fewer than 10. It also improved bandwidth and call capacity from 50% to nearly 100%. And it exposed unknown interoperability gaps and extended visibility into remote network user access to improve communications.

That enhanced level of assurance will become more essential as health care agencies and providers continue to embrace telehealth as a model for delivering care more efficiently.

“It becomes very important to not just have telehealth for the sake of telehealth; but having all of the backend applications and technologies necessary to have a productive encounter, just as you would if you were in a clinic,” she says.

Learn more about how Splunk can help your agency better prepare for the data demands of today’s modernized health care systems.

This article was produced by FedScoop and StateScoop and sponsored by Splunk.

Enabling new air, space and satellite capabilities through the cloud

When the Department of Defense was tasked with creating the new U.S. Space Force, few people offered more experience to lead the planning and implementation effort than Maj. Gen. Clint Crosier, a 33-year Air Force veteran. Crosier had headed numerous high-profile assignments, including as director of Space and Intelligence Programs in the Office of the Under Secretary of Defense, and chief of long-range strategic planning for the Air Force. Now he is taking that experience to AWS’s Aerospace and Satellite business.

Crosier has built a team of aerospace and satellite experts to bring the power of cloud computing to an industry that itself is undergoing its own revolution. In this exclusive interview, Crosier talks about how the aerospace and satellite industry is leveraging cloud computing, and how AWS is helping organizations get “to the stars, through the cloud.”

FedScoop: What are some of the broad challenges you’ve seen where the scale and power of cloud computing and analytics proved critical to advancing the mission?


Clint Crosier, Director, Aerospace and Satellite Solutions, AWS

Crosier:  In my experience, teams often encounter some common challenges that can make it difficult to successfully complete their mission. In aerospace, these challenges or barriers might include anything from the upfront costs associated with building infrastructure, to the time and cost associated with engineering work, modeling, simulation, and testing for new satellites, launch vehicles, or designs.

As we look to the future, moving necessary processes like these to the cloud is going to be a game changer for defense as well as commercial aerospace applications because it has the potential to save significant time and expense. Digital engineering, digital testing, and digital modeling and simulation using the cloud will allow you to do all of this in a much more efficient way.

In the satellite industry, a huge challenge continues to be the ability to rapidly and reliably downlink, store and manage the vast amounts of data that operators are capturing every day. Consider that operators are capturing high-resolution satellite imagery that amounts to petabytes every single day. There is simply no way to analyze and share such massive volumes of raw data quickly and efficiently without the help of the cloud. And as satellite operators continue to grow their constellations, the amount of data will grow, too.

And then there are the challenges associated with space exploration. NASA JPL, for instance, is using the AWS cloud for mission-critical communication and transfer of telemetry data in support of its Perseverance rover mission on Mars.  The Mars Rover team is receiving hundreds of images from Mars each day from a record number of cameras, resulting in thousands of images over Perseverance’s time on the planet. By using AWS, NASA JPL is able to process data from Mars, on Earth, faster than ever before. The increased processing speed is helping NASA JPL scientists to plan the rover’s next day activities. The increased efficiency will allow Mars 2020 to accomplish its ambitious goal of collecting more samples and driving longer distances during the prime mission, compared to previous rovers.

Simply put, the cloud is removing barriers that traditionally have held back the space industry and is helping to redefine the art of the possible.

FedScoop: There has been something of a renaissance in how the commercial aerospace industry has brought more affordable and technically advanced solutions to launching payloads, including satellites, into space. How are advances in cloud computing enabling those gains?

Crosier: We’re probably at the most exciting and significant inflection point in the space industry since the original Apollo days, back in the 1960s. Back then, the U.S. government would design, acquire, build, launch, operate and sustain all of its own systems. This was because one: nobody else in the world could do it. And two: security needed to be built in to protect those systems.

The industry growth that we are seeing today is creating enormous opportunities for companies of all sizes. AWS has extensive experience helping commercial and government customers and partners design satellites, and conduct space and launch operations. Our AWS Aerospace and Satellite team was established last year to directly support these customers and their long-term goals.

In order to support these customers and their space missions, we know that a flexible and secure cloud computing environment is essential. At AWS, security is a top priority and has been from day one. We did a lot of listening in the early days to really understand the challenges of our federal customers and show that we deliver security that is second to none.  AWS has been a proven partner to the federal government for years, and government agencies trust us to handle their most sensitive workloads.

FedScoop: How do you see cloud computing playing a larger role in supporting space infrastructure and the growing mesh of satellites in orbit?

Crosier: AWS Ground Station is a fully managed service that lets customers downlink data and send satellite commands across multiple regions with speed and agility — and at a low cost, paying only for the satellite time they use. We’ve found that companies can save up to 80% of their ground station infrastructure costs by using AWS Ground Station.

As the number of satellites in orbit continues to grow, operators will need to be able to increase the rates at which they deliver high-precision data to the people who need it most. The more we make cloud-based solutions available in near real-time, the more we will see companies develop new and exciting ways to use that data.

I think we will also continue to see the cloud playing a larger, more valuable role by supporting autonomous activities, allowing certain tasks to be done without a human in the loop. Artificial intelligence and machine learning can help to automate many tasks including data analysis, space traffic management and collision avoidance. With so many satellites in orbit, automating capabilities like these and delivering results to end users more rapidly are things that can only be done using the cloud.

FedScoop: Moving a bit back down to Earth, how do you see cloud computing playing a larger role in suborbital space?

Crosier:  That’s a great question. People talk about the air domain and the space domain as though they’re interconnected; that it’s a singular domain where you just move from one to the other. But it’s not easy. We in the space industry are always cognizant that they behave in two very different ways.  However, we have to transit between those domains interchangeably. For instance, we have aerospace operations — drones, doing intelligence in different parts of the world — that are taking their cues and information from satellites in space.

Boom Supersonic is a wonderful example. Boom is developing a new generation of supersonic aircraft and is using the cloud to perform digital design and engineering, stress tests, high-performance compute modeling and simulation. They estimate achieving a six-times increase in productivity operating on the AWS cloud versus running these simulations in an on-premises environment.

Here’s the amazing piece: Boom tells us that they have consumed 53 million compute hours on the AWS cloud; and they will double that over the next two years, in order to complete design and testing of their aircraft design. Boom is demonstrating how you can build an aircraft entirely on the cloud. It’s just a powerful example of where the aerospace industry can really benefit.

FedScoop: Looking ahead, what’s on your radar that excites you most about how the public is likely to benefit from these cloud-enabled advances in the aerospace industry? And what’s perhaps the biggest challenge you see?

Crosier: We see a number of exciting things that our customers are doing to advance the global good that really can only be executed on the cloud – like environmental protection, climate change monitoring, or disaster response activity.  I’ll give you two examples.

Fireball International, one of our customers located in Australia, is using space data in the infrared spectrum to monitor and detect new wildfire breakouts within three minutes of ignition. Being able to respond in this way can only be achieved by using AWS cloud capabilities and our global infrastructure.

Another example is a company called Digital Earth Africa. They use high resolution imagery from space to focus relief efforts on the continent of Africa. There are ways you can detect what’s happening in patterns of life from space, such as where crops are not getting enough water and where there’s the risk of famine. They estimate that as satellite resolution improves, they’ll also be able to improve decision-level intelligence 800 times faster than what they could achieve before they moved to the cloud.

Exciting capabilities like these are possible because of the global infrastructure that AWS provides, coupled with the ability of the cloud to process large data sets more quickly and efficiently. I believe the cloud will become an indispensable foundation to the aerospace industry. Government and commercial customers alike will only stay competitive and relevant if they operate in the cloud.

Watch this story featuring Major General Crosier and Astronaut Peggy Whitson on how AWS is helping astronauts, scientists and everyday heroes make the future of space a reality.

Read more insights from AWS leaders on how agencies are using the power of the cloud to innovate.

This article was produced by FedScoop and underwritten by AWS.

Technology Modernization Fund use may be worked into the FITARA scorecard

Agency and industry experts want the House Government Operations Subcommittee to consider grading agencies on their use of the Technology Modernization Fund (TMF) as part of the Federal Information Technology Acquisition Reform Act (FITARA) scorecard.

While not every agency has applied for or received some of the $1 billion injected into the TMF last month, those that do could be rewarded on the FITARA scorecard in some way, said Kevin Walsh, director of IT and cybersecurity issues at the Government Accountability Office, during a subcommittee hearing Friday.

The suggestion comes as the subcommittee considers how to continue evolving the FITARA scorecard and worry mounts over just how quickly the TMF Board intends to approve IT modernization projects.

“There’s probably a bit of concern that if the $1 billion doesn’t get consumed, what does that mean for the future of the TMF,” Joe Flynn, public sector chief technology officer at tech company Boomi, told FedScoop. “To that end, one of the things I think you’re going to really see is they’re going to start to look at maybe a scoring on agencies and how they’re actually taking advantage of the TMF.”

Rep. Gerry Connolly, D-Va., who chairs the subcommittee, said he hopes GAO is monitoring the TMF Board’s criteria for project approvals. He also introduced a bill called the Performance Enhancement Reform Act.

If passed, the bill would require agencies to include IT modernization investments, system upgrades, staff technology skills and expertise, and stakeholder feedback in their annual performance plans.

“To determine the scope and feasibility of IT modernization [chief information officers] must be more involved in agency performance planning,” Connolly said.

Agencies should “absolutely” be planning how to secure or eliminate their oldest systems, Walsh said.

“The TMF will help modernize agencies’ legacy systems if the funding is used effectively,” Walsh said. “The challenge is going to be ramping up that team that manages the TMF to make sure that they have the expertise necessary to oversee these projects.”

An evolving scorecard

The Department of Labor was the only agency to receive A grades in six of seven categories on the latest FITARA scorecard and one of a few agencies to receive two TMF awards. Used in conjunction with the department’s working capital fund and IT modernization appropriations, the money helped digitize its Temporary Labor Certification Program in January, yielding $2 million in annual savings.

“I shudder to think what would’ve happened to that printing operation during COVID-19,” said Gundeep Ahluwalia, CIO at DOL.

Agencies’ FITARA scorecard grades have continued to improve despite the removal of easy As, like the software licensing metric, from the last scorecard.

Some federal officials have criticized the scorecard for its changing expectations as a result.

“It is a bit of a moving target, but you have to think of the technology landscape as a moving target,” Flynn said. “The idea of these five-year strategic roadmaps don’t exist anymore because the speed of technology is moving so quickly.”

The subcommittee may expand the scorecard’s cyber category in light of the SolarWinds hack, start grading agencies’ implementation of the Federal Data Strategy or evaluate artificial intelligence use, he added.

“Softer” areas like how well agencies serve citizens, human capital skills and gaps, and IT acquisition cadres and strategic sourcing remain difficult to measure, but the subcommittee could begin scoring federal websites’ accessibility in accordance with Section 508 of the Rehabilitation Act, Walsh said.

“The subcommittee will continue to evolve the scorecard in ways that facilitate tracking improvement over time, while adding new metrics as necessary to raise the bar on what is needed across the federal enterprise,” Connolly said.

DISA’s Dave Bennett to retire at end of April

Long-time Defense Information Systems Agency leader David Bennett will retire from the agency April 28, DISA announced Thursday.

Bennett will finish his career as director of operations, a job he has had since 2016. Before that, he served as chief information officer, director of implementation and sustainment and director of enterprise services.

It’s unclear who will replace Bennett or what his next move will be. Bennett served in the Army, rising to the rank of colonel before retiring from uniformed service.

Bennett helped push DISA to adopt more analytics in its assessments of networks and operations, saying during a 2016 FedScoop-produced event that analytics were the key to harnessing the power of big data.

“We are seeing real-world scenarios right now, both in performance and cyber, that are kind of scary,” Bennett said. “We too often, without analytics, shoot in the dark.”

As a workforce leader he focused on empowering others to help achieve DISA’s mission of securing the Department of Defense’s networks, he said.

“Mentoring and empowerment are both critical to enable personal and professional growth for your workforce,” he previously told WorkScoop.  “Leaders invest their time in helping others identify opportunities for growth and professional development through career management advice and feedback. Empowerment builds two-way trust and facilitates personal growth.”

USCIS automating pre-processing of immigration cases

U.S. Citizenship and Immigration Services is focused on automating functions that will help pre-process immigration cases for adjudication, according to CTO Rob Brown.

Natural language processing helps harvest names for adjudicators and flag potential fraud when applicants’ stories don’t align, machine learning (ML) combs biographic and biometric data to identify people with USCIS benefits, and network analytics make connections regarding their relationships and employers, Brown said.
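The network-analytics piece Brown describes — connecting applicants through shared relationships and employers — can be illustrated with a trivial grouping step. This is a toy sketch only; all records, field names, and the matching rule are invented, and USCIS’s actual tooling is far more sophisticated.

```python
# Toy sketch of the network-analytics idea: link applicants who list a
# shared attribute such as an employer. All data and field names here
# are invented for illustration.
from collections import defaultdict

def connections_by_employer(applications):
    """Map each employer to the set of applicant IDs that listed it,
    keeping only employers that connect more than one applicant."""
    by_employer = defaultdict(set)
    for app in applications:
        by_employer[app["employer"]].add(app["applicant_id"])
    return {emp: ids for emp, ids in by_employer.items() if len(ids) > 1}

apps = [
    {"applicant_id": "A1", "employer": "Acme Corp"},
    {"applicant_id": "A2", "employer": "Acme Corp"},
    {"applicant_id": "A3", "employer": "Globex"},
]
print(connections_by_employer(apps))  # {'Acme Corp': {'A1', 'A2'}}
```

Real network analytics would build a full graph across many attributes and score the strength of each link, but the core move — surfacing non-obvious connections from structured case data — is the same.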

New tools will dissect supporting evidence related to immigration cases, making it easier for adjudicators to make decisions to award people benefits like green cards.

“Now we start to think about a lot of that pre-processing of adjudication really up front, as opposed to it being manually done or swivel chaired at an adjudicator’s workstation or workstations,” Brown said during an AI in Government event. “So providing a lot of that information upfront.”

Computer vision and optical character recognition will be used to validate documents and classify evidence, so adjudicators can click on what they want rather than sort through everything manually.
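Once OCR has turned a scanned document into text, even a simple keyword tally can route it into an evidence category. The sketch below is purely illustrative; the categories and keywords are invented examples, not USCIS’s taxonomy, and a real system would use a trained classifier rather than keyword counts.

```python
# Illustrative only: classify OCR'd document text into an evidence
# category by keyword frequency. Categories and keywords are invented.
EVIDENCE_KEYWORDS = {
    "employment": {"employer", "salary", "offer letter", "w-2"},
    "identity": {"passport", "birth certificate", "driver license"},
    "financial": {"bank statement", "tax return", "income"},
}

def classify_evidence(ocr_text):
    """Return the category whose keywords appear most often in the text."""
    text = ocr_text.lower()
    scores = {
        category: sum(text.count(kw) for kw in keywords)
        for category, keywords in EVIDENCE_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unclassified"

print(classify_evidence("Attached: offer letter from employer, salary $90,000"))
# → employment
```

The payoff for the adjudicator is the routing itself: evidence arrives pre-sorted into buckets they can click into directly.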

Identity-proofing technologies like mobile verification, along with sentiment analysis, are proving more challenging, Brown said.

“We, I feel, need industry experts and assistance in looking at what does this mean from a privacy perspective and abating some of the challenges therein,” he said. “What does this mean from a security perspective?”

Identity validation presents a number of cyberattack vectors when doing something as seemingly benign as verifying photos or videos of people.

Presentation-layer, man-in-the-middle, and backend and data poisoning attacks are all possible.

“Simple things like Avatarify and even TikTok technologies have creeped in,” Brown said. “So I feel this is an area we need a lot of help with.”

Brown also hopes to deal with ML and artificial intelligence “sprawl” by consolidating toolsets and platforms to provide a more robust continuous integration/continuous delivery (CI/CD) pipeline.

Proper experimentation on algorithms, done in a way that accounts for security and for how the algorithms are shared, is also important, Brown said.

USCIS is still trying to solve the problem of data bias by automating algorithms to filter out biased data, audit pipelines and flag where data quality issues persist, Brown said.
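One concrete form a bias audit can take is flagging groups that are badly under-represented in a dataset before it feeds a model. The sketch below is a minimal illustration of that idea only; the threshold, field names, and data are all invented, and it stands in for just one check among the pipeline audits Brown describes.

```python
# Sketch of a simple representation audit: flag groups whose share of a
# dataset falls below a threshold. Threshold, field names and data are
# invented for illustration.
from collections import Counter

def underrepresented_groups(records, field, min_share=0.10):
    """Return, sorted, the group values whose share of the dataset
    falls below min_share."""
    counts = Counter(r[field] for r in records)
    total = sum(counts.values())
    return sorted(g for g, c in counts.items() if c / total < min_share)

sample = (
    [{"region": "north"}] * 48
    + [{"region": "south"}] * 47
    + [{"region": "west"}] * 5
)
print(underrepresented_groups(sample, "region"))  # ['west']
```

Running checks like this automatically at each pipeline stage is what turns a one-off review into the continuous auditing the article alludes to.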

Brown hopes to see more adaptive automated services embedding customer and adjudicator personas before 2025.

VA to pause rollout of new EHR sites during review

Officials told lawmakers the Department of Veterans Affairs will stop the rollout of its modernized electronic health record system at new sites while the administration conducts a “strategic review” of the $16 billion program.

The VA officials said they want time to examine any potential problems with the EHR at the Mann-Grandstaff VA Medical Center in Spokane, Washington — the first facility to bring the system online. Prescription mix-ups and delays in care have led to lawmakers and others calling for the VA to pause its work to fix issues that could harm patients.

“The strategic review covers a full range of program areas, including productivity and clinical workflow optimization, a human-centered design effort to understand what veterans want to see from VA’s patient portal and a sandbox environment that will allow employees at future implementation sites to conduct interdisciplinary, team-based rehearsals of these workflows in the new EHR solution,” Dr. Carolyn Clancy, VA’s acting deputy secretary, told the House Veterans Affairs Technology Modernization Subcommittee.

VA Secretary Denis McDonough first announced the review with few details late last month. At the time of his announcement, the review was said to last 12 weeks.

The decade-long EHR modernization program will migrate VA’s patient records to a Cerner Millennium-based cloud platform that comes with an all-new user interface for clinicians. It’s a massive overhaul that will replace much of the front and back ends of the VA’s current health IT system. The new system is designed to be completely interoperable with the Department of Defense’s version, which the DOD has already launched at several military hospitals with fewer publicly known issues.

The Government Accountability Office in February recommended VA pause the EHR’s rollout, a request the department was initially lukewarm on. But the VA appears to be heeding the watchdog’s call now.

The next center slated to get the new system is in Columbus, Ohio, but given the pause, it’s unclear when that launch might happen. The timeline of the EHR’s site launches has been delayed several times before, due to the need for more training and the transfer of resources during the early days of the pandemic.

Ann Dunkin picked to be Energy CIO

Ann Dunkin will return to federal service as a CIO, this time at the Department of Energy.

Dunkin will soon be tapped to take over the Energy CIO role, which has been vacant since Rocky Campione left government earlier this month, sources close to the matter told FedScoop.

She comes to the job after spending the past 15 months as Dell Technologies’ CTO for state and local government, building off of her three-year tenure prior to that as the CIO of Santa Clara County.

Before her time focused on state and local government, Dunkin served as CIO of the Environmental Protection Agency during the latter years of the Obama administration. Based on that work, she was called upon recently to serve as a member on the Biden-Harris transition team working with the EPA.

Shortly after the 2020 election, Dunkin penned a report with her former EPA CTO colleague Greg Godbout on how the Biden administration should think about scaling IT modernization and innovation across government, namely through the leadership of the General Services Administration.

Energy officials did not respond to FedScoop’s request for comment prior to publication.

Matt Cutts to depart as USDS administrator

Matt Cutts announced he’s stepping down as U.S. Digital Service administrator in a Medium post on Wednesday.

Deputy Administrator Edward Hartwig will fill the role in an acting capacity until a new administrator is appointed.

The changing of the guard comes as USDS receives additional funding, pursues new agency partnerships and looks to hire — all as it scales its operation modernizing government services and making them more accessible.

“USDS was created to provide private sector technologists an opportunity to serve their government for a short period of time,” Cutts wrote in his post. “This year, in addition to those that joined the civil service permanently, we’ve seen an impressive number of alumni return to serve their government a second time.”

When Cutts joined in 2016, two years into USDS’s existence, he intended to stay for only three to six months.

Now the agency seeks engineers, designers, product managers, acquisition strategists, and policy experts to continue its work. That work includes supporting the Centers for Disease Control and Prevention during the COVID-19 pandemic, streamlining financial relief, improving the immigration and refugee processes, aiding students with their loans, and reforming procurement and federal hiring.

USDS grew to a team of about 180 people and a network of 500 alumni under Cutts while becoming a farm system for federal chief information officers and chief technology officers.

“The team has created and deployed tools to help better fulfill the promises we’ve made to our veterans. We’ve digitized the naturalization process and reimagined hiring across the federal government,” Cutts wrote. “We began supporting states in building more responsive systems for the millions of Americans who rely on them.”