DISA takes over cloud office from DOD CIO
The Department of Defense’s Cloud Computing Program Office will be fully absorbed into the Defense Information Systems Agency, moving away from the DOD Office of the CIO.
The move is mostly administrative, a DOD spokesperson told FedScoop. The physical office will remain in Arlington instead of being moved to Fort Meade, and no personnel impacts will result from the change.
The most substantive change will be “operational control” over the office, which manages programs like the controversial $10 billion Joint Enterprise Defense Infrastructure (JEDI) cloud acquisition.
“This structural makeup will afford CCPO the opportunity to integrate into and participate fully in DISA’s governance bodies and will allow CCPO to fulfill one of its critical mission tenets, which is to provide combat support, with respect to cloud computing capabilities, to the Department,” the DOD spokesperson told FedScoop about the move.
The transition further consolidates DISA as the IT service provider to DOD, a move it has been making alongside its work to consolidate support agency networks under its control.
Vice Adm. Nancy Norton, DISA’s outgoing director, first announced the move earlier in January at an AFCEA NOVA event. Norton said her agency was “very excited” about the move to be able to bring together offices that work to implement the cloud modernization strategies.
“It pulls together the rest of the cloud strategy,” she said.
The move will likely be completed by the end of the month. When all the administrative moves are taken care of, CCPO will have its own “DISA Center.”
USDS pilot to hire data scientists at 10 agencies closed in less than 2 days
A joint hiring announcement for data scientists issued by 10 agencies lasted less than 48 hours before the 500-applicant limit was reached Tuesday evening.
Now subject matter experts (SMEs) and human resources specialists will set about assessing whether each applicant qualifies for one of the 63 openings across the agencies.
The U.S. Digital Service spent the last year-and-a-half on 10 SME Qualification Assessment (SMEQA) pilots across 17 agencies, and now use of the hiring strategy is being expanded. SMEQAs reduce the need for lengthy resumes in favor of shorter, industry-style ones that are three pages maximum. SMEs review resumes and written assessments and conduct structured interviews depending on how the particular SMEQA is structured.
“A government action like this, even though it’s a heavy lift for all of us, is so much better than having each individual agency have to do their own action with their own HR staff and their own SMEs — especially for agencies that might not have a huge pool of SMEs,” Amy Paris, product manager at USDS, told FedScoop. “This allows them to rely on other, larger agencies and get that expertise across government.”
USDS developed the announcement that went live Monday with help from the Office of Personnel Management, which ensured federal hiring regulations were followed, and the Federal Chief Data Officers Council, which helped recruit agencies, SMEs and applicants.
The agencies are accepting talent from the private sector as well as through merit promotion inside government while preserving veterans’ preference.
“A cross-agency posting like this is a great way to appeal to a larger group of talent,” said Nick Sinai, former U.S. deputy chief technology officer in the Obama administration. “Using actual data scientists in government to help screen talent, in partnership with HR, is a promising approach that I’m hopeful the Biden administration will accelerate.”
Applicants only had to apply once to be considered for all of the data scientist openings, which fall within the General Schedule-13 and -14 pay grades. And applicants certified through the SMEQA who consented can still be hired by other agencies for up to a year if they don’t land a job upfront.
In this case, USDS is testing some new techniques for screening applicants including a multiple-choice questionnaire with some technical questions. Everything USDS did was cleared by OPM.
“These are established policies and procedures,” Paris said. “And we’re just helping agencies take advantage of the flexibilities and rules that they already have available to them.”
USDS intends to analyze data it’s collecting throughout the process and will evaluate how comfortable agencies are with the applicants they select, as well as the hires’ progress over time. Some of the agencies involved are discussing the possibility of creating cross-agency data science cohorts among the new hires.
The 10 participating agencies are: the Census Bureau; Consumer Financial Protection Bureau; Equal Employment Opportunity Commission; General Services Administration; State Department; Department of Transportation; Treasury Department; U.S. Agency for International Development; U.S. Department of Agriculture; and Department of Veterans Affairs.
“A lot of agencies are interested in improving their data science capabilities,” Paris said. “This is an emerging field, and government has a huge repository of data. How do we use it to best serve the American people?”
CISA issues recommendations to strengthen cloud security
As remote work becomes a more permanent practice, the country’s top cybersecurity agency is warning that poor cyber hygiene can make an organization’s cloud service configuration ripe for adversarial attacks.
An analysis report released Wednesday by the Cybersecurity and Infrastructure Security Agency outlines security practices for organizations that use cloud services, drawing on incident reports of recent successful cyberattacks.
“These types of attacks frequently occurred when victim organizations’ employees worked remotely and used a mixture of corporate laptops and personal devices to access their respective cloud services,” the report reads. “Despite the use of security tools, affected organizations typically had weak cyber hygiene practices that allowed threat actors to conduct successful attacks.”
The analysis was not explicitly tied to the SolarWinds Orion software compromise, though CISA has been assisting agencies and other affected organizations with the fallout from that attack.
The most common attack types include phishing and brute force login attempts. Once inside the system, the malicious actors redirected emails to their own accounts, searched for sensitive keywords and set up systems to prevent legitimate users from seeing phishing warnings. In one incident, attackers used stolen session cookies to bypass multi-factor authentication protocols.
While organizations are always at risk for these types of attacks, remote-work practices such as forwarding emails from a professional account to a private one and accessing the corporate system on an easily hacked home network increase vulnerabilities.
CISA offered 21 recommendations for organizations to strengthen their cloud security practices. They included establishing a baseline for normal network activity, reviewing user-created email forwarding rules, enforcing multi-factor authentication and creating blame-free employee reporting of suspicious activity. CISA also offered four additional recommendations for users of Microsoft Office 365, whose suite of cloud-based products was caught up in the SolarWinds breach.
The Microsoft-specific recommendations include setting a limit for unsuccessful login attempts and using tools to investigate and audit breaches.
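To make one of those controls concrete, the sketch below shows a minimal unsuccessful-login limit of the kind CISA recommends. It is an illustrative sketch only: the threshold, lockout window and function names are assumptions, not settings drawn from CISA’s report or from Microsoft Office 365.

```python
# Minimal sketch of an unsuccessful-login limit.
# The threshold, lockout window and names are illustrative assumptions,
# not taken from CISA's guidance or any specific product.
import time
from collections import defaultdict

MAX_FAILURES = 5            # assumed threshold before lockout
LOCKOUT_SECONDS = 15 * 60   # assumed lockout window

_failures = defaultdict(list)   # username -> timestamps of recent failed attempts

def record_failed_login(username: str) -> None:
    """Record the time of a failed login attempt for a user."""
    _failures[username].append(time.time())

def is_locked_out(username: str) -> bool:
    """Return True once a user exceeds the failure threshold within the window."""
    now = time.time()
    recent = [t for t in _failures[username] if now - t < LOCKOUT_SECONDS]
    _failures[username] = recent                   # drop stale entries
    return len(recent) >= MAX_FAILURES

# Example: after five rapid failures, further attempts should be rejected.
for _ in range(5):
    record_failed_login("alice@example.gov")
print(is_locked_out("alice@example.gov"))          # True
```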
VA combatting shadow IT by having options ‘readily available’
Agencies can avoid unmanaged shadow IT creeping into their operations by offering customers a bevy of easy-to-use options, says Dominic Cussatt, the Department of Veterans Affairs’ principal deputy chief information officer.
“You have to give your customers options. If they don’t feel like they’re getting serviced properly from the central IT function, they’ll go find their own way. Because they’ve got a mission to execute,” Cussatt said Thursday during the Data Cloud Summit produced by FedScoop.
Shadow IT refers to the use of non-approved software or systems, which can open up huge security risks.
In 2017, FedScoop conducted a survey that found a majority of federal IT personnel use shadow IT to perform their jobs because government-issued devices often don’t support the applications they need. And according to Cussatt’s experience at the VA, that seems to still ring true.
Cussatt said the VA is reorienting its development and operations approach around portfolios of services that customers can then shop from. The idea is that these portfolios are ready to deploy, checked out from a security standpoint and with buys already in place.
“That ease of access helps them and helps them avoid seeking other options,” he said.
The VA is also utilizing a Systems-as-a-Service platform as a solution to shadow IT.
Customers can access and shop for things like a customer relationship management tool or call center option and then use their own funds to access it. Even though it’s an outsourced service, the VA will have already checked it for security and interoperability.
Biden calls for ‘most ambitious effort ever’ to modernize federal IT, cybersecurity
President-elect Joe Biden will make federal IT modernization and cybersecurity top priorities during the early days of his administration — second only to COVID-19 response, it seems.
In a fact sheet circulated Thursday ahead of Biden’s speech announcing his American Rescue Plan to respond to the COVID-19 pandemic and economic crisis, his transition team declared that the administration “will provide emergency funding to upgrade federal information technology infrastructure and address the recent breaches of federal government data systems.”
“This is an urgent national security issue that cannot wait,” the fact sheet says, positioning the nation’s state of cybersecurity as another crisis it’s facing in addition to the pandemic. “The recent cybersecurity breaches of federal government data systems underscore the importance and urgency of strengthening U.S. cybersecurity capabilities. President-elect Biden is calling on Congress to launch the most ambitious effort ever to modernize and secure federal IT and networks.”
Under this plan, Biden’s team wants to give the Technology Modernization Fund (TMF) $9 billion to help launch integral shared IT and cybersecurity services across government. Additionally, Biden wants to change the fund’s reimbursement structure, which has been a key critique of the TMF since its creation and a reason it has struggled to receive appropriations from wary lawmakers.
As it exists, TMF recipients must pay their “borrowed” funds back within five years. This plan suggests a change to that, which would make it more suitable for “more innovative and impactful projects,” the fact sheet says.
Biden’s plan also budgets $200 million to boost the hiring of cybersecurity experts “to support the federal Chief Information Security Officer and U.S. Digital Service,” $300 million for “no-year funding” to support the General Services Administration Technology Transformation Services (TTS) team, and another $690 million to help the Cybersecurity and Infrastructure Security Agency (CISA) “bolster cybersecurity across federal civilian networks, and support the piloting of new shared security and cloud computing services.”
As Biden is inaugurated Jan. 20, the federal IT community will await the announcement of the leaders who will put these proposals into action. This week, the Biden team announced Jason Miller, deputy director of the National Economic Council in the Obama White House, as the pick to be deputy director for management in the Office of Management and Budget. The people eventually hired in the Federal CIO and CISO roles will report to Miller.
JADC2, but for medics: Military looks to link medical networks, data
The military’s push to connect its networks is not limited to the battlefield. New requests to emerging technology developers show it is taking the same approach to improving medical data and networks.
A consortium of medical enterprise technology developers has been asked to develop prototypes to link medical data, software and situational awareness across different parts of the military, from research labs to tactical responders. The solicitation adds another health data interoperability project to the ongoing major electronic health records overhaul programs the DOD is working on.
The request “aims to establish prototypes with the ability to provide commanders at all levels with visual understanding of how medical capabilities are arrayed throughout the operational environment,” the Medical Technology Enterprise Consortium wrote in a pre-announcement. That visibility will come from linking the disparate networks and software medical personnel in the military use.
The language and overall approach appear to mirror other major network modernization initiatives, such as Joint All-Domain Command and Control — the new operating concept the military will rely on to be able to link networks together and create a common data architecture for commanders to control troops and weapons in air, land, sea, space and cyber conflicts, all at once. The similarity here is creating networks that work together, not just within their own domain or digital stovepipe.
The DOD currently anticipates spending $5.8 million on a prototype, with more money potentially available for follow-on work.
GSA wants to be an ‘end-to-end digital entity’
The General Services Administration isn’t a “true, end-to-end digital entity” yet, but it plans to be, Chief Information Officer David Shive said Wednesday.
GSA intends to take the digital assets and tools it has created and ensure they’re always operating as efficiently as possible by reengineering agency business processes, Shive said.
That way GSA can continually improve systems and services throughout the acquisition life cycle, rather than simply patting itself on the back for going paperless.
“We’ve digitized the vast majority of the intellectual assets of GSA, but a lot of those are first gen,” Shive said during an ATARC virtual event. “They’re basically digital copies of existing paper processes.”
GSA recently established shared service product lines for cloud and identity and credentialing services, and has more on the way. The product lines allow for better system integration, which in turn improves security and data exchange, said Beth Killoran, deputy CIO at GSA.
The agency also wants to improve its cybersecurity tools by augmenting them with artificial intelligence for predictive analytics, Killoran said.
GSA aims to procure such AI using its new fourth-generation, enterprisewide Infrastructure Capabilities and IT Operations contract. Structured differently than its predecessors, the contract is designed to reduce operational costs for reinvestment and enable GSA to partner with new, innovative contractors to aid the agency’s digital push.
The agency is a “heavy” user of AI in “predictable” places like cybersecurity, where threat detection is easily automated, with around 25 projects being piloted or in the minimum viable product or production stage, Shive said. Another 15 AI projects are in the pipeline.
Currently 51% of GSA workloads are in its cloud and 22% in off-premises managed services, with the remaining 27% on premises.
“We will continue to look for targets to move to the cloud,” Shive said. “But we will also be smart about it.”
The highest-value IT spend isn’t always the cloud, he added, and GSA will remain a hybrid environment for the foreseeable future.
The public sector’s path to modern customer identity and access management
The COVID-19 pandemic has created both motivation and opportunity for federal agencies to interact with citizens in new ways. The growing demand for digital experiences means that identity and access management has become critical to the government’s success in delivering services.
To meet the high expectations of today’s users, reduce development time of digital experiences and eliminate potential security gaps, government agencies need to put user identities front and center.

Read the full report.
In order to accomplish this, public sector enterprises are employing modern customer identity and access management (CIAM) solutions, according to a recent Okta report.
The report describes the ability of CIAM tools to “securely capture and manage a user’s identity and profile data and control their access to applications and services.” This allows agencies to centralize access management across multiple applications and offer a frictionless user experience that is more secure and highly scalable.
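To illustrate the centralization the report describes, the sketch below shows one common CIAM pattern: each application validates an access token issued by a shared identity provider instead of maintaining its own user store. This is a hedged sketch using the PyJWT library; the issuer URL, audience and scope name are placeholders, not details from the Okta report or any agency deployment.

```python
# Illustrative sketch of centralized access management with OIDC-style tokens.
# Applications trust tokens minted by one shared identity provider rather than
# managing their own credentials. Issuer, audience and scope are placeholders.
import jwt                      # PyJWT
from jwt import PyJWKClient

ISSUER = "https://idp.example.gov"                 # hypothetical identity provider
JWKS_URL = f"{ISSUER}/.well-known/jwks.json"       # hypothetical published key set
AUDIENCE = "https://benefits-portal.example.gov"   # hypothetical application

def validate_access_token(token: str) -> dict:
    """Verify the token's signature and standard claims against the shared IdP."""
    signing_key = PyJWKClient(JWKS_URL).get_signing_key_from_jwt(token)
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=AUDIENCE,
        issuer=ISSUER,
    )

def is_authorized(claims: dict, required_scope: str = "records:read") -> bool:
    """Check a scope claim; the scope name is an assumption for illustration."""
    return required_scope in claims.get("scope", "").split()
```

Because every downstream application verifies tokens against the same published key set, user management stays in one place while access decisions happen locally in each app.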
The report outlines examples of how divisions under the U.S. Air Force, State Department and Centers for Medicare & Medicaid Services have all implemented modern identity and access solutions to streamline user experience and reduce overall IT operations costs.
In each use case, the Okta platform helped the organization simplify and secure access to systems and set up APIs to streamline access to databases. Additionally, agencies benefited from a broader view of user activity across applications to understand exactly who saw what information.
“Citizen and government users expect the same frictionless experiences across all their devices that they receive as consumers,” says the report, and FedRAMP-approved CIAM solutions help agencies achieve that by creating a secure, scalable repository designed specifically to store and manage user information.
Learn more about the power of cloud-based customer identity and access management (CIAM) solutions.
This article was produced by FedScoop for, and sponsored by, Okta.
GAO: Pentagon’s critical technology communication efforts need work
The Department of Defense already has a process to identify and protect the billions of dollars of critical technologies it acquires, but it needs to improve the way it communicates those findings internally and to other agencies, according to the Government Accountability Office.
The department began implementing a new four-step process in February 2020 that is more specific about what parts of acquisition programs, technologies, manufacturing capabilities, and research areas must be protected, and how the DOD will accomplish that. But it needs to do better at sharing what it learns in that process, the GAO found in a new audit.
“Critical technologies — such as elements of artificial intelligence and biotechnology — are those necessary to maintain U.S. technological superiority. As such, they are frequently the target of theft, espionage, and illegal export by adversaries,” the Jan. 12 report reads.
Officials haven’t finalized the steps in the new process on how the DOD will communicate the list internally and to other agencies, what the assessment metrics for protection measures are, and which organization will manage future protection efforts. The sooner they figure out those steps, the better, according to the GAO.
“By determining the approach for completing these tasks, DOD can better ensure its revised process will support the protection of critical acquisition programs and technologies consistently across the department,” the report reads.
Officials from the Protecting Critical Technology Task Force told the GAO they may communicate the new list of critical technologies the same way they have in past years: by formal memorandum to the military secretaries. The GAO found, however, that this approach did not always loop in the people actually responsible for protecting critical technologies. Entities such as the Anti-Tamper Executive Agent and the Defense Security Cooperation Agency reported not receiving the 2019 critical acquisition programs and technologies list.
The GAO recommended that the Pentagon specify how it will communicate its critical programs and technologies list, develop metrics to measure protection efforts, and select a DOD organization that will manage protection efforts after 2020. The department concurred with the first recommendation and partially concurred with the second and third.
To get more use out of data, DOD needs to cut some old sources, Air Force CIO says
To get better use out of its data, the Department of Defense may need to actually reduce many of its data sources that are not useful in a modern context, says Lauren Knausenberger, the Air Force’s chief information officer.
The Air Force has been working to adjust to a cloud-based data storage model, and some of the old methods it used to collect data will not be as useful with modern technology. Some of those collection methods were also just plain inaccurate, Knausenberger said Thursday during the Data Cloud Summit produced by FedScoop. Those messy data practices should be left behind, she said, along with many of the legacy systems and on-premises data centers the military needs to ditch.
“There are a lot of data sources that need to go away,” she said during her keynote address.
The seemingly contradictory recommendation reflects the new direction the entire military is heading with its technology modernization plans. The now-ubiquitous phrase tech leaders repeat is their desire to use “data as a strategic asset.” Making that a reality depends largely on the military’s ability to pull from quality data sources and to have modern storage and analytical capabilities.
The use of data is not new, but its primacy and importance are. The DOD released its first-ever data strategy in October. Knausenberger said she is now hard at work building the technical architecture to better store and use data across the Air Force and with partner services.
Along with paring away some of the old sources, the Air Force has been able to use high-quality sources to improve analytical outcomes. For example, predictive maintenance saved $1.5 million on airplane exit door repairs: by knowing to replace a door before it failed, the force avoided emergency repairs for doors that might accidentally open mid-flight.
“Using data to predict outcomes with relatively high level of confidence is another huge area here,” Knausenberger said of modernization efforts.
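As a rough sketch of that predictive-maintenance pattern, the example below trains a simple classifier on made-up component data and flags doors to replace before they fail. The features, numbers and threshold are invented for illustration; they are not the Air Force’s model or data.

```python
# Rough sketch of the predictive-maintenance idea: train a classifier on
# historical component data, then flag parts to replace before they fail.
# All features, values and the threshold below are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical history: [flight_hours, open_close_cycles, seal_wear_index]
X_history = np.array([
    [1200,  800, 0.20],
    [4500, 3100, 0.70],
    [ 900,  600, 0.10],
    [5200, 3600, 0.90],
    [3000, 2000, 0.50],
    [6100, 4200, 0.95],
])
y_failed = np.array([0, 1, 0, 1, 0, 1])   # 1 = door later required emergency repair

model = LogisticRegression(max_iter=1000).fit(X_history, y_failed)

# Score in-service doors and schedule replacement when predicted risk is high.
in_service = np.array([[4800, 3300, 0.80], [1500, 1000, 0.25]])
risk = model.predict_proba(in_service)[:, 1]
for door_id, p in enumerate(risk):
    if p > 0.5:                            # assumed replacement threshold
        print(f"Door {door_id}: predicted failure risk {p:.2f} -> replace proactively")
```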