Gain insights from 6 leaders working to transform government service delivery
Transforming the way governments serve the public is about more than choosing the right technologies. This level of change requires agencies to rethink how they engage their customers — from citizens and external partners to the frontline employees assisting the public.

Over the last couple of years, more and more federal agencies have openly acknowledged their desire to transform and improve overall customer engagement and associated outcomes. Becoming more engaging requires a focus on the intersection between the service provider and the end user. This intersection is what supports the overall customer experience.
The time is now to lower the barrier to entry for government agencies to deliver exceptional experiences. Although agencies understand the importance of CX, many still struggle with how to integrate it within and across programs. Using a cloud platform that accounts for the many ways in which stakeholders interact with your agency is a sure way to deliver vital services.
Learn more about Salesforce and trailblazers in government.
Google Cloud gets FedRAMP High authorization
Google has earned approval to manage some of the federal government’s most sensitive data in its commercial cloud infrastructure.
Google Cloud Platform received the Federal Risk and Authorization Management Program’s (FedRAMP) High authorization to operate for 17 products it can now offer agencies like NASA that face stringent security requirements.
“These new certifications reflect our continued investment and support for customers in the U.S. public sector, and is another example of momentum we’re seeing as government agencies move to the cloud,” Mike Daniels, vice president of public sector for Google Cloud, wrote in a Wednesday blog.
In 2011, the Office of Management and Budget established FedRAMP to authorize and continuously monitor cloud service offerings across agencies. FedRAMP requires cloud service providers like Google Cloud to meet minimum security requirements for the data they handle, based on the impact its compromise would have on agency operations, assets and people.
The program authorizes offerings at the Low, Moderate and High levels, with high-impact data defined as that which would have a “severe or catastrophic adverse effect” on agencies if its confidentiality, integrity or availability were lost. In other words, high-impact data is the government’s “most sensitive, unclassified data” in the cloud, because its compromise could endanger lives or cause financial ruin, according to the FedRAMP Program Management Office.
With a High authorization, Google Cloud can now support unclassified workloads like health care delivery, emergency response and space operations.
The tech company already provides machine-learning technology to NASA’s Frontier Development Lab, as it works to identify life on other planets, and is helping the Air Force modernize modeling and simulation training infrastructure.
The High authorization required extensive documentation of how Google Cloud’s infrastructure and platforms secure data through zero-trust networking, among other measures. The FedRAMP Joint Authorization Board ensured Google Cloud’s monitoring, patching and vulnerability scanning met High requirements.
Google Cloud already maintains a Moderate authorization for the platform and G Suite, which was expanded to 64 products across 17 cloud regions, the tech company also announced Wednesday.
VA touts success moving big, mission-critical systems to enterprise cloud
Often, cloud newcomers are advised to take migration slow and start small. But the Department of Veterans Affairs has seen success in doing the opposite.
After launching the Enterprise Cloud Management Office in 2018, the VA decided early on to move one of its most critical systems over to the commercial cloud to realize the benefits as soon as possible, David Catanoso, head of that office, said Wednesday at the 2019 Amazon Web Services re:Invent conference in Las Vegas.
“Most people, when you start your cloud migration journey, will tell you to start small, gain experience,” Catanoso said. “We took a completely different approach moving one of our biggest, most critical systems to the cloud very early.”
That system is the Veterans Benefits Management System, which operates 24/7 and manages more than 800 million veterans benefits-related documents. Catanoso said the system “was starting to see performance and reliability issues in its legacy hosting environment, and not only that, but it was also running out of space to add storage devices to store all those documents.”
The VA couldn’t afford to wait to start small and learn, he said. The challenge, though, was “how do we move a system that complex without disrupting service to our veterans and the users that use it?”
Catanoso credits AWS services like S3 (Simple Storage Service), Snowball and EC2 (Elastic Compute Cloud) for the quick move away from its proprietary systems. “And we did all that in 8 months,” he said. VA’s enterprise cloud is a hybrid environment with Microsoft Azure services and its own private enclaves as well.
The result was that VA benefits personnel were able to cut download times for large documents to less than a minute, whereas they used to be more than triple that. “That just gave a huge performance boost to those folks that are working every day and downloading documents” and processing benefits, Catanoso said. “So it was a huge win. And not only did we get better performance, we got all the cost benefits of being in the cloud, the reliability, the disaster recovery. It’s been a really huge success story for us.”
Now VA is looking to move more and more of its applications over to this cloud environment. The department so far has moved 60 applications to its enterprise cloud to operate more than 3,000 virtual machines and store and manage 6 petabytes of data.
Since moving VBMS to the cloud, VA has moved other high-profile systems there as well, including its new MISSION Act tools, “which enable veterans to get health care not only from the VA but from out in the community,” Catanoso said, and the department’s legacy electronic health record, the Veterans Information Systems and Technology Architecture, which it will operate until fully migrated to a more modern health system over the next decade.
Coding it Forward fellowship changes participants’ minds about federal service, survey shows
There is now some hard data to show that Coding it Forward’s Civic Digital Fellowship can have a lasting impact on participants’ career trajectories and, by the same token, the future tech workforce of the federal government.
A survey of summer 2019’s 55 participants, administered both before and after the program by the Partnership for Public Service, shows fellows walk away from the 10-week program more likely than before to seek a job in the federal government. This is the first time such a survey has been given to the participants of the fellowship, which will mark its fourth summer in operation in 2020.
The numbers are pretty dramatic — in the pre-fellowship survey, zero students said they’d be “extremely likely” to pursue a job in the federal government after graduation. In the post-fellowship survey, 23% said they would be. On the other end of the scale, the percentage of students saying they are “unlikely” to work for the federal government went down from 22% before the program to 10% after.
The survey shows a similar impact on students’ interest in civic tech more broadly. In the post-fellowship survey 44% said they would be “extremely likely” to take a job in civic tech, up from just 14% before the fellowship experience began.
“I think my experience has really opened up my eyes to the many career options in civic tech and the dire need for tech talent in government,” one fellow said in the survey. “My career intentions have definitely been molded by CDF and I am more likely to choose a career in civic tech.”
Results like these are exciting for program co-founders Chris Kuang and Rachel Dodell because they bolster the fellowship’s big-picture raison d’être: giving the government — which currently struggles with an aging IT workforce — a “best practice” example of how to attract young tech talent. The demand from federal agencies is real — in just three years of existence, the fellowship has grown from 14 fellows at one agency to 55 at six agencies.
“We’re lucky to be at a really exciting time where innovation and being forward looking are on the minds of a lot of federal leaders,” Kuang told FedScoop.
And there’s more to come. The program recently launched the application for its summer 2020 fellowship. Kuang said he expects to accept anywhere between 60 and 80 new fellows.
DOJ to focus on zero-trust, identity pilots in 2020
The Department of Justice is focusing on zero trust and identity and access management pilots in fiscal 2020 as it adapts its security posture to the cloud and a remote workforce.
Procurements began last year, and the department now has eight to 10 pilots experimenting with different zero-trust architectures and vendors, Nickolous Ward, chief information security officer of DOJ, told FedScoop.
DOJ consolidated more than 100 data centers into 12 by pushing about 60 services — roughly 40 percent of what it delivers in total — to the cloud in the last five years, Ward said. On top of that, the department’s lawyers and agents are increasingly working remotely, helping state and local law enforcement complete investigations.
As a result, DOJ’s attack surface has “expanded dramatically” at the same time much of its security perimeter has disappeared, Ward said during the Fortinet Security Transformation Summit produced by FedScoop.
The National Institute of Standards and Technology defines zero trust as the narrowing of cyberdefenses from wide network perimeters to micro-perimeters around individual assets or small groups of assets. No implicit trust is given to systems based on their location, and user and device authentication are required prior to establishing a connection.
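The definition above can be illustrated with a toy per-request policy check: trust is never inferred from network location, and every request must pass user, device and per-resource checks. This is a minimal hypothetical sketch — the user store, device inventory and resource ACL names are invented for illustration, not a description of DOJ’s pilots:

```python
from dataclasses import dataclass


@dataclass
class Request:
    user_id: str
    device_id: str
    resource: str


# Hypothetical inventories. A real zero-trust deployment would query an
# identity provider and a device-posture service instead of static sets.
VERIFIED_USERS = {"alice"}
HEALTHY_DEVICES = {"laptop-42"}
RESOURCE_ACL = {"case-files": {"alice"}}


def authorize(req: Request) -> bool:
    """Zero-trust check: no implicit trust from network location.

    The request is allowed only if the user is authenticated, the
    device passes posture checks, AND the user is authorized for the
    specific resource (a micro-perimeter around that asset).
    """
    if req.user_id not in VERIFIED_USERS:
        return False
    if req.device_id not in HEALTHY_DEVICES:
        return False
    return req.user_id in RESOURCE_ACL.get(req.resource, set())
```

Note that every request is evaluated independently — there is no notion of an already-trusted internal network segment.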
Zero-trust security products will help DOJ more safely expose services in the cloud to the internet, but “none of them are bulletproof,” Ward said.
That’s where the identity and access management pilots come in to ensure the identity of every person connecting to every piece of data.
“That’s been a huge source of data breaches…we trust all these different vendors to store our data, and how do we know that they’re protecting it properly?” Ward asked.
Threat intelligence is a “mandatory” capability, and data must be shared between federal and commercial partners, he added.
Any contracts Ward puts out as CISO will require open application programming interfaces (APIs) and connections to the rest of DOJ’s security infrastructure, he said.
“If we can’t take an action within 15 minutes, a good nation-state actor is already hopping to other systems once they’ve made their initial compromise,” Ward said.
Quick threat response also requires speedy analysis of cyber data coming in. That process needs to be automated due to the “massive shortage” of cyber professionals, Ward said.
DOJ is also piloting robotic process automation and orchestration solutions for that reason.
“Automation will be something we’re looking at this year,” Ward told FedScoop. “We’re looking at things like deception technology and how it can really help us with lateral movement aspects [of breaches].”
When it comes to AI for security, ‘the marketing hype is at a 10, and delivery is at a 1’
When it comes to commercial artificial intelligence applications for cybersecurity in government, so far there’s more hype than true value, agency tech leaders said Tuesday.
“I try to read a lot of industry materials and I have yet to see a real product out there that helps me as a force multiplier … and I can flip a switch and turn it on and I feel safer and I sleep better at night. I have yet to find that product or service,” Ryan Cote, CIO at the Department of Transportation, said during a panel at the Security Transformation Summit presented by Fortinet and produced by FedScoop and StateScoop.
“I would say the marketing hype is at a 10, and delivery is at a 1,” Cote said. “I think we’re in the early, early stages of applying real AI to cyber. We’re still trying to figure out the definition of AI in some circles and what is AI and what isn’t AI.”
There are reasons for the disconnect, suggested Rick Piña, chief technology advisor for public sector at World Wide Technology. Infrastructure is one of them, given that government networks are still catching up with private-sector adoption of technologies like cloud.
“Some of this has to do with the fact that there aren’t really architectures,” he said. “So some of the things that you do in Silicon Valley or Seattle or Boston or Austin are great in the lab, but when you actually have to deploy this in a real world environment … all of this AI cool stuff that you saw in Silicon Valley is just not going to apply on your network.”
The fact that adoption of AI in government is at such an early stage could also be due to slow adoption of emerging tech in the public sector overall, Piña said.
“Within the government, there’s oftentimes not really an openness to really want to be aggressive or to want to be leading edge or bleeding edge or to want to be somewhat courageous,” he said. His company, he went on, is seeing “very few pockets where people are being as aggressive as they can.”
The two government leaders present on the panel — Cote and U.S. Air Force Chief Information Technology Officer Frank Konieczny — did say their agencies are interested in exploring potential applications of AI and machine learning. Some areas, especially highly rule-based tasks, are ripe for automation. But full AI is several leaps away from that.
“We’ve done some [robotic process automation] stuff … but when you get to a cybersecurity thing it’s much more complex than that,” Konieczny said. “We’re going down the path, but it’s taking a long time.”
CBP eyes including US citizens in Biometric Exit program
U.S. Customs and Border Protection (CBP) is considering expanding the scope of its Biometric Exit facial recognition program to include U.S. citizens. If implemented, this new rule would represent a marked expansion of the system as it currently exists.
An early indication of the Department of Homeland Security’s thinking comes from a recently posted notice. “DHS is proposing to amend the regulations to provide that all travelers, including U.S. citizens, may be required to be photographed upon entry and/or departure,” the notice states.
To date, U.S. citizens have been able to opt out of inclusion in the program — signs placed near the checkpoints alert travelers to this option. Additionally, CBP says, any photos of U.S. citizens that are captured are deleted within 12 hours.
But now DHS is saying that including citizens in the pool will help the agency achieve “seamless” implementation of the system.
The rule hasn’t been officially issued just yet, a CBP spokesperson told FedScoop, but it is in the “final stages of clearance.” Once issued, there will be a public comment period before the agency moves forward.
It could be controversial — in June a group of House Democrats sent a letter to CBP demanding that the agency explain its authority to use facial recognition on traveling American citizens. “This is an unprecedented and unauthorized expansion of the agency’s authority,” the letter stated.
Biometric Exit traces its legislative mandate back to the 1996 Illegal Immigration Reform and Immigrant Responsibility Act when Congress asked for an automated identity check on departing foreign nationals. Post-9/11, Congress set up the expectation for a biometric check, and CBP kicked off a pilot in Atlanta in 2016. More recently, President Trump’s Executive Order 13780 from March 2017 called for the “expedited completion” of the system.
CBP’s system uses airline manifest data and government databases (including passport and visa databases) to assemble a gallery of existing photos of passengers who are expected to arrive in or depart from the U.S. The system then matches photos of passengers taken during the boarding process against this gallery, looking for a positive match.
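Conceptually, gallery matching of this kind reduces to comparing a face embedding computed from the boarding photo against precomputed embeddings of the gallery photos, and accepting the closest identity only if the similarity clears a threshold. The sketch below is a hypothetical illustration using cosine similarity over made-up two-dimensional “embeddings” — real systems use high-dimensional vectors from a trained face-recognition model and operationally tuned thresholds:

```python
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def best_match(probe, gallery, threshold=0.9):
    """Return the gallery identity most similar to the probe embedding.

    Returns None when no gallery template clears the threshold, i.e.
    the traveler does not match anyone expected on the manifest.
    """
    best_id, best_score = None, threshold
    for identity, template in gallery.items():
        score = cosine(probe, template)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id
```

The threshold is the key operational knob: raising it reduces false matches between different travelers but increases the chance a genuine traveler fails to match their own gallery photo.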
Biometric Exit is expected to be in use at the country’s 20 largest airports by 2021.
GSA fielding hundreds of comments daily concerning beta.SAM.gov
The General Services Administration says it has received about 200 comments daily from the feedback tool on beta.SAM.gov since the website became the official source for contract opportunities.
The outdated but familiar FedBizOpps.gov was retired by GSA on Nov. 12 as part of the agency’s initiative to merge 10 legacy contract-award systems into one. Users understandably have questions and issues with the replacement.
“We are actively reviewing comments coming from the feedback tool and the Federal Service Desk on a frequent, regular basis,” reads a new GSA factsheet on the transition. “We have operations teams working to curate feedback and to add to the agile iteration process.”
GSA migrated 5.6 million pieces of data from FedBizOpps to beta.SAM.gov, and while some users experienced latency issues in the first 72 hours, those have been resolved, the agency said.
Current issues tend to fall into one of seven categories, with users requesting email alerts for saved searches — a forthcoming feature — and additional search parameters. The Federal Acquisition Service, an office within GSA, also continues to make on-screen search more intuitive, in response to users’ feedback.
Some users have questioned the need for two-factor authentication — the login method that requires users to have an additional layer of security beyond just a password.
“FAS works with our partners in GSA IT to ensure that we are meeting or exceeding cybersecurity protocols,” reads the factsheet. “We realize that two-factor authentication adds an extra step prior to accessing the data, but it is necessary to our mission to not only transact and display but safeguard data.”
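The second factor on many government sites is a time-based one-time password (TOTP, standardized in RFC 6238): an HMAC over a counter derived from the current 30-second interval, truncated to a short numeric code. The standard-library sketch below shows the algorithm itself — it is a general illustration, not a statement of how SAM.gov implements its login:

```python
import hashlib
import hmac
import struct
import time


def totp(secret: bytes, for_time=None, step=30, digits=6) -> str:
    """Generate an RFC 6238 time-based one-time password.

    The code changes every `step` seconds; both the server and the
    user's authenticator app derive it from the same shared secret,
    so possession of the secret acts as the second factor.
    """
    t = time.time() if for_time is None else for_time
    counter = int(t // step)
    msg = struct.pack(">Q", counter)                       # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()  # HMAC-SHA1 per the RFC
    offset = digest[-1] & 0x0F                             # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

With the RFC 6238 test secret and timestamp 59, this produces the published test vector, which is a quick way to verify an implementation.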
Neither individual watchlist indicators nor saved search parameters and histories transferred over to the new system “due to differences in functionality,” but users can build those functions anew on beta.SAM.gov, according to GSA. The agency is also working on cleaning up old and dead links.
The new site has seen steady use since its launch, based on activity reported by the Federal Service Desk.
In the two weeks following the transition, FSD received about 170 inquiries daily regarding Tier 1 contracting opportunities in the new system. The desk currently fields about 600 calls daily from users wanting to register for SAM.gov.
The System for Award Management is the next system to be merged, at which point SAM.gov will leave beta. The merger is part of GSA’s Federal Marketplace Strategy for streamlining acquisitions.
Census hiring systems have issues, report says, but bureau begs to differ
The Census Bureau says its employee recruiting systems are on track to perform as required as it hires a big temporary workforce next year, despite a recent alert from an inspector general about IT infrastructure and software issues.
In a management alert released Nov. 21, the Commerce Department’s inspector general says tech challenges led to delays and performance failures during the testing of key recruiting systems for the 2020 census. The IG recommends that the bureau come up with a contingency plan in case similar issues occur during actual operations.
The bureau, meanwhile, says these findings rely on outdated information and thus aren’t applicable.
The challenge is with two decennial census recruiting systems, the IG says — the Decennial Applicant Personnel and Payroll System (DAPPS) and the Census Hiring and Employment Check (CHEC). These are systems that the bureau will use to field and support the large workforce it needs during decennial census operations. But scheduled “performance and scalability” tests on both of these systems were somewhat delayed, the IG found, and then both failed to meet “performance goals” when subjected to peak-usage activity.
“According to Bureau personnel, the failures for both systems were due to either inadequate infrastructure and/or inefficiencies in the software,” the management alert states. The failure of the DAPPS to meet test scenarios, for example, is said to be due to “issues with third-party software,” which Census Bureau personnel told the IG they were working to address.
The bureau itself, however, sees things very differently. While it says it “appreciates” the IG’s work, it seems to disagree with the fundamental conclusion of the management alert.
The alert “appears to rely upon an outdated draft testing document and is no longer current,” a spokesperson told FedScoop in an email. “We understand the critical nature of our hiring and payroll systems and are extremely confident that both systems will function as intended during peak 2020 Census operations.”
The systems have been in use, successfully, since the 2010 census, the spokesperson added.
This alert is just one component of the IG’s ongoing investigation into how the bureau is preparing for the 2020 census. A complete audit report is coming “at a later date,” the IG says.
Multi-domain operations: Like bringing Waze to the battlefield
In about a month’s time, the U.S. Air Force will host the first demonstration of its Advanced Battle Management System — the networking concept that will serve as the technological backbone of the military’s shift to an advanced way of seeing the battlefield.
Through multi-domain operations, the military services aim to link together air, sea, land, space, cyber and information assets to better identify and eliminate threats. And while the idea could lead to a revolutionary jump forward in awareness and information sharing for warfighters, the technology necessary to achieve it isn’t at all revolutionary, said Will Roper, the Air Force’s assistant secretary for acquisition, technology and logistics.
“We’ll connect F-22s, F-35s, SpaceX Starlink satellites, Navy ships, Army soldiers,” Roper said at a recent Center for a New American Security discussion. “We’re going to connect them in an internet-like style. What we’re really doing to enable multi-domain is finally building the internet in the Air Force. It’s all the stuff that you know. There are no show-stoppers here.”
Rather, it’s taking the best of what already exists in the commercial world — things military personnel have access to and expect in their personal lives — and cloning that for command and control.
“The great news is … this exists,” he said. “We just simply have to be able to clone it and probably put a little more security in it. But it’s not unachievable. But it is going to be different to become a digital service, a digital department.”
Roper pointed to navigation app Waze as an example. Every time he drives home from the Pentagon and uses Waze, he thinks about “how would this work on the battlefield?”
Below such an app, he said, are strong software development capabilities with cloud infrastructure and platforms to build on. He referred to the Advanced Battle Management System as “a whole internet company in the Air Force,” built on the service’s Cloud One, Platform One and Data One programs.
“The way that this will work … we’ll have kind of a big cloud, a big [Department of Defense] cloud, and if you’re a system … and you’re connected to ‘big cloud,’ and very similar to the app Waze, or pick your favorite, we have a user profile for you based on your mission,” Roper said. “And the data that hits our cloud, we can recognize, ‘Oh this is something [you] should see, because you’re driving a ship and you don’t know that threat is over there and we just collected on it.’ And we can push it to you in a way that’s very similar to Waze, easy to engage with, and as you respond to it, we get better at recognizing you.”
But it’s also very different from an app like Waze because adversaries will be constantly trying to take it offline, Roper said. “So the real secret sauce is going to be when the disconnect happens, how much are we able to locally store and process? Can I inform you how long you will have digital superiority, digital stealth so that when you connect back up — because I don’t think any adversary will be able to keep us disconnected forever — we can immediately refresh your data, almost like we’re kind of resetting the clock.”
To do that, he said, the military is going on the same digital journey that many successful companies have already gone on. Roper said the Air Force has already recruited a cadre of “internet-type gurus who have joined our team as pioneers” to build this system-of-systems out. “So we’re not designing this in a traditional defense fashion. We’re designing this with champions of commercial internet technology who are willing, just out of patriotism, to be contracted designers, to make sure that we don’t get outside of what worked for the internet.”
Connected to the world
If the Air Force gets the ABMS program right, “the benefit will be that finally for once, if the government has a piece of data that can help the warfighter, we can get it to them,” Roper said. “And it’s crazy in the world we live in where you right now with your personal device are connected to the entire world. Think about how much ability there is to interact, to understand, to command control things, your house, your car, you control everything. We live in that world. Our operators go home with that capability. And they come into a military where things can’t talk to each other.”
The biggest challenge in all of this, Roper said, will be convincing Congress to deliver the required funding. “It’s a big risk to put billions of dollars into digital transformation,” he said. “You can’t take a picture of it. But you know that behind your phone is an amazing, powerful architecture that allows that phone to be so much more than a platform.”
But, he anticipates “sizable dollars” will be dedicated in fiscal 2021 to support the ABMS development.
“It’s enough money to actually do real stuff, and it’s not tied to any platform,” Roper said. “So, ABMS will basically be a competition among existing platforms. Whoever can kind of make their platform look more like an Internet of Things-type system, you get first dibs to the pot.”