Potential supply-chain threats prompt Senate bill on training acquisition officials
Agency officials who handle supply-chain risk management would receive training on how to spot potential foreign threats in IT and communications technology under a bill senators proposed Friday.
The bipartisan Supply Chain Counterintelligence Training Act would emphasize identifying and mitigating aspects of technology that could allow adversaries to spy on the U.S. government. Sens. Ron Johnson, R-Wis., and Gary Peters, D-Mich. — the chairman and ranking member of the Senate Homeland Security and Governmental Affairs Committee — are the lead sponsors.
“America’s adversaries use any means necessary to gain access to valuable and sensitive government information, including possibly inserting compromising code into products or enlisting untrustworthy IT support personnel to exploit government systems,” Peters said in an announcement about the bill. “Allowing an adversary to gain a foothold in America’s technological supply chain is a risk that simply cannot be tolerated.”
The bill comes as the U.S. cybersecurity community and government are paying increased attention to where federal technology originates. Most prominent is the Department of Homeland Security’s 2017 binding operational directive ordering agencies to remove Russian cybersecurity company Kaspersky Lab’s products from their systems. DHS cited Kaspersky’s close ties to Russian intelligence, as well as Russian laws that could potentially force the company to hand over information on U.S. systems. The defense authorization bill that President Trump signed into law in August 2018 also blocks government purchases from Chinese tech companies Huawei and ZTE on similar grounds.
Kaspersky and Huawei both have rejected the U.S. accusations.
The Senate bill would require the Office of Management and Budget, Office of the Director of National Intelligence, the Department of Homeland Security and General Services Administration to collaborate on creating the program.
In December, Trump signed legislation establishing the Federal Acquisition Security Council and allowing classified information to be used in supporting supply chain risk assessments.
Last month, Federal Chief Information Security Officer Grant Schneider said the new council is developing criteria for making recommendations on equipment, products and services that shouldn’t be allowed to do business with the government.
A Senate bill introduced earlier this year would create a White House Office of Critical Technologies and Security to protect against the theft of U.S.-developed technologies and risks to critical supply chains. Senators also have expressed concerns about the use of foreign VPN apps.
Customer experience is a whole-of-agency endeavor, leaders say
Through initiatives like the Trump administration’s customer experience priority goal and legislation like the 21st Century Integrated Digital Experience Act, the federal government is increasingly considering the experience citizens have when interacting with agencies.
But in order to make a real difference, digital-services leaders said at Adobe’s Government Symposium on Thursday, agencies need to expand their thinking. One way to make significant improvements in customer experience (CX) is to take a holistic view of all the public-facing elements of an agency’s mission, said Marcy Jacobs, the executive director of the U.S. Digital Service at the Department of Veterans Affairs, and Simchah Suveyke-Bogin, lead of the Customer Experience Center of Excellence at the General Services Administration.
“The way government is funded and organized doesn’t lead to good cross-agency CX improvements,” Jacobs said during a panel on personalized digital experiences. “It could be that a business line is building a thing and they’re not thinking about the context of how … anyone interacts across multiple touch-points.”
For example, a large agency like the VA might provide several unrelated services to the same person, Jacobs said, but there shouldn’t be wide variations in the CX for those tasks.
“It gets tricky when one channel at a time is trying to [improve customer experience],” Suveyke-Bogin said. “Coordinating” plans across different areas and teams is important, she said.
Michael Leen, who works at the digital production company MediaMonks, said that in the private sector, this kind of coordinated approach can yield “incredible cost savings.”
But taking a wide view isn’t without challenges, Jacobs admitted. Sometimes looking at the whole of an agency “can be paralyzing,” she said.
“It’s big and it’s complicated and it’s multi-channel and it’s multi-year budgets and it’s not something that can be tackled, probably, in an administration,” she said. But try not to get too caught up in all that. Instead — “start with something.”
At the VA, the relaunch of VA.gov in November 2018 as a homepage for the agency-provided benefits that veterans use most was preceded by years spent developing and testing new digital tools at Vets.gov. And while the new VA.gov is important, it’s also just the very tip of a very large iceberg. The initial relaunch included just 200 redesigned pages of the more than 400,000 across VA.gov. Poke around on the site and it doesn’t take long to find “old” pages. These, Jacobs told FedScoop in February, will get rewrites and updates in the coming months.
“Just follow a thread and get some momentum and learn and then continue to iterate and grow that,” Jacobs advised Thursday. “Because trying to solve it all at once is not possible.”
National Geospatial-Intelligence Agency gets new deputy director
The National Geospatial-Intelligence Agency on Thursday named Stacey Dixon its eighth deputy director.
“Dixon is a proven leader who has a deep understanding of NGA and the entire intelligence community, its current challenges and the bright opportunities that lie ahead,” Vice Adm. Robert Sharp, director of NGA, said in the announcement. “She has earned a stellar reputation for synthesizing complex national security problems, developing solutions, boldly leading enterprise operations and caring for people.”
Dixon, previously the director of the Intelligence Advanced Research Projects Activity, has been with NGA since 2010 — first as the chief of congressional affairs and in various capacities since.
She has also served as deputy director of the Office of Corporate Communications, director of the Information Integration Office, and deputy director of the research and development directorate. Her 2016 assignment to IARPA, as its deputy director, was a joint-duty posting.
Date set for oral arguments in Oracle’s JEDI lawsuit
Oracle will finally get its day in court to be heard on why it thinks the Pentagon’s multibillion-dollar cloud computing acquisition is unfair and limits competition.
The Court of Federal Claims will hold oral arguments in Oracle’s case against the Department of Defense’s $10 billion Joint Enterprise Defense Infrastructure (JEDI) cloud contract July 10 in Washington, D.C., according to a new order from Judge Eric Bruggink.
At that hearing, Bruggink is expected to make a decision on the pre-bid protest, which Oracle first filed in December. Oracle also protested the contract with the Government Accountability Office last August, but it was ultimately denied in November 2018.
Oracle’s argument has always centered on its belief that DOD’s decision to limit the contract to a single award is unlawful. But over time, its case has grown to include allegations that the Pentagon and contract frontrunner Amazon Web Services have conflicts of interest. DOD has repeatedly cleared the contract of any such conflicts, but Oracle, as recently as this week, continues to press claims of illicit connections between the military and AWS.
The court’s decision should close the book on Oracle’s JEDI protest, at least for a bit. If the ruling doesn’t favor Oracle, the company may well protest the contract again, likely after an award. Oracle has also taken to Capitol Hill, asking lawmakers to exercise “oversight authority regarding the JEDI procurement.”
With the case settled, DOD would soon after be cleared to make an award to one of the two vendors — Microsoft and AWS — that meet the procurement’s “competitive range” requirements.
Industry groups and activists urge Senate to support funding for OTA in 2020
A broad group of industry associations and advocates led by the Lincoln Network and Demand Progress sent a letter to the Senate Appropriations Committee on Thursday urging lawmakers to support funding for the reestablishment of the Office of Technology Assessment.
“We write to express our concern that Congress does not have sufficient capacity to tackle 21st century science and technology policy challenges,” the letter reads. “Accordingly, we urge you to prioritize efforts to augment this institutional capacity, including providing funding for the Office of Technology Assessment (OTA), as part of the fiscal year 2020 Legislative Branch Appropriations bill.”
The letter is signed by groups like Code for America, the American Civil Liberties Union, R Street Institute and more. It also bears the signatures of individuals including former U.S. Chief Data Scientist DJ Patil, former Deputy Chief Technology Officer of the United States Nick Sinai and others.
A draft 2020 spending bill, which was advanced by the House Appropriations Committee on Thursday, currently includes $6 million to relaunch the office.
OTA provided members and committees with objective, forward-looking reports on the impacts of science and technology developments from when it was established in 1972 until it was defunded and shuttered in 1995.
The revival of the office also boasts support among former members of the legislative branch.
“It’s time to bring back the Office of Technology Assessment,” Congressman Vic Fazio, who represented California’s third district from 1979 through 1999, said during a recent hearing of the Select Committee for the Modernization of Congress. “I think we’ve all been embarrassed by the way Congress fails to understand technology. OTA needs to come back in some form.”
Near-annual attempts to bring back OTA, however, have so far fallen short. Last summer, for example, an amendment that would have reestablished the office with a $2.5 million budget failed to pass.
It’s OK for government to joke around on social media sometimes, comms leaders say
When it comes to citizen outreach, social media can be a place to bring the human out from behind the bureaucracy.
During a panel on “multichannel outreach” at Adobe’s Digital Government Symposium Thursday, communications leaders from the Department of Interior, National Science Foundation and the Bureau of Alcohol, Tobacco, Firearms and Explosives discussed the respective benefits of different media channels.
“I think social media is where people now prefer to engage,” said Jennifer Plozai, lead of external affairs at NSF. “We are engaging as real people, we are showing empathy, we are joking with people when they are joking with us.”
Before joining NSF, Plozai worked at the Transportation Security Administration where she helped launch @AskTSA — a social media-based customer service effort to answer people’s questions about travel and safety requirements. Plozai gave an example of how @AskTSA has used humor to its advantage.
“We had people asking us what they could bring on planes,” she said. “And they just started sending us photos — it wouldn’t have any text with it, just photos, because a picture is worth a thousand words. So they’d send us pictures of like a lightsaber, you know, ‘I’m going to ComicCon can I bring my lightsaber on the plane.’ And rather than just giving a typical government response…. we would say, you know, ‘Your lightsaber is good to go, but we’re afraid you’ve just told the Jedi you’re coming.'”
The team at TSA, she said, worked at “really being engaging, and making that OK to do as a government agency.”
A few government agencies have managed to create truly unique social media presences, and humor often has a lot to do with it. The Consumer Product Safety Commission’s Twitter account, for example, leans heavily into weird memes. Meanwhile, TSA’s Instagram account, which is mostly dedicated to pictures of the very strange and surprising things people try to get through security checkpoints, also often includes funny captions.
“People don’t come to a government social media account and expect to see humor,” the late Bob Burns, the brains behind TSA’s Instagram, said during a Facebook Live appearance in December 2017.
Joe Galbo, who runs the CPSC Twitter account, also sees value in surprising people. “Doing the serious messaging constantly — people will tune you out,” he told FedScoop in an interview last year. “You have to mix it up a little bit.”
Agencies trying to find their ‘dark data’ face policy, leadership hurdles
Most IT managers agree finding and capturing dark and grey data should be a top priority, but antiquated policies and lack of senior-level support remain major hurdles.
Dark data describes all the unknown and therefore unused data across an agency, while grey data is known but unused.
San Francisco-based software company Splunk released a survey of 1,357 IT managers April 30 that found 56 percent of public sector data is assumed dark or grey. While 77 percent of public sector respondents said locating and using that data was paramount, 76 percent said lack of support from senior agency leadership was a challenge.
The Federal Deposit Insurance Corp. is wrestling with how to more efficiently and securely share all the personally identifiable information it collects from banks — short of reengineering its systems.
“It’s not magic,” Howard Whyte, CIO and chief privacy officer at FDIC, said Tuesday at the Splunk GovSummit. “You have to set a policy, and you have to go out and market the capabilities you’re trying to deliver and show value to the corporation.”
First, FDIC has to model its data and then look at automation, Whyte said.
That’s easier said than done: 82 percent of public sector IT managers identified a mistrust of artificial intelligence and a lack of knowledge about what can be automated within their agencies as challenges, according to the Splunk report.
Whereas machine learning helps identify patterns to get value out of grey data, AI is the key to finding and analyzing dark data, Frank Dimina, vice president of public sector at Splunk, told FedScoop.
“But AI won’t work if we’re not supplying it with massive data sets to make the technology smart,” Dimina said. “And I think when the dust settles, that’s when we’ll see some really interesting use cases and success stories.”
FDIC has the luxury of owning its own data center, so it’s now able to map where its data sets are and make decisions about consolidating data for better usage and security. The agency has also started running analytics on the data where it resides to do work faster for banks, identify data misuse cases and flag them for immediate action, Whyte said.
Ensuring leadership understands the importance of investing in such capabilities is critical, he added.
The Joint Special Operations Command is grappling with who should have access to its data, said Col. Carl “Jeff” Worthington, director of C4 systems within JSOC.
“Inside JSOC we have the Dothraki — those are the Rangers, we send them out ahead — we have the Tullys, and we have the Karstarks, and we have the Knights of the Vale,” Worthington said, alluding to groups of characters on the popular TV show “Game of Thrones.” “And no one really wants to tell everyone everything because there’s danger in that, when you open yourself up and you show them your cards, so it’s been a struggle at times.”
Worthington controls data sharing within JSOC IT operations but said even within his organization it’s hard getting people to understand the value.
When it comes to making dark and grey data actionable, IT and cybersecurity are “low-hanging fruit,” Dimina said.
“But what they really should be using it for is to impact the mission of government — to make smarter investments at the public level, to have more openness with citizens, to deliver better services to their citizens, to improve the security of the nation,” he said.
The U.S. Postal Service is using its data to monitor the health of its applications and proactively go after cyberthreats by monitoring for excessive failed password attempts.
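Threshold-based detection of this sort is simple to sketch. The field names, log format and cutoff below are assumptions for illustration only, not USPS’s actual tooling:

```python
from collections import Counter

def flag_brute_force(events, threshold=5):
    """Count failed logins per account and flag accounts at or over the
    threshold. `events` is an iterable of (username, outcome) pairs."""
    failures = Counter(user for user, outcome in events if outcome == "FAIL")
    return sorted(user for user, count in failures.items() if count >= threshold)

# Hypothetical log sample: one account hammered with failures, one normal user.
events = [("alice", "FAIL")] * 6 + [("bob", "FAIL"), ("bob", "OK")]
print(flag_brute_force(events))  # → ['alice']
```

In practice the same counting would run over a sliding time window so that legitimate occasional typos don’t accumulate into a false alarm.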
And the Department of Homeland Security’s DevOps team is tracking code check-ins when developers upload code for administrative review to gauge not only efficiency but count the number of bugs.
Another sign agencies recognize dark and grey data are a problem is the OPEN Government Data Act’s requirement that they appoint a chief data officer, Dimina said. While many agencies have done so, not all have them yet.
FDIC is considering appointing a CDO.
“And that person should own that framework — the data architecture,” Whyte said. “I’m not saying that everyone can’t use data, but someone has to be responsible for sharing it through the organization and making sure it’s providing the value that’s needed.”
Artificial Intelligence in Government Act is back, with ‘smart and effective’ use on senators’ minds
Editor’s Note: This story has been updated to reflect the introduction of companion legislation in the House.
Senate legislation to boost the government’s use of artificial intelligence is returning with bipartisan sponsorship and a long list of industry supporters.
The Artificial Intelligence in Government Act’s four sponsors from the previous Congress — Sens. Brian Schatz, D-Hawaii, Cory Gardner, R-Colo., Rob Portman, R-Ohio, and Kamala Harris, D-Calif. — announced Wednesday that they plan to reintroduce it. The legislation was first introduced by the same group in September 2018 but stalled as that Congress came to a close.
The bill, broadly, aims to “improve the use of AI across the federal government by providing access to technical expertise and streamlining hiring within the agencies.” It would create a Center of Excellence for AI within the General Services Administration; establish a government advisory board on AI; direct the Office of Personnel Management to identify the skills necessary for employee competence in AI; and prompt agencies to “create governance plans to advance innovative uses and reduce barriers to AI for the benefit of the public,” according to a news release.
“We can’t continue to lead the world in AI technology if our own government isn’t making the most of it,” Schatz said in a statement. “Our bill will give the federal government the resources it needs to hire experts, do research, and work across federal agencies to use AI technologies in smart and effective ways.”
Reps. Jerry McNerney, D-Calif., and Mark Meadows, R-N.C., introduced a companion bill in the House the same day.
The bill also boasts support from a healthy list of outside companies, industry groups and think tanks, including the Center for Democracy and Technology, the Internet Association, Facebook and others.
AI, and specifically American leadership in AI, continues to be very popular with the current administration. In March the White House launched AI.gov, a new landing page for all the federal government’s efforts around artificial intelligence.
Should agencies pilot zero trust now?
Agencies building zero-trust networks should start with a single, successful application and modularize successive components, according to public and private sector security experts.
Introduced in 2004, zero trust (ZT) is a cybersecurity framework rooted in the notion that the network is always hostile and every device, user and flow must be continuously authorized whether they’re local or not.
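The “never trust, always verify” idea reduces to a policy check on every request, regardless of where it originates. The sketch below is purely illustrative — the posture checks, resource names and policy table are hypothetical, not any agency’s or vendor’s actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Request:
    user_authenticated: bool   # e.g., MFA completed for this session
    device_compliant: bool     # e.g., managed, patched endpoint
    resource: str              # what the flow is trying to reach
    user_roles: frozenset      # entitlements attached to the identity

# Hypothetical per-resource entitlement table.
POLICY = {"payroll-db": {"finance"}, "wiki": {"staff", "finance"}}

def authorize(req: Request) -> bool:
    """Zero trust: no implicit trust from network location. Every request
    must prove user identity, device posture and per-resource entitlement."""
    if not (req.user_authenticated and req.device_compliant):
        return False
    allowed_roles = POLICY.get(req.resource, set())
    return bool(allowed_roles & req.user_roles)
```

Note that a request from inside the perimeter fails exactly like one from outside if any check fails — that is the property that distinguishes ZT from traditional perimeter defense.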
A recent American Council for Technology and Industry Advisory Council report found that no single vendor currently offers a holistic ZT solution. But agencies performing information technology modernization can make apps with security baked in and through repetition can “be well on their way” to having a zero-trust environment, Sean Frazier, advisory chief information security officer of federal at Duo Security, told FedScoop.
“The biggest challenge that government agencies have is they tend to try and boil the ocean, so they tend to look at zero trust and go, ‘I’m going to layer this over my entire agency,’” Frazier said. “That will never happen.”
As agencies do assessments they may find that they’re “a little zero-trusty anyway,” he added.
Many agencies already have strong identity management tied to an authenticator as part of compliance, said Steven Hernandez, CISO at the Department of Education, on Wednesday during an ACT-IAC panel that also included Frazier.
Using authenticators — such as physical security tokens or apps that generate temporary codes exclusive to the device — is already an emphasis of ZT. So the next step, Hernandez said, is to automate the process and tie authentication to behavioral analysis. The result is that the process is tied not to something that a person has, but to physical traits that are unique to them.
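Temporary-code authenticators of this kind typically implement the TOTP standard (RFC 6238): a shared secret and the current 30-second time window feed an HMAC, so codes expire quickly and are bound to the enrolled device. A minimal sketch, using the RFC’s reference key for illustration:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, step=30, digits=6):
    """RFC 6238 TOTP: HMAC-SHA1 of the current time-step counter,
    dynamically truncated to a short numeric code."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Base32 encoding of the RFC 6238 reference secret ("12345678901234567890").
SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(SECRET))  # a fresh 6-digit code every 30 seconds
```

Because both sides derive the code from the same secret and clock, the server can verify it without the code ever traveling in a reusable form.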
“This checks many boxes, not only in the [National Institute of Standards and Technology] space, but also what [the Office of Management and Budget] is asking us to do and a lot of the counterintelligence folks are asking us to do on the [Director of National Intelligence] side,” Hernandez said. “So zero trust can drive a lot of the compliance programs we already have, if we do it right.”
Endpoint management, continuous diagnostics and mitigation, software-defined networking, microsegmentation and cloud monitoring are other things agencies are already doing that can easily be leveraged as part of ZT rather than replaced, he added.
While Frazier argued agencies should get started on ZT now and identify additional data sets as they go, Hernandez advocated for a more measured approach.
“We have to understand our data; we have to understand how our users interact with that data and how important that data is to our mission,” Hernandez said. “If we can’t answer that fundamental question, then don’t even start down the zero-trust path because you’re going to spend a lot of money on a very expensive capability — that’s incredible and can do amazing things — but you’re probably going to [distributed denial of service] yourself before you actually deliver something that looks like zero trust.”
This is particularly challenging when vendors don’t want to give agencies data from their tools about risk to the enterprise, he added.
NIST has an opportunity to standardize ZT so that agencies and vendors have a shared language around implementation — particularly important because every agency’s ZT framework will involve multiple vendors, Frazier said.
In the meantime, agencies can still take the ZT plunge, said Jeffrey Flick, acting director of the Enterprise Network Program Office within the National Oceanic and Atmospheric Administration.
“From a federal perspective, you have to be willing to try a pilot,” Flick said. “We’re not real good at that.”
GSA floats training program to get ‘FedRAMP way out there’
The Federal Risk and Authorization Management Program (FedRAMP) office may begin offering hands-on training for federal security officials.
Leaders from the General Services Administration-based office, which is in charge of governmentwide cloud security compliance, floated the idea during a panel appearance at the Cloud Security Alliance’s federal summit Tuesday afternoon.
“We’re going to start bringing security officers into our office, give them some training on FedRAMP, radicalize them to our methodologies…” Zach Baldwin, program manager at FedRAMP, said to laughter in the room. “My wife is a terrorism analyst,” he responded, explaining his “inappropriate” word choice.
Selected security officers would “work through a FedRAMP [authority to operate] project” and then be sent back to their respective agencies, he went on. The initiative would be a “grassroots effort” to “get the FedRAMP way out there.”
The concept bears some resemblance to existing federal training and knowledge-sharing programs.
At the Defense Innovation Unit in Silicon Valley, there’s HACQer, an immersive bootcamp that gives acquisition officials from across the Department of Defense a crash course in the way DIU uses its other transaction agreement (OTA) authority to do iterative contracting. The program, which was first launched in Spring 2018, recently chose its 2019 cohort.
It’s unclear how developed the idea for a FedRAMP training program is. FedScoop has reached out to GSA for further details.
FedRAMP currently offers a number of DIY online training opportunities that aim to “provide all stakeholders with a deeper understanding of FedRAMP and the level of effort that is required to successfully complete a FedRAMP assessment.” These trainings are targeted at cloud service providers and Third Party Assessment Organizations.
A recent report by GSA’s inspector general found that FedRAMP’s program office “has not established an adequate structure comprising its mission, goals, and objectives for assisting the federal government with the adoption of secure cloud services.” This lack of a clear and concise mission, the IG argues, means the office can’t really assess its own effectiveness.