Date set for oral arguments in Oracle’s JEDI lawsuit

Oracle will finally get its day in court to be heard on why it thinks the Pentagon’s multibillion-dollar cloud computing acquisition is unfair and limits competition.

The Court of Federal Claims will hold oral arguments in Oracle’s case against the Department of Defense’s $10 billion Joint Enterprise Defense Infrastructure (JEDI) cloud contract July 10 in Washington, D.C., according to a new order from Judge Eric Bruggink.

At that hearing, Bruggink is expected to make a decision on the pre-bid protest, which Oracle first filed in December. Oracle also protested the contract with the Government Accountability Office last August, but it was ultimately denied in November 2018.

Oracle’s argument has always centered on its belief that DOD’s limiting of the contract to a single award is unlawful. But over time, its case has grown to include allegations that the Pentagon and contract frontrunner Amazon Web Services have conflicts of interest. DOD has repeatedly cleared the contract of any such conflicts, but Oracle, as recently as this week, continues to press allegations of improper connections between the military and AWS.

The court’s decision should close the book on Oracle’s JEDI protest, at least for a while. If the ruling doesn’t favor Oracle, the company may well protest the contract again, likely after an award. Oracle has also taken its case to Capitol Hill, asking lawmakers to exercise “oversight authority regarding the JEDI procurement.”

Once the case is settled, DOD would be cleared to make an award to one of the two vendors — Microsoft and AWS — that meet the procurement’s “competitive range” requirements.

Industry groups and activists urge Senate to support funding for OTA in 2020

A broad group of industry associations and advocates led by the Lincoln Network and Demand Progress sent a letter to the Senate Appropriations Committee on Thursday urging lawmakers to support funding for the reestablishment of the Office of Technology Assessment.

“We write to express our concern that Congress does not have sufficient capacity to tackle 21st century science and technology policy challenges,” the letter reads. “Accordingly, we urge you to prioritize efforts to augment this institutional capacity, including providing funding for the Office of Technology Assessment (OTA), as part of the fiscal year 2020 Legislative Branch Appropriations bill.”

The letter is signed by groups like Code for America, the American Civil Liberties Union, R Street Institute and more. It also bears the signatures of individuals including former U.S. Chief Data Scientist DJ Patil, former Deputy Chief Technology Officer of the United States Nick Sinai and others.

A draft 2020 spending bill, which was advanced by the House Appropriations Committee on Thursday, currently includes $6 million to relaunch the office.

OTA provided members and committees with objective, forward-looking reports on the impacts of science and technology developments from when it was established in 1972 until it was defunded and shuttered in 1995.

The revival of the office also boasts support among former members of the legislative branch.

“It’s time to bring back the Office of Technology Assessment,” Congressman Vic Fazio, who represented California’s third district from 1979 through 1999, said during a recent hearing of the Select Committee for the Modernization of Congress. “I think we’ve all been embarrassed by the way Congress fails to understand technology. OTA needs to come back in some form.”

Near-annual attempts to bring back OTA, however, have so far fallen short. Last summer, for example, an amendment that would have reestablished the office with a $2.5 million budget failed to pass.

It’s OK for government to joke around on social media sometimes, comms leaders say

When it comes to citizen outreach, social media can be a place to bring the human out from behind the bureaucracy.

During a panel on “multichannel outreach” at Adobe’s Digital Government Symposium Thursday, communications leaders from the Department of Interior, National Science Foundation and the Bureau of Alcohol, Tobacco, Firearms and Explosives discussed the respective benefits of different media channels.

“I think social media is where people now prefer to engage,” said Jennifer Plozai, lead of external affairs at NSF. “We are engaging as real people, we are showing empathy, we are joking with people when they are joking with us.”

Before joining NSF, Plozai worked at the Transportation Security Administration where she helped launch @AskTSA — a social media-based customer service effort to answer people’s questions about travel and safety requirements. Plozai gave an example of how @AskTSA has used humor to its advantage.

“We had people asking us what they could bring on planes,” she said. “And they just started sending us photos — it wouldn’t have any text with it, just photos, because a picture is worth a thousand words. So they’d send us pictures of, like, a lightsaber, you know, ‘I’m going to Comic-Con, can I bring my lightsaber on the plane?’ And rather than just giving a typical government response … we would say, you know, ‘Your lightsaber is good to go, but we’re afraid you’ve just told the Jedi you’re coming.’”

The team at TSA, she said, worked at “really being engaging, and making that OK to do as a government agency.”

A few government agencies have managed to create truly unique social media presences, and humor often has a lot to do with it. The Consumer Product Safety Commission’s Twitter account, for example, leans heavily into weird memes. Meanwhile, TSA’s Instagram account, which is mostly dedicated to pictures of the very strange and surprising things people try to get through security checkpoints, also often includes funny captions.

“People don’t come to a government social media account and expect to see humor,” the late Bob Burns, the brains behind TSA’s Instagram, said during a Facebook Live appearance in December 2017.

Joe Galbo, who runs the CPSC Twitter account, also sees value in surprising people. “Doing the serious messaging constantly — people will tune you out,” he told FedScoop in an interview last year. “You have to mix it up a little bit.”

Agencies trying to find their ‘dark data’ face policy, leadership hurdles

Most IT managers agree finding and capturing dark and grey data should be a top priority, but antiquated policies and lack of senior-level support remain major hurdles.

Dark data describes all the unknown and therefore unused data across an agency, while grey data is known but unused.

San Francisco-based software company Splunk released a survey of 1,357 IT managers April 30 that found 56 percent of public sector data is assumed to be dark or grey. While 77 percent of public sector respondents said locating and using that data was paramount, 76 percent said lack of support from senior agency leadership was a challenge.

The Federal Deposit Insurance Corp. is wrestling with how to more efficiently and securely share all the personally identifiable information it collects from banks — short of reengineering its systems.

“It’s not magic,” Howard Whyte, CIO and chief privacy officer at FDIC, said Tuesday at the Splunk GovSummit. “You have to set a policy, and you have to go out and market the capabilities you’re trying to deliver and show value to the corporation.”

First, FDIC has to model its data and then look at automation, Whyte said.

That’s easier said than done: 82 percent of public sector IT managers cited mistrust of artificial intelligence and a lack of knowledge about what can be automated within their agencies, according to the Splunk report.

Whereas machine learning helps identify patterns to get value out of grey data, AI is the key to finding and analyzing dark data, Frank Dimina, vice president of public sector at Splunk, told FedScoop.

“But AI won’t work if we’re not supplying it with massive data sets to make the technology smart,” Dimina said. “And I think when the dust settles, that’s when we’ll see some really interesting use cases and success stories.”

FDIC has the luxury of owning its own data center, so it’s now able to map where its data sets are and make decisions about consolidating data for better usage and security. The agency has also started running analytics on the data where it resides to do work faster for banks, identify data misuse cases and flag them for immediate action, Whyte said.

Ensuring leadership understands the importance of investing in such capabilities is critical, he added.

The Joint Special Operations Command is grappling with who should have access to its data, said Col. Carl “Jeff” Worthington, director of C4 systems within JSOC.

“Inside JSOC we have the Dothraki — those are the Rangers, we send them out ahead — we have the Tullys, and we have the Karstarks, and we have the Knights of the Vale,” Worthington said, alluding to groups of characters on the popular TV show “Game of Thrones.” “And no one really wants to tell everyone everything because there’s danger in that, when you open yourself up and you show them your cards, so it’s been a struggle at times.”

Worthington controls data sharing within JSOC IT operations but said even within his organization it’s hard getting people to understand the value.

When it comes to making dark and grey data actionable, IT and cybersecurity are “low-hanging fruit,” Dimina said.

“But what they really should be using it for is to impact the mission of government — to make smarter investments at the public level, to have more openness with citizens, to deliver better services to their citizens, to improve the security of the nation,” he said.

The U.S. Postal Service is using its data to monitor the health of its applications and proactively go after cyberthreats by monitoring for excessive failed password attempts.
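The kind of monitoring described here — flagging accounts with excessive failed password attempts — can be illustrated with a simple sliding-window count. This is a generic sketch; the window length and alert threshold below are assumed values for illustration, not USPS’s actual rules:

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 300  # assumed: 5-minute sliding window
THRESHOLD = 5         # assumed: failure count that triggers an alert

def flag_excessive_failures(events, window=WINDOW_SECONDS, threshold=THRESHOLD):
    """events: (timestamp, user, succeeded) tuples in timestamp order.
    Returns the set of users who hit `threshold` failures inside `window`."""
    recent = defaultdict(deque)  # user -> timestamps of recent failures
    flagged = set()
    for ts, user, succeeded in events:
        if succeeded:
            continue
        q = recent[user]
        q.append(ts)
        while ts - q[0] > window:  # drop failures that fell out of the window
            q.popleft()
        if len(q) >= threshold:
            flagged.add(user)
    return flagged
```

In production this logic would typically run as a streaming query over authentication logs rather than an in-memory loop, but the detection rule is the same.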

And the Department of Homeland Security’s DevOps team is tracking code check-ins when developers upload code for administrative review to gauge not only efficiency but count the number of bugs.

Another sign agencies recognize dark and grey data are a problem is the OPEN Government Data Act’s requirement that they appoint a chief data officer, Dimina said. While many agencies have appointed one, not all have yet.

FDIC is considering appointing a CDO.

“And that person should own that framework — the data architecture,” Whyte said. “I’m not saying that everyone can’t use data, but someone has to be responsible for sharing it through the organization and making sure it’s providing the value that’s needed.”

Artificial Intelligence in Government Act is back, with ‘smart and effective’ use on senators’ minds

Editor’s Note: This story has been updated to reflect the introduction of companion legislation in the House.


Senate legislation to boost the government’s use of artificial intelligence is returning with bipartisan sponsorship and a long list of industry supporters.

The Artificial Intelligence in Government Act’s four sponsors from the previous Congress — Sens. Brian Schatz, D-Hawaii, Cory Gardner, R-Colo., Rob Portman, R-Ohio, and Kamala Harris, D-Calif. — announced Wednesday that they plan to reintroduce it. The legislation was first introduced by the same group in September 2018 but stalled as the previous Congress came to a close.

The bill, broadly, aims to “improve the use of AI across the federal government by providing access to technical expertise and streamlining hiring within the agencies.” It would create a Center of Excellence for AI within the General Services Administration; establish a government advisory board on AI; direct the Office of Personnel Management to identify the skills necessary for employee competence in AI; and prompt agencies to “create governance plans to advance innovative uses and reduce barriers to AI for the benefit of the public,” according to a news release.

“We can’t continue to lead the world in AI technology if our own government isn’t making the most of it,” Schatz said in a statement. “Our bill will give the federal government the resources it needs to hire experts, do research, and work across federal agencies to use AI technologies in smart and effective ways.”

Reps. Jerry McNerney, D-Calif., and Mark Meadows, R-N.C., introduced a companion bill in the House the same day.

The bill also boasts support from a healthy list of outside companies, industry groups and think tanks, including the Center for Democracy and Technology, the Internet Association, Facebook and others.

AI, and specifically American leadership in AI, continues to be very popular with the current administration. In March the White House launched AI.gov, a new landing page for all the federal government’s efforts around artificial intelligence.

Should agencies pilot zero trust now?

Agencies building zero-trust networks should start with a single, successful application and modularize successive components, according to public and private sector security experts.

Introduced in 2004, zero trust (ZT) is a cybersecurity framework rooted in the notion that the network is always hostile and every device, user and flow must be continuously authorized whether they’re local or not.
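As a rough illustration of that notion — authorization decided per request from user and device posture, with network location granting nothing — here is a minimal Python sketch. It is purely conceptual, not any agency’s or vendor’s actual implementation:

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool  # e.g., user verified via MFA this session
    device_compliant: bool    # e.g., managed, patched endpoint
    on_local_network: bool    # deliberately ignored by the policy below

def authorize(req: AccessRequest) -> bool:
    # Zero trust: every request is evaluated on user and device posture;
    # being "inside" the network perimeter grants nothing.
    return req.user_authenticated and req.device_compliant

# A request from the local network is still denied without a compliant device.
print(authorize(AccessRequest(True, False, on_local_network=True)))  # False
```

A real zero-trust policy engine would weigh many more continuously refreshed signals (behavior, location, data sensitivity), but the shape of the decision is the same: re-check everything, every flow.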

A recent American Council for Technology and Industry Advisory Council report found that no single vendor currently offers a holistic ZT solution. But agencies performing information technology modernization can make apps with security baked in and through repetition can “be well on their way” to having a zero-trust environment, Sean Frazier, advisory chief information security officer of federal at Duo Security, told FedScoop.

“The biggest challenge that government agencies have is they tend to try and boil the ocean, so they tend to look at zero trust and go, ‘I’m going to layer this over my entire agency,’” Frazier said. “That will never happen.”

As agencies do assessments they may find that they’re “a little zero-trusty anyway,” he added.

Many agencies already have strong identity management tied to an authenticator as part of compliance, said Steven Hernandez, CISO at the Department of Education, on Wednesday during an ACT-IAC panel that also included Frazier.

Using authenticators — such as physical security tokens or apps that generate temporary codes exclusive to the device — is already an emphasis of ZT. So the next step, Hernandez said, is to automate the process and tie authentication to behavioral analysis. The result is that the process is tied not to something that a person has, but to physical traits that are unique to them.
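The temporary codes such authenticator apps generate typically follow the TOTP scheme standardized in RFC 6238: an HMAC over a time-step counter, truncated to a short numeric code. A minimal sketch of that scheme (illustrative only, not any agency’s implementation):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, interval=30, digits=6, now=None):
    """Time-based one-time password in the style of RFC 6238 (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if now is None else now) // interval)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Server and device compute the same code from a shared secret, so a matching code proves possession of the device without the secret ever crossing the wire; the code rotates every `interval` seconds.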

“This checks many boxes, not only in the [National Institute of Standards and Technology] space, but also what [the Office of Management and Budget] is asking us to do and a lot of the counterintelligence folks are asking us to do on the [Director of National Intelligence] side,” Hernandez said. “So zero trust can drive a lot of the compliance programs we already have, if we do it right.”

Endpoint management, continuous diagnostics and mitigation, software-defined networking, microsegmentation and cloud monitoring are other things agencies are already doing that can easily be leveraged as part of ZT rather than replaced, he added.

While Frazier argued agencies should get started on ZT now and identify additional data sets as they go, Hernandez advocated for a more measured approach.

“We have to understand our data; we have to understand how our users interact with that data and how important that data is to our mission,” Hernandez said. “If we can’t answer that fundamental question, then don’t even start down the zero-trust path because you’re going to spend a lot of money on a very expensive capability — that’s incredible and can do amazing things — but you’re probably going to [distributed denial of service] yourself before you actually deliver something that looks like zero trust.”

This is particularly challenging when vendors don’t want to give agencies data from their tools about risk to the enterprise, he added.

NIST has an opportunity to standardize ZT so that agencies and vendors have a shared language around implementation — particularly important because every agency’s ZT framework will involve multiple vendors, Frazier said.

In the meantime, agencies can still take the ZT plunge, said Jeffrey Flick, acting director of the Enterprise Network Program Office within the National Oceanic and Atmospheric Administration.

“From a federal perspective, you have to be willing to try a pilot,” Flick said. “We’re not real good at that.”

GSA floats training program to get ‘FedRAMP way out there’

The Federal Risk and Authorization Management Program (FedRAMP) office may begin offering hands-on training for federal security officials.

Leaders from the General Services Administration-based office, which is in charge of governmentwide cloud security compliance, floated the idea during a panel appearance at the Cloud Security Alliance’s federal summit Tuesday afternoon.

“We’re going to start bringing security officers into our office, give them some training on FedRAMP, radicalize them to our methodologies…” Zach Baldwin, program manager at FedRAMP, said to laughter in the room. “My wife is a terrorism analyst,” he responded, explaining his “inappropriate” word choice.

Selected security officers would “work through a FedRAMP [authority to operate] project” and then be sent back to their respective agencies, he went on. The initiative would be a “grassroots effort” to “get the FedRAMP way out there.”

The concept bears some resemblance to existing federal training and knowledge-sharing programs.

At the Defense Innovation Unit in Silicon Valley, there’s HACQer, an immersive bootcamp that gives acquisition officials from across the Department of Defense a crash course in the way DIU uses its other transaction agreement (OTA) authority to do iterative contracting. The program, which was first launched in Spring 2018, recently chose its 2019 cohort.

It’s unclear how developed the idea for a FedRAMP training program is. FedScoop has reached out to GSA for further details.

FedRAMP currently offers a number of DIY online training opportunities that aim to “provide all stakeholders with a deeper understanding of FedRAMP and the level of effort that is required to successfully complete a FedRAMP assessment.” These trainings are targeted at cloud service providers and Third Party Assessment Organizations.

A recent report by GSA’s inspector general found that FedRAMP’s program office “has not established an adequate structure comprising its mission, goals, and objectives for assisting the federal government with the adoption of secure cloud services.” This lack of a clear and concise mission, the IG argues, means the office can’t really assess its own effectiveness.

Oracle continues to cite AWS-Pentagon ties in revamped JEDI lawsuit

Oracle is not letting go of its allegations that the Department of Defense’s $10 billion landmark cloud procurement was tailor-made for its biggest competitor, Amazon Web Services.

Despite the Pentagon concluding last month that there are no conflicts of interest surrounding the ongoing Joint Enterprise Defense Infrastructure (JEDI) procurement, Oracle filed a supplemental Court of Federal Claims complaint, made public Tuesday, once again questioning the ties between DOD and AWS. Oracle’s claims are centered on a set of individuals who have worked for both the Pentagon and AWS and played a role in the JEDI acquisition during their time with DOD.

While the new complaint follows the same basic framework as Oracle’s initial lawsuit, its allegations introduce new ripples into the ongoing saga around the JEDI procurement. Oracle now claims that former DOD employee Deap Ubhi and a Navy official whose name was redacted in the document were offered jobs and bonuses by AWS while working on the contract.

“Neither official timely disclosed his employment dealings with AWS to DOD or timely recused himself from JEDI. Instead, both Ubhi and [employee 2] participated in JEDI, accessing sensitive procurement information even after accepting job offers from AWS,” Oracle writes in the updated lawsuit. The company claims that DOD found both men in violation of a Federal Acquisition Regulation’s statute on conflicts of interest.

Oracle also presses on the involvement of Anthony DeMartino, a Pentagon employee who worked for the deputy secretary of Defense and formerly consulted for AWS. A prior DOD investigation cleared DeMartino of any conflict of interest, but Oracle is challenging that finding.

AWS and DOD declined to comment.

The Pentagon’s inspector general is currently looking into possible ethical violations surrounding the acquisition, a DOD spokeswoman announced in April, when a department investigation into possible conflicts of interest determined “there is no adverse impact on the integrity of the acquisition process.”

Meanwhile, Oracle has been taking its case to Capitol Hill as well, writing letters to lawmakers, asking them to exercise “oversight authority regarding the JEDI procurement.” Some have responded.

“The size and scope of this contract highlights the need for a completely aboveboard process. I understand that there’s ongoing litigation, but congressional oversight has an important role to play in making sure the Defense Department is properly using taxpayer dollars and maintaining policies to keep people from potentially gaming the system,” Sen. Chuck Grassley, R-Iowa, said in April.

Likewise, in a recent appropriations hearing, Rep. Steve Womack, R-Ark., called JEDI “an ill-conceived strategy,” saying it is “geared toward producing a desired outcome,” and asked why the Pentagon hasn’t changed its course despite ongoing criticism.

Acting Defense Secretary Patrick Shanahan responded by downplaying the significance of JEDI.

“Across the department, there is a proliferation in terms of implementing clouds. Everyone was moving to the cloud,” Shanahan said. “The JEDI competition is about creating a pathway so that we can move as a department on a small scale. This isn’t wholesale. This sometimes gets advertised as this is winner take all. This is winner take all for a very small subset of the amount of cloud infrastructure we’re going to have to build out over time.”

Despite Oracle’s persistence, it appears that an actual JEDI award may be getting closer. Last month, DOD narrowed the pool of bidders to just Microsoft and AWS as the only cloud providers that meet the procurement’s “competitive range” requirements. The lawsuit will continue into July, when the Court of Federal Claims will hear final oral arguments before making its decision. The court has barred DOD from making an award before July 19.

First QSMOs are ‘proof points’ for new shared services model

Moving agencies to integrated systems in accordance with the Office of Management and Budget’s new shared services policy won’t be easy, but the methodology is already being refined.

OMB released a memo on April 26 designating four initial agencies as hosts of quality service management offices, or QSMOs, charged with leading governmentwide adoption of common technology solutions.

Bigger, back-office lines of business like financial services and grants management were targeted first to best develop a replicable process for creating marketplaces, Margie Graves, U.S. deputy chief information officer, told FedScoop after speaking at the CFO/CIO Summit 2019.

“That’s why when you’re launching these — and I would call them proof points — it sort of lights the way because the flywheel moves faster after you get past those first few,” Graves said.

On the financial management side, the Department of Treasury and Department of Health and Human Services are the QSMOs for finance and federal grants, respectively. Treasury “is a little better off” because it’s been experimenting with shared services for the past five years, said Tim Soltis, U.S. deputy controller, during the panel discussion.

Despite the progress, the finance side “is still going to be a challenge because you’ve got 24 agencies plus … around 40 systems, 19 other contracts that are maintaining the infrastructure of agency financial systems,” Soltis said. “Grants? Multiply that by 10; I mean that is going to be a major hassle to try and figure out just what the landscape is and where the lines are.”

OMB has been meeting with larger departments to figure out immediate needs and plan intermediate courses of action because, unlike with the last shared services policy memo in 2009, agencies won’t be kept waiting until a solution is ready perhaps a decade later, Soltis said.

“Everyone’s eyes glaze over when we talk about architecture,” Graves said. “But if we don’t do these plug-and-play, modular, Lego approaches to how we build these things and put them together, then we will always be in a situation where we will have to do the next hard thing.”

Gone are the days of agencies building their own, highly customized solutions that “no one else can use because it loses its license,” Soltis said.

Vendors of duplicative electronic invoicing, travel, payroll, and acquisition systems may “need to go out of business” because those systems are “built to not be interchangeable,” he added.

OMB isn’t done identifying new lines of business, and QSMOs for assisted acquisition, contract writing systems, customer experience, FOIA, travel, and real property management are forthcoming, according to the official website.

The biggest difference with the new shared services policy is governance and end state, Soltis said.

“Because the lines of business never had an end state,” he said. “It was only ever going to get to standardization.”

Trump’s choice for OPM director says she’ll listen closely to tech leaders

The nominee to lead the Office of Personnel Management told senators Tuesday that her past experience in working on the congressional response to the agency’s major 2015 data breach will help inform how she handles IT issues related to any major changes at OPM.

Dale Cabaniss, who was a senior Republican aide on the Senate Appropriations subcommittee with jurisdiction over OPM when the breach happened, said she saw first-hand how important it is to work closely with officials who understand technology and can make independent assessments of it. Her confirmation process comes as the Trump administration is moving toward merging OPM with the General Services Administration — an idea that has been floated for years by Republican and Democratic officials in the interest of putting more administrative processes under one roof.

“I think there would just have to be a real partnership between me and the CIO, who I’ve met with, as well as the [Office of the Inspector General], and their folks who work on IT,” she told members of the Homeland Security and Governmental Affairs Committee at a confirmation hearing. OPM’s CIO, Clare Martorana, and her deputy, David Nesting, were installed in February and both are veterans of the U.S. Digital Service.

The OIG in particular plays “a really, really important role to making sure that any kind of risk assessment is done, and that no changes are made until people are confident there’s not going to be a problem,” Cabaniss said. “Because the last thing that we need to do is make things more difficult for federal employees.”

Cabaniss noted that her own information was compromised in the 2015 breach, which she monitored closely as Republican staff director at the Appropriations Subcommittee on Financial Services and General Government. She was upbeat about OPM’s work on improving its technology and cybersecurity.

“I am more positive about OPM’s IT improvement than I’ve been in the past,” Cabaniss said.

She said she met briefly with Martorana and Nesting after being nominated by President Trump.

“They came from [USDS], they’re part of the original group who came in during the Obama administration,” she said. “They’re just incredible technical people who really are here just because they want to serve.”

Cabaniss’ nomination appears to face few obstacles, as Democrats on the committee mostly asked her to offer assurances that once confirmed, she will be responsive to any of their requests for information related to oversight.

‘Bottom up’ process at USAJobs

Committee member Sen. James Lankford, R-Okla., asked Cabaniss about the federal workforce site USAJobs.gov, which still faces criticism years after the government brought it in-house from private company Monster.com. “We get constant complaints from people that can’t find any of their listings on USAJobs, or if you don’t know the secret keywords to be able to get to it, you can’t actually navigate it. How do we fix this?” the senator asked.

Cabaniss said the site is due for a “bottom-up review,” as well as more work to ensure that agencies are improving how they communicate with applicants once they’ve actually gotten their paperwork through to the other side.

“It’s something that we’ve really got to take a look at, because when my kids can apply for a job on their phone and get an answer within … a matter of hours … I don’t know how we compete against that,” Cabaniss said.

Wooten nomination

The hearing also included consideration of Michael Wooten to be administrator of the Office of Federal Procurement Policy at the Office of Management and Budget. Many of his answers about improving federal acquisition processes pointed to the private sector — especially finding ways to leverage what industry is already doing. Like Cabaniss’ nomination, Wooten’s appeared to face little opposition from Democrats on the panel.

In the past, commercial off-the-shelf (COTS) technology typically hasn’t been treated right in the federal government, said Wooten, whose long career in public service has included about a decade at the Defense Acquisition University, a Department of Defense agency that trains military personnel, civilian staff and contractors.

The government actively goes looking for those solutions, “but then we proceed to break the COTS solution and then try to retrofit it into the peculiar set of government policies or practices,” Wooten said. “That needs to end … we need to ask ourselves if we can retrofit the process instead of the product.”