Tiptoeing out of the intel contracting clique — NGA tries to change its ways

The government’s eye in the sky is undergoing a major shift toward greater openness and leveraging new acquisition techniques to attract nontraditional companies, strategies one of its officials told FedScoop are actually crucial to staying ahead of adversaries.

A new office in the National Geospatial-Intelligence Agency has been fostering internal conversations on using prize challenges, DevOps and other ideas to improve acquisition, and the agency recently held its second-ever challenge competition to try to foster new relationships with industry, Air Force Col. Marc DiPaolo, chief of mainstreaming capabilities in NGA’s Enterprise Innovation Office, told FedScoop.

But while the agency is making progress in that direction, DiPaolo noted those discussions take time and require a culture shift for the notoriously low-profile, closed agency.

“It’s a paradigm shift, but this is a case where if we’re too cautious, our risk will go up because our adversaries, believe me, are working in a very fast cycle. And if our secrecy causes us to go too slow, then we’re not really securing anything,” DiPaolo said. “It’s a culture change. But I think NGA is making that culture change. Our director has demanded it, but [also] the people inside NGA understand that we have to do things differently.”


Coming up next, DiPaolo said the agency is embarking on a new major software development project. While he couldn’t elaborate before publication on the development’s exact specifications, he said the way the agency is going about it is what really makes it different.

“It’s a very technical, very complicated software development effort and we’re going to do it open source, and we’re going to do it using DevOps concepts from the ground up,” DiPaolo said. “And this first project, we believe, is going to be our first real big step in digital transformation of capabilities.”

The agency also just finished the design process for artificial intelligence capabilities and is beginning prototyping, DiPaolo said.

The ideas, he said, were born out of an innovation focus area team on artificial intelligence.

“We’re not the only ones inside NGA looking at this by the way — there are several offices that are looking at artificial intelligence. But we started with the design process and that gave us some insights that sort of clarified that our initial assumptions about where we should start were just wrong,” he said. “Because it wasn’t going to solve the real user’s problem.”


With all of this work going on, the Disparate Data Challenge that finished up at the end of October was just the most recent public-facing effort toward greater agency openness, DiPaolo said.

“The really important developments in the geospatial market are happening so far outside of government that we’re afraid that we’re not even aware of them,” he said.

DiPaolo’s office was established nearly a year ago in October 2015, an effort he says was created to “start to get reconnected to those sources of innovation.”

“As we started to do that and interact with these smaller nontraditional companies, what became obvious to us is we can’t just change where we look for innovation, but we have to change how we acquire it as well,” DiPaolo said. “Because if we can’t acquire these capabilities on timelines that are suitable for these young tech companies who can sell this stuff anywhere, then they’re not going to do deals with the government.”

It’s a common refrain around government, particularly at the Defense Department, which has been working to develop technology faster by utilizing what is called a “commercial solutions opening.”


NGA is now prototyping implementations of the challenge’s winners, DiPaolo said, putting the new capabilities in the hands of agency users for them to decide if they are effective. It will also help the agency see how the capabilities work in the classified network “against real mission data,” DiPaolo said.

He called that work “another paradigm shift,” as it is letting the users decide for themselves if the new technology is beneficial instead of leaving those decisions to a closed select group.

Using a challenge to aid acquisition

The challenge received 34 responses overall, only four of which were from large companies familiar to NGA. And six of those 34 responses were from four individuals and two universities, he said.


“The normal acquisition process never would have connected us with those types of companies and organizations and individuals,” he said. “And now we’re tapping into, you know, the source of the best ideas wherever they are.”

He also noted: “The disparate data challenge was … you could describe it as sort of a prototype for ways we could do acquisition in the future.”

In this case, the challenge served as additional market research “beyond simply posting to FedBizOpps,” he said, by gathering information on capabilities that can be brought to a group at NGA that is running a big acquisition project focused on search.

“In this case we’re informing a larger traditional acquisition,” he noted, “but you could envision a future where either it becomes the traditional acquisition and the challenge is sort of the end stages of down selection leading right to a contract award, or you could see a challenge identifying candidates for prototyping, that you might roll into something like that DIUx model, where you actually prototype these things on our classified network, and you acquire them that way.”

In addition to getting people comfortable with using a challenge as part of an acquisition, DiPaolo said his team also spent time encouraging people to look outside of NGA’s circle of traditional providers.


“That was also counter-cultural,” DiPaolo noted, “because you know we have established relationships that have worked in the past.”

He added that “limiting ourselves to those relationships in the future is just not going to lead us to where we need to go. And we need to go on offense and find those new capabilities.”

The agency isn’t sure what its next challenge will be, but officials know there will be one, DiPaolo said.

Explaining the need for change

To connect with his colleagues and explain the need for change, DiPaolo said he focuses on the level of uncertainty in which the agency is operating.


DiPaolo said commercial satellite companies and even apps like Waze are driving geospatial technology forward at an incredible pace. And when the agency tries to acquire technology by using contracting vehicles laden with requirements, DiPaolo said it is often making “faith-based decisions.”

“We’re really just engaged in faith-based decision-making very frequently. We just don’t recognize it because we surround those faith-based decisions with lots of systems engineering charts and detailed concepts of operations that make it look like we’re in a low uncertainty environment when we’re really not,” he said.

DiPaolo said he thinks applying leadership and management styles from a low-uncertainty environment in a high-uncertainty environment is what is making change initiatives die.

“This explains why you can have very senior leadership that wants to do things differently, a workforce that wants to do things differently, but all of those change initiatives sort of die in the middle in this layer of permafrost,” he said. “It’s guaranteed that you’re not going to succeed because you’ll always be late to need or you’ll always be doing something based on bogus assumptions that you haven’t tested.”

DiPaolo said agency officials are hardwired to use requirements to get at a problem, and then “we’ve bludgeoned industry with those requirements until they give us exactly what we asked for.”


“Our habitual response to a new development effort is to immediately start doing systems engineering and breaking out requirements and attempting to reduce our risk by making ever more precisely defined requirements,” he said. “But all of those requirements are guesses… Instead what we would propose is applying the scientific method: Identifying the hypotheses that are most important to the success of the new effort and then testing those new hypotheses as soon as possible and as cheaply as possible. So when we finally decide to build at scale, we know we’re building the right thing.”

Written by Samantha Ehlinger

Samantha Ehlinger is a technology reporter for FedScoop. Her work has appeared in the Houston Chronicle, Fort Worth Star-Telegram, and several McClatchy papers, including the Miami Herald and The State. She was part of a McClatchy investigative team for the “Irradiated” project on nuclear worker conditions, which won a McClatchy President’s Award. She is a graduate of Texas Christian University. Follow her on Twitter at @samehlinger.