
Big data just a start for Homeland Security geeks

HSARPA may not have the clout or profile of its defense counterpart, but its work on big data and cybersecurity is making its way into agency programs.

Greg Otto
Technology Reporter - FedScoop

Stephen Dennis is helping the government realize what can be done with big data, from cybersecurity to procurement to counterproliferation. 

Dennis is doing this as part of the Homeland Security Advanced Research Projects Agency, or HSARPA, a small office within the Department of Homeland Security’s Science & Technology Directorate that focuses on emerging technologies — and how they can be useful across the sprawling department, with its many component agencies and their various missions. 

Dennis has primarily worked with big data, experimenting with the massive amounts of information DHS creates to find new ways for agencies to streamline both internal and mission-based operations. He has been working in the field so long that he scoffed at the notion of “big data” being a buzzword when he talked to FedScoop on Wednesday. 

“We’ve always had more data than the machines can process,” Dennis said. “The revolution we are talking about here is the technological innovation, a pipeline of technologies that make it possible for us to harness commodity technologies, which is great, but there’s a next generation to that.” 

That next generation is finding how big data tools can be applied inside agencies. One example Dennis cited is a tool created for DHS' Immigration and Customs Enforcement that allowed officials to use trade data as part of the agency's counterproliferation investigations program. Born out of an experiment, the tool is now used by ICE to track and prevent weapons from falling into the hands of terrorists and criminals. 

“We just found excess servers, racked them in an office, got a Hadoop cluster up and running, and started to look at what it meant to look over the history of data as opposed to whatever the frame is in front of you,” Dennis said. “From that, we were able to make a well-supported pitch to make a program out of that.” 

Since then, Dennis said, his office holds monthly workshops at DHS, which give his team insight into where they should focus their research. One area HSARPA focuses on is cybersecurity: Dennis said big data could give rise to the autonomous monitoring functions that pave the way for future versions of the department's Continuous Diagnostics and Mitigation program. But for that to happen, the infrastructure needed to support big data will have to be installed first. 

“None of the legacy systems in place will process the data the way that you need them to,” he said. “Part of it is giving [the agency offices] the infrastructure and techniques, and then teaching them to fish.” 

One of the ways Dennis believes agencies could promote security from within is by relying on commercially supported open source platforms, like Cloudera or Red Hat, for their big data tools. Even though he believes the government’s opinion on open source is “mixed,” his office prefers such solutions because researchers can examine whether the code is secure before deciding to use it. 

With commercially supported open source, "a lot of thought and hardening comes to the table,” Dennis said. “People are trying to make a living out of that code, so they really need it to be bulletproof. As long as we can verify the lineage and the source of the code and there is a responsible party behind it, that is a big plus for us. In our lab, we don’t have any code that wasn’t hardened the right way.”

But for big data to be used more widely across the government, Dennis pointed to something many other federal IT workers often harp on: Procurement needs to change. 

“The acquisition methodologies that are used for research and development aren’t necessarily in tune,” he said. “In a perfect world, the government would be able to jump into an experiment, figure out what works really well, and be able to act on the results based on the value of the experiment. I found myself having to justify that every tool I bought was the best, and I thought, ‘This doesn't make any sense.’”

Contact the reporter on this story via email at greg.otto@fedscoop.com, or follow him on Twitter at @gregotto. His OTR and PGP info can be found here.
