Big data’s $1.6B role in DOD’s IT strategy

Special to FedScoop: A look at the Defense Department's big data plans as outlined at immixGroup's recent FY16 Big Data Sales Opportunities at Federal Agencies panel.

It’s become undeniable that big data technology is an essential part of an organization’s overall IT strategy. The Defense Department is expected to spend nearly $1.6 billion on big data in fiscal year 2016 to cover areas such as cyber defense analytics and situational awareness. Civilian agencies are spending even more, with an estimated $2 billion going to big data solutions and services.

In the wake of a number of highly publicized cyber attacks, federal agencies are opting for more robust security approaches. Security is evolving from solely network-centric systems into more comprehensive systems integrating data analytics that go beyond traditional data sources. The benefits go beyond security as disparate data sources are compiled and explored, revealing new areas for research and ways to optimize device performance.

In general, however, technologies are outpacing adoption, as policies are still being developed to determine how to put big data to work across all agencies.

In a recent panel on fiscal year 2016 big data sales opportunities at federal agencies, organized and hosted by immixGroup, three government panelists discussed the current state and future plans for big data. Panelists included:

  • Gary Blohm, Director, Army Architecture Integration Center, U.S. Army
  • Wo Chang, Digital Data Advisor, National Institute of Standards and Technology Information Technology Lab
  • Tom Morton, Cloud Strategist, Office of the Defense Department Chief Information Officer

Laying the federal groundwork

Blohm noted that DOD’s migration to the Joint Regional Security Stack (JRSS) is “ideal” for data analytics. Consolidating data centers and systems reduces the number of silos and fuses the data into a more centralized architecture, which lends itself to analytics-based projects. Morton agreed, and further underscored the need to protect existing data as well as data that has been recombined, while keeping budgetary limitations in mind. Morton said open source software, commercial off-the-shelf software, cloud services and analytics tools can help an organization make more efficient use of its data, and he recommended doing so in collaboration across agencies.

Chang explained the recent work of the NIST Big Data Public Working Group in writing the Big Data Interoperability Framework, which aims to create a “common lexicon” and reference architecture. NIST submitted seven documents to the International Organization for Standardization and anticipates three report versions, addressing top-level architecture, interface and validation, respectively. The first version of results should be available by the end of July, Chang said.

In the process of preparing those results, NIST identified 170 technologies, most of them open source. Given that landscape, NIST had to remain flexible in creating standards and frameworks. “People want to work fast,” Chang explained.

Asked about ways in which the government is deriving value from big data, Blohm noted that it is “challenging” to incorporate big data into enterprise solutions, so it is instead being done in “small chunks,” with progress measured by pilots that establish best practices before moving into full production. The intelligence community is taking a particularly hard look at big data across the enterprise, where solutions were previously more stovepiped.

Morton noted that the Cyber Situational Awareness Analytical Capability (CSAAC) has played an important role in increasing DOD’s understanding of infrastructure requirements. Similarly, the DOD has introduced the rapid deployment kit (RDK), a collection of open source software that serves as a standard infrastructure from which to develop analytics, in an effort to stay closer to the leading edge.

JRSS is equally important in making sense of digital information, Morton added. A compelling use of big data analytics in the DOD is in “distributed query,” Morton said. “You don’t want to move data to the query; you have to move the query to where the data is.”
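To illustrate the idea, here is a minimal, hypothetical Python sketch of a distributed query: the filtering logic is shipped to each node that holds a partition of the data, and only the small result sets travel back for aggregation. The node names, record layout and query function are illustrative assumptions, not any actual DOD or JRSS interface.

    # Hypothetical sketch: move the query to the data, not the data to the query.
    # Node names, records and thresholds are illustrative assumptions only.
    from concurrent.futures import ThreadPoolExecutor

    # In a real system each node would expose a remote execution endpoint;
    # here each "node" is simulated as a local partition of log records.
    NODE_PARTITIONS = {
        "node-east": [{"src": "10.1.1.5", "failed_logins": 3},
                      {"src": "10.1.1.9", "failed_logins": 42}],
        "node-west": [{"src": "10.2.7.3", "failed_logins": 57}],
    }

    def query(records, threshold=40):
        # The query runs where the records live and returns only a small summary.
        return [r["src"] for r in records if r["failed_logins"] >= threshold]

    def run_distributed_query(threshold=40):
        # Ship the query to every node in parallel; aggregate the compact results centrally.
        with ThreadPoolExecutor() as pool:
            futures = {name: pool.submit(query, recs, threshold)
                       for name, recs in NODE_PARTITIONS.items()}
        return {name: f.result() for name, f in futures.items()}

    if __name__ == "__main__":
        print(run_distributed_query())
        # e.g. {'node-east': ['10.1.1.9'], 'node-west': ['10.2.7.3']}

In a design like this, only the query and the compact summaries cross the network, which keeps bandwidth use and central storage requirements down as data volumes grow.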

First steps in becoming a data-driven organization

Blohm noted that data governance is a key consideration for big data implementation in the government and that data governance should be implemented at the program level; it’s not sufficient to have data with an app riding on it and no defined policies around that data.

While Morton agreed, he also noted “you can’t start with heavy-handed governance.” Organizations have to be open to new solutions, Morton said, and have to look to the private sector for innovative ideas. “Twitter does more with big data than we do,” Morton conceded, adding that the federal government needs to look at how organizations in Silicon Valley are implementing and using these technologies.

Open source or premium software?

Morton noted “open source is not free,” carrying with it installation, accreditation, operation and support costs. The government needs contracts for sustaining open source applications. “We can’t do it on our own,” he said.

Chang said the “whole ecosystem is changing” in favor of open source solutions, but current investments in legacy systems require a bridge from those systems to new solutions. Blohm added that “not getting locked in” to a solution is a real concern, because the government has to be able to keep up with emerging technologies within its own cycles of procurement and implementation.

When asked about ways to minimize the possibility of “rogue developers” with potentially nefarious agendas adding to open source code, Morton acknowledged “we do take risks.” He said the government needs a credible way to get support. “If we see risks, we have to know that [vendors] have a response, or we can’t put it into operational use,” Morton said.

Cybersecurity and privacy frameworks form a checklist for understanding risk, Chang said. It’s important to check code on the server side, and validate data at the global registry level. Most importantly, the government needs to know that vendor organizations have their own security policies in place.

Blohm added that the acquisition community has to focus on supply chain management. It is worrisome, he said, that some open source purchases come from multiple sources. “We have to look at this more.”

Christopher Short is a Senior Account Manager with immixGroup, focusing on helping Big Data technology companies do business with the government. He can be reached at Christopher_Short@immixgroup.com or connect with him on LinkedIn at www.linkedin.com/in/cshort.