DHS releases RFP for NextGen Security Operations Center

The Department of Homeland Security has released the final request for proposals for a nearly $400 million contract that will support the agency’s cybersecurity services.

The Next Generation Security Operations Center services contract will allow DHS to buy various services to protect its internal networks. The purchased services will be used to protect the agency’s wide area networks, Trusted Internet Connections program, servers and workstations, and to mitigate threats against them.

The contract will be a single-award, indefinite delivery/indefinite quantity contract with a ceiling of $395 million. The award will come with a base ordering period of one year and six one-year optional ordering periods.

The awardee will operate the NextGen SOC — which will be under the direction of the Office of the Chief Information Officer, the National Protection and Programs Directorate, and the Science and Technology Directorate — around the clock, providing continuous monitoring, intrusion detection, vulnerability assessments and other security services.

Additionally, the SOC may need to work with other parts of the agency, including the United States Computer Emergency Readiness Team, component agency SOCs, the Computer Security Incident Response Center, and various law enforcement or intelligence offices.

This contract does not cover the department’s Einstein program, which protects the networks of other federal civilian agencies.

According to a Q&A posted with the RFP on FedBizOpps.gov, DHS expects to award the contract in the third quarter of fiscal year 2016. However, given that the final RFP was released on the first day of the fiscal year’s third quarter, the award could come later.

Read the full RFP at FBO.gov.


NIST guide provides new standard for PII protection

The National Institute of Standards and Technology has released a new encryption standard to improve safeguards for sensitive data, like credit card numbers and health information.

The guide, NIST SP 800-38G, creates standards for “format-preserving encryption,” which renders long strings of numbers indecipherable while keeping the ciphertext in the same format and length as the original data. Previous NIST encryption standards applied only to binary data; there was no approved way to encrypt a decimal number while still allowing computer programs to read the result in the number’s original format.
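The FF1 and FF3 modes specified in SP 800-38G are built on AES; the sketch below is not either of those, just a minimal Feistel-style illustration in Python (with invented function names) of the basic idea: a 16-digit number encrypts to another 16-digit number, and the transformation reverses with the key.

```python
import hashlib
import hmac

def _round_value(key: bytes, rnd: int, half: str, width: int) -> int:
    # Keyed round function: hash one half together with the round number,
    # then reduce mod 10**width so the result fits the other half's digits.
    mac = hmac.new(key, f"{rnd}:{half}".encode(), hashlib.sha256).hexdigest()
    return int(mac, 16) % (10 ** width)

def fpe_encrypt(plaintext: str, key: bytes, rounds: int = 10) -> str:
    # Even-length decimal strings only, so the two Feistel halves match in size.
    assert plaintext.isdigit() and len(plaintext) % 2 == 0
    w = len(plaintext) // 2
    left, right = plaintext[:w], plaintext[w:]
    for i in range(rounds):
        f = _round_value(key, i, right, w)
        left, right = right, str((int(left) + f) % (10 ** w)).zfill(w)
    return left + right

def fpe_decrypt(ciphertext: str, key: bytes, rounds: int = 10) -> str:
    # Run the rounds backward, subtracting where encryption added.
    w = len(ciphertext) // 2
    left, right = ciphertext[:w], ciphertext[w:]
    for i in reversed(range(rounds)):
        f = _round_value(key, i, left, w)
        left, right = str((int(right) - f) % (10 ** w)).zfill(w), left
    return left + right

card = "4111111111111111"
token = fpe_encrypt(card, b"demo key")   # still 16 decimal digits
assert fpe_decrypt(token, b"demo key") == card
```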

Using this encryption method allows enterprises to encrypt sensitive data without completely overhauling their existing IT infrastructures. Heartland Payment Systems switched to format-preserving encryption after a 2009 hack in which more than 130 million credit and debit card numbers were compromised.

While the encryption method will be primarily used to make encrypted credit card numbers indistinguishable in format from unencrypted ones, NIST believes another potential application is the “anonymizing” of personally identifiable information in databases, particularly those containing sensitive medical information.

“Databases of this sort are invaluable for researching the effects of different treatment methods on diseases, for example, but they often use Social Security numbers to identify individual patients and can contain other personal information,” NIST said in a release. “FPE encryption could handle this problem as well, though [guide author Morris] Dworkin stresses that in this case the approach would not necessarily be foolproof.”

“FPE can facilitate statistical research while maintaining individual privacy, but patient re-identification is sometimes possible through other means,” Dworkin said in a statement. “You might figure out who someone is if you look at their other characteristics, especially if the patient sample is small enough. So it’s still important to be careful who you entrust the data with in the first place.”

The standards for this form of encryption have been in the works for years, with NIST holding two public comment periods related to the guide over the past seven years. Numerous private companies helped develop the standard, as FPE has been used in the commercial sector for some time.

“As organizations and government agencies demand new data-centric security approaches that mitigate risks without stifling business strategies, vendors have rushed to market with a range of proprietary methods that are unproven and not peer-reviewed,” Mark Bower, global director of product management at Hewlett Packard Enterprise, said in a release. “The NIST standard is critical in setting the bar to ensure organizations are maintaining regulatory and audit compliance, as well as using proven methods to protect against a data breach.”

Robert Carr — the chairman and CEO of Heartland Payment Systems, which uses HPE’s technology — said in the release this type of encryption could help the government protect the vast troves of PII it handles on a daily basis.

“As one of the earliest victims of massive cybercrime affecting millions of cards, Heartland Payment Systems implemented cutting-edge technology to remediate the situation,” said Carr, who also serves on the National Infrastructure Advisory Council. “I hope the federal government incorporates this type of technology for protecting vast amounts of sensitive data in disparate systems.”    

The full guide is available on NIST’s website.


FBI begins hunt for new headquarters IT

The FBI is considering what IT systems it will need in its new Washington, D.C.-area headquarters.

It issued a request for information last week to gain better insight into the IT and audio-visual networks on the market that may fit its needs in its new headquarters building, for which President Barack Obama requested $1.4 billion in fiscal year 2017. The administration hopes to break ground on the new site in the upcoming fiscal year at one of three possible locations — Springfield, Virginia, and Landover and Greenbelt, Maryland — bringing the 11,000 FBI personnel from offices throughout the capital region together at one location.

In the new solicitation, the bureau said it’s hoping to “obtain information that may lead to the selection of a business partner with in-house capability to design, construct and maintain all aspects of advanced turnkey IT and AV networks solutions and systems including but not limited to network architecture design; backbone/distribution cabling design & construction; data center design & construction; advanced PDS solutions; satellite communication systems; fully integrated audio visual systems, Cable television (CATV) systems; security networks including access control & building management systems; wireless technologies; and Voice over Internet Protocol (VoIP) telephone networks.”

In particular, the FBI is looking to work with a vendor “with specific and relevant expertise in ‘cradle to grave’ IT facility design & construction services including operations and maintenance” at facilities “in excess of one million [rentable square feet] in the past seven years of at least 2,500 employees on a single campus,” the RFI says.

The RFI says work on the headquarters IT will begin Jan. 1, 2017, and the type of contract it uses will be determined by responses to the solicitation. 

On behalf of the FBI, the General Services Administration issued a request for proposals in January to a handful of pre-vetted developers. Congress already approved $390 million for the project in the 2016 omnibus bill. In addition to the nearly $1.8 billion total that would be dedicated to the project if Congress approves the White House’s request, the federal government plans to exchange the J. Edgar Hoover Building, the highly sought-after current home of the FBI in downtown D.C., with the developer as part of the payment for the new property.

“The Administration is committed to acquiring a consolidated new headquarters facility for the FBI, a member of the intelligence community,” Bill Dowd, GSA’s manager of the project, said in a press release. “The consolidated headquarters facility will allow the FBI to perform its critical national security, intelligence, and law enforcement missions in a new modern and secure facility.”

FedScoop did not receive comment from the FBI on its IT plans for the headquarters prior to publication. 

The FBI is accepting responses to the RFI until April 12. 

NIST ups transparency in new crypto standards

The National Institute of Standards and Technology has released the final version of a controversial document that lays out the process by which it develops cryptographic standards.

The agency’s standards were questioned after documents leaked by Edward Snowden showed that a NIST-approved encryption algorithm — the Dual Elliptic Curve random number generator (Dual_EC_DRBG) — contained a backdoor for the National Security Agency. The algorithm came under further scrutiny when it was revealed that Juniper Networks’ firewall software had been manipulated by exploiting the backdoor.

Shortly afterwards, the Dual_EC_DRBG algorithm was removed from NIST special publications and other standards handbooks. 

But the agency’s Chief Cyber Security Advisor Donna Dodson went further, announcing NIST would be reviewing and overhauling the processes it uses to decide on the cryptographic standards it approves.

The document published earlier this week is the final fruit of that process, containing nine principles that NIST is supposed to adhere to when creating strong crypto standards, including transparency, openness and global acceptability.

The “global acceptability” principle was included, according to NIST, to reflect “the global nature of today’s commerce.” Leveraging the U.S.’ leading position in the development of internationally recognized security benchmarks is also a component of the Obama administration’s International Strategy for Cyberspace.

“Our goal is to develop strong and effective cryptographic standards and guidelines that are broadly accepted and trusted by our stakeholders,” said Dodson in a release. “While our primary stakeholder is the federal government, our work has global reach across the public and private sectors. We want a process that results in standards and guidelines that can be used to secure information systems worldwide.”

NIST also acknowledged in the release that there is the “possibility for tension” between its own goals and the missions of law enforcement and national security agencies. Encryption has been a white-hot topic, particularly since the Justice Department tried to force Apple to bypass the security features of an iPhone before a third-party found a way into the device.

The final document can be found on NIST’s website.


Not just robots: Army plans for 2025 and beyond

The Army has made a prediction about what the battlefield will look like in 2025 and beyond: more technology-reliant than ever, using sensor data, faster decision-making and human cognitive enhancement, as well as robotics and autonomous systems, to dominate the enemy.

The Office of the Army Chief Information Officer/G-6 released a predictive, scenario-based strategy update Thursday for the service’s long-term network modernization, shaped by “leap-forward technologies” and the adversary’s increased access to digital capabilities that level the playing field. 

Lt. Gen. Robert Ferrell, Army CIO/G-6, introduced the strategy, “Shaping the Army Network: 2025-2040,” during a keynote at AFCEA’s Army IT Day Thursday. The new document picks up where the Army’s near-term and mid-term network strategies leave off, planning a servicewide network modernization, Ferrell said.

“What I challenged my team to do is to assume…all this effort of modernization is done — what’s next?” Ferrell said.

The plan focuses on five broad areas of emerging technologies that the Army CIO and his team think will play a big role in future battlefield dominance: dynamic transport, computing and edge sensors; data-to-decisive-action; human cognitive enhancement; robotics and autonomous operations; and cybersecurity and resiliency.

“It really gets at the capability of the Internet of Things, software-defined networks, advanced analytics, adverse sensors and actuators, and self-healing networks,” Ferrell said Thursday in Tysons Corner, Virginia. The strategy document goes into even more detail on the specific technologies the Army believes it will need to dominate, like robotics, which “may represent one of the most significant and game-changing technologies for the Army since the development of the tank in World War I.” 

It even provides a fictional battlefield scenario in which a Joint Task Force, led by the Army, responds with a “battalion-sized raid” to a chemical warfare threat from a criminal gang called “Xipe Totec.” The adversary is headquartered in a “sprawling mega-city” on an island state where the government has “limited control” and is in any case not friendly to the U.S.

Using rodent-sized mini-bots and swarming micro-drones to map building interiors — along with sentient cyber-weapons and social media monitoring to seize control of local communications systems — the future task force is able to gather situational intelligence on the ground. When fused with surveillance data from satellites and other “national assets” overhead, this enables the soldiers, clad in robotic exoskeletons and employing a wide variety of lethal and non-lethal weaponry, to locate and destroy the biomedical facility where the toxin is being made, “neutralize” adversary leaders and leave before most people in the city even know they’re there.

“The technologies and the soldiers’ use of them described in this operational scenario are … believed to be entirely feasible by 2040,” states the document.

But it’s not all gee-whiz — as in all conflicts, the enemy gets a vote, and emerging technologies may help adversaries level the playing field against the U.S.

“One development the military must closely watch,” the strategy explains, “is the growing availability of ever-increasing data processing power and faster transmission speed at lower cost. This trend gives resource-poor states, criminal organizations and even individuals access to capabilities traditionally monopolized by advanced countries.”

“The pace of innovation in information technology is increasing the pace of operations, and our adversaries’ ability to influence our operating environment. The Army’s success in 2040 will depend on our retaining overmatch in both security and capability to provide freedom of action within the cyber domain while denying it to our adversaries.”

Then again, the Army could be way off — and it recognizes that.

“The projected technology developments described in this document may not all come to pass,” the document reads. “In fact, given the rapid pace of scientific advancement, the technology Soldiers eventually employ likely will differ from what is presented here. Therefore, investment strategies will likely need to be adjusted in the future as they will be influenced by global economics and still-unforeseen leap-ahead discoveries.”

New NIST working group born out of IoT complexities

Given that tens of billions of “things” will be connected to the Internet by 2020, it’s probably worth setting out some standards on how these devices and their digital architectures will work together. 

That’s been the task of several people at the National Institute of Standards and Technology over the past few years as the Internet of Things — also referred to as cyber-physical systems — has exploded in popularity.

However, David Wollman, deputy director of NIST’s Smart Grid and Cyber-Physical Systems program office, says things are changing so fast that it’s been difficult to develop concrete standards for the greater IoT ecosystem to use.

Wollman spoke about NIST’s efforts at an American Bar Association event Thursday, outlining how parties from academia, industry and government have all pitched in as the standards agency looks to finalize a new framework. 

NIST released a draft framework for cyber-physical systems last September, which is meant to create some common vocabulary and best practices for how everything from connected cars to fitness trackers is architected and built to communicate with one another. 

“What we’ve learned in this is very important to be able to cover many different domains,” Wollman said, referring to things like smart transportation, manufacturing and energy grid systems. “You not only want a system to work, you want it to work together with other infrastructures. If you have your smartphone in your car, you want them to pair up and help you do something.” 

The complexity surrounding these systems has kept the framework in what Wollman called the “very early stages,” despite its release in draft form.

With the complexity of these systems comes a heightened amount of risk. Wollman said every IoT system — and its users — will have to balance security, privacy, safety, resiliency and liability.

“Usually [those categories] are siloed,” he said. “What we’ve realized in developing this framework is that it’s really going to be the interactions between those groups that are important. It’s not going to be sufficient to do a separate security analysis and then have a separate safety analysis.

“There are places where you need to understand where those two are linked.” 

Yet even as the final framework is delayed, NIST is using its own in-house programs to gather further input.

A number of standards agencies have come together with NIST to form a new working group that will develop standards for IoT-enabled smart cities, using the projects created during the agency’s Smart City challenge. 

“Here the goal is not to develop anything new,” Wollman said. “We are very cognizant that there is already a lot of activity going on. What we want to do is look at the deployed field of architectures in smart cities. When you have a city where multiple architectures are going to pop up, you need to have an understanding of how they will possibly work together.” 

The group, which consists of the American National Standards Institute, FIWARE and the U.S. Green Building Council, among others, will work to eliminate interoperability barriers and coalesce around standard architectural designs for smart city systems. That work will eventually make its way into NIST’s own framework.

Wollman went on to say that figuring out how all these things work together, as well as how they work with the people who plan to use them, is crucial to furthering the use of IoT.

“If you are doing analysis, you can measure one system and put it right next to the analysis of another system, look at how you’ve understood the systems and, by figuring out how they treated the human element, you’ll be able to understand [whether] those two systems can be linked together or if any additional things need to happen.”


Pentagon IT services consolidation nearing full operation

The Pentagon could be less than six months away from completing the consolidation of IT services for its Washington, D.C.-area offices, the “toughest” change management project underway in the department, the general in charge said Thursday.

Officials are working to merge the two separate IT units that serve Defense Department facilities across the national capital region into the Defense Information Systems Agency’s Joint Service Provider, a transition that will conclude when senior DOD IT officials designate the program with full operational capability.  

Brig. Gen. Brian Dravis, DISA’s director of the JSP, told AFCEA’s Army IT Day he’ll likely recommend to DOD Deputy CIO Dave DeVries in the next six months that full operational capability has been attained. 

“I’ll be able to, I think, give Mr. DeVries a recommendation, maybe in the next six months, that says, ‘Sir, I think we’ve matured this organization enough that you can make a decision of if you want to declare [full operational capability] or not,’” Dravis said at the event in Tysons Corner, Virginia.

The JSP, which has also been referred to in the past as the Joint Information Technology Service Provider-Pentagon, reached its initial operational capability in July 2015. The program is in the process of consolidating the common services provided by the Army’s Information Technology Agency and DOD’s Washington Headquarters Services’ Enterprise Information Technology Services Directorate into a unified shared delivery unit housed by DISA, driving efficiencies, reducing duplication and saving money.

“We were directed to take the two core IT organizations inside the Pentagon … and push them together. That’s what we did, and it was about as hard as you could imagine,” he said.

Dravis continued: “We will have done all the work planning those changes and other things needed to say, ‘OK, we’re out of our [initial operational capability] phase,’ which we just entered in July of 2015, eight months ago, and in about 12 months’ time, we will have matured it to the point to be considered full operational capability.”

Typically for such projects, he added, it can take up to a year to get to initial operational capability, and as many as three years to move on to full operation.

In terms of return on initial investments, Dravis said so far things have been great — particularly the savings. In fiscal year 2015, which had about two months left when JSP got its initial operational capability, the program ran at about 500 percent of its savings target of $580,000, according to a memo from DOD Deputy Secretary Bob Work last July. Those numbers tapered off “by about 200 percent or so” so far in fiscal year 2016, Dravis said; but even so, JSP met its savings goal of $14.6 million for the year by the start of the second quarter.  

And those savings have come without being really “Draconian about it,” Dravis said. 

“We’ve taken what I believe are common-sensical approaches to looking at areas of relatively easy consolidation and started consolidating,” he said.

JSP has also reduced the aggregate number of contracts between the merged organizations — which were in the hundreds, he said — by 20 percent.

That all didn’t come without challenges, though, Dravis admitted. Culture and organizational integration — that is, bringing people together to create a new organization when they’re physically displaced — were two major roadblocks in the unification effort. 

“We’re all over the [national capital region],” he said. “We have people in buildings all over the place. It’s very hard to have a unity of effort, a unity of command, when you’re dispersed all over the place.”

Dravis added, “We’ve got to get everybody going in one central direction — let’s all head north, and we’ll be fine from there.” He said he was able to do so with a short list of specific priorities for the current fiscal year, like focusing on the departmentwide order to move to Windows 10, and building out Wi-Fi at the Pentagon and other installations.

All in all, it’s one incredible lift, Dravis said. “In my view, the JSP reflects the heaviest, toughest change management project happening in the Pentagon today.”

eSignLive added to FedRAMP-compliant cloud offering

Federal agencies looking to use digital signatures in a secure cloud environment now have a new option: eSignLive has partnered with a FedRAMP-compliant provider to bring its software to the government.

Distributed by VASCO, eSignLive will be available through Project Hosts, which gives agencies the ability to run various Windows and Linux applications on a private cloud.

eSignLive has been used in government for over 20 years by the Joint Chiefs of Staff, U.S. Army, General Services Administration, U.S. Postal Service and U.S. Department of Agriculture. The software is used in conjunction with a PIV or CAC card to electronically sign digital documents related to contracting, procurement, mobility, supply chain management and human resources, among other areas.
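The announcement doesn’t detail eSignLive’s own API, so as a rough illustration of what a digital signature involves, here is a minimal Python sketch using the third-party cryptography library: a digest of the document is signed with a private key and checked with the matching public key. The locally generated RSA key and sample document are stand-ins; in real PIV or CAC use, the private key never leaves the smart card.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Stand-in key pair; with a PIV/CAC card the private key lives on the card
# and signing happens through the card's middleware instead.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

document = b"Sample procurement document to be signed."

# Sign a SHA-256 digest of the document with the private key.
signature = private_key.sign(document, padding.PKCS1v15(), hashes.SHA256())

# Anyone with the public key (e.g., from the signer's certificate) can
# confirm the document hasn't changed since signing; verify() raises
# InvalidSignature if it has.
public_key = private_key.public_key()
public_key.verify(signature, document, padding.PKCS1v15(), hashes.SHA256())
print("signature verified")
```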

“eSignLive is making digital government a reality through our partnership with Project Hosts and running our e-signature solution in its FedRAMP-compliant cloud environment,” said eSignLive President Tommy Petrogiannis. “This partnership answers the demand from government agencies to be able to transact completely electronically with the highest level of data security, compliance and auditability – driving stakeholder and citizen engagement at the same time.”

Project Hosts runs a suite of different Microsoft SaaS products on an Azure cloud, including SharePoint, Dynamics CRM and Microsoft Office. It also offers various third-party SaaS offerings, including the project management platform Innovate-e, the financial management tool UMT360, and content management systems like Joomla and WordPress.

“By incorporating eSignLive directly into our Azure FedRAMP cloud environment, we perform the necessary due-diligence and create the requisite documentation to ensure all security controls are in place to maintain SaaS-level FedRAMP compliance,” said Scott Chapman, CEO and co-founder of Project Hosts.

Project Hosts’ cloud earned a FedRAMP authority to operate, or ATO, through the Department of Commerce. Its services can be purchased through NASA SEWP.

Come learn more about digital signatures at the eSignature Summit, presented by FedScoop, on May 3, 2016. 


DOD missing data center closure goals — audit

The Defense Department closed fewer than one in five of its data centers last year, less than half its target, and is on course to miss its goal next year as well, according to a new watchdog report.

The department closed 18 percent of its data centers — 568 of 3,115 total facilities — by the end of fiscal year 2015, compared with the 40 percent required under the Federal Data Center Consolidation Initiative, according to an inspector general audit released Tuesday. 

The department also probably won’t meet its own internal goal of closing 60 percent of its centers by fiscal year 2018 if current trends continue, the report said.

The Army and the Defense Information Systems Agency are leading among DOD agencies and military branches, closing 30 and 29 percent of their data centers respectively before the end of fiscal year 2015, still well below the 40 percent goal for all federal agencies set by the Office of Management and Budget under the FDCCI.

As the cause of the missed goal, the inspector general points to CIO Terry Halvorsen’s decision not to overhaul the department’s consolidation strategy after OMB broadened its definition of data centers to include smaller ones.

“The DoD CIO did not revise its March 2010 data center consolidation strategy to account for the increased number of data centers after OMB revised the data center definition to include smaller facilities,” the report says. The change in what counted as a data center “caused the number of data centers managed by DoD to increase from 772 to more than 1,000. The number continued to increase as DoD Components discovered more data centers based on the revised definition.”

According to DOD officials, when the definition was revised, a large number of “special purpose processing node” data centers suddenly fell under the new definition and were added to the count, effectively preventing the department from meeting the 2015 target. Officials also told auditors “SPPNs would prevent DoD from reaching its 60-percent data center reduction goal” for fiscal year 2018. “Specifically, of the 1,663 data centers not scheduled to close by FY 2018, 1,096 (67 percent) are SPPNs,” the report states.

In his response to the report, Halvorsen said the SPPNs could not be moved to the cloud, as were the other data centers being consolidated. He therefore unsuccessfully sought “relief from OMB to exclude SPPNs from its data center consolidation metrics because SPPNs could not be severed from the facilities or equipment they supported.” 

Additionally, the audit states that information on many of the closed data centers was not reported accurately because Halvorsen and his office did not issue clear guidance on how components should report and update data on their closures. Of the 119 centers the IG reviewed, it found inaccurate information in the system for 68.

Despite these shortcomings, Halvorsen still has his sights set high for DOD data center consolidation. 

In written testimony for a recent hearing with the House Armed Services Subcommittee on Emerging Threats and Capabilities on DOD’s budget request for fiscal year 2017, Halvorsen said though “DoD continues to reduce the number of physical sites and administrators needed to operate facilities to not only save money and reduce our footprint, but to also improve security,” he is “not yet satisfied with the savings achieved or the current savings projections to-date.”

“While DoD has projected $1.8 billion in cumulative savings through FY2018, the Department is taking steps to aggressively drive more savings,” his testimony says. 

DISA took its time with its latest cloud security guide

The Defense Information Systems Agency has issued a long-awaited update to its cloud security guide, further refining the process by which DISA plans to assess cloud service providers beyond the guidelines laid out in the Federal Risk and Authorization Management Program, or FedRAMP.

DISA has taken more than a year to provide refinements to the guide, which was initially released in January 2015. Among the changes are further revisions that differentiate the six impact levels to help evaluate how sensitive a given set of data is.

Additionally, the new version of the guide clarifies how cloud service providers are assessed beyond FedRAMP, including enhancements to FedRAMP Plus, while also making tweaks to privacy protocols.

“The new version fittingly represents the evolution we are going through to refine our processes and better position the department to enable secure options to migrate systems and data to the cloud,” said DISA CIO John Hickey.

The guide was also released with a published revision history, showing how the document has evolved over the past year. An Excel spreadsheet, referred to as a “comment matrix,” has also been included to allow further feedback.

“This on-going public comment period will allow our mission partners to offer changes as they become necessary,” said Robert Vietmeyer, associate director for cloud computing and agile development in the enterprise capabilities directorate at the DOD CIO’s office. “This is in direct support of the DOD CIO’s vision of ‘agile policy development.’”

The memo, revision history and comment matrix can be found on DISA’s website.
