FedScoop | Federal technology news and events | http://fedscoop.com

FBI's Jennifer Sanchez discusses women in tech
http://fedscoop.com/fbis-jennifer-sanchez-discusses-women-tech/
Thu, 27 Nov 2014 17:24:34 +0000

Jennifer Sanchez, assistant director for the FBI's IT customer relationship and management division, talks to FedScoop TV about increasing the number of women in the federal tech workforce.

NSA releases first-ever open source software product
http://fedscoop.com/nsa-open-source-nifi/
Wed, 26 Nov 2014 22:04:59 +0000

NSA Headquarters in Fort Meade, Maryland. (Credit: Wikipedia)

It may not be much of a surprise that the National Security Agency has expertise in data management. What’s surprising is that the agency has released one of its data management tools to the public, with the software completely open source.

The NSA announced Tuesday the public release of "Niagarafiles," or "Nifi," which can automate data flows across multiple computer networks, even if data formats differ.

Released in partnership with the Apache Software Foundation, the agency said Nifi could help companies “quickly control, manage, and analyze the flow of information from geographically dispersed sites.”

Nifi “provides a way to prioritize data flows more effectively and get rid of artificial delays in identifying and transmitting critical information,” lead developer Joseph Witt said in a statement.
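Nifi itself is configured through its own tooling, but the underlying idea of prioritizing data flows can be sketched in a few lines of Python. The class, payloads and priority values below are invented for illustration; this is not Nifi's actual API:

```python
import heapq

class FlowQueue:
    """Toy prioritized queue of data 'flow files': a lower number means more critical."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker keeps FIFO order within a priority level

    def enqueue(self, payload, priority):
        heapq.heappush(self._heap, (priority, self._counter, payload))
        self._counter += 1

    def dequeue(self):
        # Pop the most critical item first, regardless of arrival order.
        return heapq.heappop(self._heap)[2]

q = FlowQueue()
q.enqueue("routine telemetry", priority=5)
q.enqueue("critical alert", priority=1)
q.enqueue("bulk log transfer", priority=9)
print(q.dequeue())  # "critical alert" is transmitted first
```

The point of the sketch is the ordering guarantee Witt describes: critical information jumps ahead of routine traffic instead of waiting behind it.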

For those out there thinking that NSA and open source go together as well as the combination of politics and Thanksgiving dinner, this is not the first time the agency has worked with open source software. The NSA created the first Security-Enhanced Linux kernel in 2000 and worked with Apache on its Accumulo data storage system.

Even if the wide coverage of the NSA’s internal surveillance programs makes some people uneasy about anything attached to the agency, an open source advocate tells FedScoop it’s “extremely unlikely, bordering on impossible” that the agency would use this release as a way to dupe people into a new form of snooping.

Nifi will help automate data flows among multiple computer networks. (Credit: NSA)

“The NSA knows perfectly well that if they plant some kind of secret backdoor in the code in some obscure way, if anyone did find that out, their trustworthiness in the open source world would be shot forever, and they know that,” said Karl Fogel, a former board member of the Open Source Initiative, in an interview with FedScoop. “They’ve chosen a very well-known, very prominent organization to release this project in association with. People at Apache would be really upset if the NSA planted backdoors in the code.”

Fogel said the combination of NSA’s big data expertise with an open source release should be a positive for the public.

“If anything, open source code from [the NSA] has a higher degree of capability than it would from a lot of places,” Fogel said. “You can bet that the NSA has a lot of experience with [data flows]. The NSA has a good budget. Unlike many government agencies, they’re adequately funded and they’re able to hire the best, and they have been [able to] for a long time.”

Tony Wasserman, a professor of software management practice at Carnegie Mellon University and board member of OSI, told FedScoop that the NSA might be looking for some good will by releasing the software publicly.

“My take is that the NSA decided to open source their project because they wanted to get some favorable recognition in the open source community and elsewhere,” Wasserman wrote in an email to FedScoop. “Perhaps they are also hoping that software developers outside the NSA or other secretive agencies would contribute to their project. Good developers are hard to find, so drawing on the community is a smart idea and might even help them recruit people who want to work for them.”

The NSA says Nifi will be followed by a series of releases through its Technology Transfer Program.

Patent office should continue to boost IT funding – report
http://fedscoop.com/patent-office-continue-boost-funding-report/
Wed, 26 Nov 2014 19:11:15 +0000

A new report from the Patent Public Advisory Committee urges the USPTO to protect funding for information technology projects. (Credit: iStockphoto.com)

As it emerges from 2013’s sequestration, the U.S. Patent and Trademark Office should continue to increase funding for information technology, according to a new report from the 2014 Patent Public Advisory Committee.

“[W]hile the IT funding situation has improved and work is progressing with deliberate speed, the PPAC notes the ever urgent need for aggressive development in IT,” the report said.

The group, whose members represent various patent-related interests, also recommended that the agency shield IT from future cuts, continue to update or replace its old IT systems, expand efforts to coordinate with patent offices abroad, and continue to develop connections with the intellectual property community.

The Office of the Chief Information Officer’s budget was a casualty of major budget cuts last year: In 2013, funding for several IT projects came to a halt, and work was deferred or delayed.

But in fiscal year 2014, IT spending increased by 71 percent (from nearly $368 million to nearly $630 million) over the previous year, allowing the office to hire more than 100 new IT workers and restart dozens of tabled projects.

“It must be acknowledged, however, that this funding has simply put the Office back on the right track as far as IT is concerned and that continued and even faster progress must be made in order to overcome the deficiencies that have been repeatedly stated in these reports and in the Office’s plans and budget requests,” the report said.

Michelle Lee, the nominee to lead the patent office, has previously touted improving IT as a means of streamlining the agency, which faces a backlog of 600,000 patent applications. Lee specifically expressed the need to fully deploy Patents End-to-End, a set of programs that would make processing patent applications more efficient.

But budget cuts had forced the agency to table such projects. The report noted that key parts of a patent application database and tracking system called Patent Application Location and Monitoring system, or PALM, were supposed to be modernized under Patents End-to-End. However, cutbacks put plans to replace components of PALM, which was designed for mainframe computers in the 1980s, on hold.

“The longer legacy replacements and enhancements wait, the more resources are required simply to keep obsolete systems running and the greater the risk of serious system failure,” the report said.

During last week’s PPAC meeting, Lee commended members of the group for the “countless hours” they spent putting together the report. She said the patent office is currently reviewing the report.

“The PPAC recommendations are always valued by our senior leadership team and by myself, and we very much appreciate every opportunity we get to interact with all of you,” Lee said.

Watch VA CIO Stephen Warren's FedTalks 2014 presentation
http://fedscoop.com/watch-va-cio-stephen-warrens-fedtalks-2014-presentation/
Wed, 26 Nov 2014 13:45:04 +0000

Stephen Warren, chief information officer for the Department of Veterans Affairs, spoke at FedScoop's 2014 FedTalks on how the VA turned itself around with agile development.

NRC struggles with continuous monitoring, recurring security weaknesses
http://fedscoop.com/nrc-struggles-continuous-monitoring-years-old-security-weaknesses/
Tue, 25 Nov 2014 21:31:09 +0000

The Nuclear Regulatory Commission is failing to perform required continuous monitoring measures and to address security weaknesses it has known about for years, a new report from NRC’s Office of the Inspector General found.

With the help of a third-party independent auditor, the OIG found that NRC continues to improve its IT system security and apply recommendations from prior Federal Information Security Management Act-based evaluations. Despite that, the commission still lacks many vital security practices and therefore “NRC cannot ensure the effectiveness of information security controls for NRC systems and cannot identify and control risk,” the report states.

Of most concern is NRC’s struggle with continuous monitoring. The IG report found the commission failed to comply with updated continuous monitoring standards, in particular neglecting to complete annual security control assessments. Likewise, NRC did not update its systems to reflect the new standards in the National Institute of Standards and Technology’s Special Publication 800-53, Revision 4, “Security and Privacy Controls for Federal Information Systems and Organizations,” released in April 2013.

“For systems operating under [a continuous authorization to operate], continuous monitoring is essential for determining risk associated with systems and for ensuring risk-based decisions are made concerning continued system operation,” the OIG report states. “If continuous monitoring activities are not performed as required, NRC cannot ensure the effectiveness of the information security controls for NRC systems and cannot identify and control risk.”

To remedy the situation, the OIG recommends NRC update all noncompliant continuous monitoring operations to reflect the NIST standard.
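To make the annual-assessment requirement concrete, the sketch below flags systems whose last security control assessment is more than a year old. The system names and dates are hypothetical, not drawn from the OIG report:

```python
from datetime import date, timedelta

# Hypothetical inventory mapping each system to the date of its last
# completed security control assessment.
inventory = {
    "licensing-db": date(2014, 1, 15),
    "public-website": date(2013, 6, 1),
    "hr-portal": date(2014, 10, 30),
}

def overdue_assessments(systems, today, max_age_days=365):
    """Return systems whose last assessment is older than max_age_days."""
    cutoff = today - timedelta(days=max_age_days)
    return sorted(name for name, last in systems.items() if last < cutoff)

print(overdue_assessments(inventory, today=date(2014, 11, 25)))
# ['public-website'] is overdue for its annual assessment
```

A continuous monitoring program would automate checks like this one, rather than discovering lapsed assessments during a FISMA audit.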

The OIG was also troubled that a few recommendations from prior FISMA evaluations had still not been implemented. The audit found issues with the consistent configuration management of several NRC systems, an issue addressed in prior reports. These vulnerabilities were first identified in a fiscal year 2011 report, and many still linger. The report shows the agency was aware of the configuration issues.

“Vulnerability scanning performed as part of security control assessment activities identified numerous vulnerabilities that demonstrate non-compliance with required baseline configurations in half of NRC’s operational systems,” the report says. “These are vulnerabilities that have been identified by the agency as actual weaknesses requiring remediation and most are being tracked on the agency’s [plan of action].”

Additionally, NRC failed to address all known security vulnerabilities in its FISMA- and NIST-required plan of action and milestones documentation. And when the weaknesses were documented, the report says they weren’t dealt with in a timely manner. The report recommends NRC address this issue based on its findings in 2012 and 2013 audits; without the plan of action, it says corrective efforts cannot measure the program’s effectiveness.

A commission spokesman said NRC is reviewing the report and plans to respond to the Office of Inspector General.

“The NRC takes its information security responsibilities seriously,” the spokesman said. “While we are pleased that the report’s conclusions are mostly positive, we value any recommendations for improving our performance in this important arena.”

What feds need to know about Regin malware
http://fedscoop.com/regin-malware-federal-agencies/
Tue, 25 Nov 2014 21:06:49 +0000

A diagram from Kaspersky Lab shows multiple stages of the Regin malware exploit. (Credit: Kaspersky)

Cybersecurity professionals should be paying close attention to Web browsing and email services in the wake of a highly sophisticated malware application being compared to some of the most elaborate threats in recent memory.

Regin (pronounced REE-jin) was discovered over the weekend by security firm Symantec, which concluded that the malware is a “highly complex threat” that’s been used “for large-scale data collection or intelligence gathering campaigns.” The firm found that two versions of the software have been moving through the Internet, but the worm’s complexity has kept it hidden for years.

“One of the things that makes Regin unique is that it is very difficult to detect, due to its modular architecture and specialized encryption,” said Liam O’Murchu, a security response manager for Symantec Corp. “We believe the attacks may have originated via browsing the Web or via email, which are currently two of the most popular attack vectors we see used. Looking at these avenues is important while still understanding that attacks like these are sophisticated and can take advantage of weaknesses in many parts of an organization.”

Kaspersky Lab Inc., which also put out its own study on Regin, said information security professionals should pay particular attention to Microsoft Windows domain controllers, large databases, systems with Internet connectivity and proxy servers.

“A few things to consider are to install a modern security suite on all endpoints and servers,” a spokeswoman for Kaspersky Lab told FedScoop. “Log events and set up a centralized logging system. Keep everything updated. Also, use whitelisting and default deny policies as much as possible.”
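The "default deny" advice means anything not explicitly approved is blocked. A minimal sketch of that policy, using a SHA-256 whitelist and made-up binary contents (this illustrates the concept, not any particular endpoint product's API):

```python
import hashlib

# Whitelist of SHA-256 digests of approved binaries (contents invented here).
APPROVED = {
    hashlib.sha256(b"trusted-backup-agent").hexdigest(),
    hashlib.sha256(b"trusted-av-scanner").hexdigest(),
}

def may_execute(binary: bytes) -> bool:
    """Default deny: run a binary only if its digest is on the whitelist."""
    return hashlib.sha256(binary).hexdigest() in APPROVED

print(may_execute(b"trusted-backup-agent"))  # True: explicitly approved
print(may_execute(b"unknown-dropper.exe"))   # False: denied by default
```

A default-deny posture is notable against a threat like Regin precisely because the malware went undetected for years: unknown code is blocked even when no signature exists for it.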

Symantec found that targets included private companies, government entities and research institutions across the globe. Regin has been found in more than 10 different countries, with the majority of Symantec’s findings pinpointing the exploit in Saudi Arabia and Russia. The firm also said telecom industry systems have accounted for more than a quarter of reported instances.

A breakdown by country of the instances of the Regin malware exploit. (Credit: Symantec)

Kaspersky took the telecom angle even further, with researchers stating in a blog post that the malware has the ability to intercept cell phone calls and text messages by manipulating antennas on GSM networks:

The ability of this group to penetrate and monitor GSM networks is perhaps the most unusual and interesting aspect of these operations. In today’s world, we have become too dependent on mobile phone networks which rely on ancient communication protocols with little or no security available for the end user. Although all GSM networks have mechanisms embedded which allow entities such as law enforcement to track suspects, there are other parties which can gain this ability and further abuse them to launch other types of attacks against mobile users.

Neither firm could pinpoint the origin of the worm, but they said they have logged instances of Regin as far back as 2008. Both Symantec and Kaspersky said Regin is specifically used for intelligence gathering by a nation-state with the ability to facilitate other types of attacks.

Multiple reports have security experts claiming Regin is the work of the NSA and Britain’s Government Communications Headquarters. A technical analysis published by The Intercept reports that the malware was discovered on European Union systems targeted by the NSA. A Wall Street Journal report said the bug is tied to “Operation Socialist,” an effort detailed in the documents leaked by former NSA analyst Edward Snowden.

Because of its intricacy and the fact that only fragments of the worm were studied, Symantec said other versions may remain undiscovered. The firm plans to continue to examine Regin and will release updates if more information is uncovered.

Both Kaspersky and Symantec have released white papers with more information on Regin.

Bobbie Stempfley explains DHS' new network scanning authority
http://fedscoop.com/bobbie-stempfley-explains-dhs-new-network-scanning-authority/
Tue, 25 Nov 2014 17:45:24 +0000

Bobbie Stempfley, deputy assistant secretary for cybersecurity and emergency communications at the Department of Homeland Security, joins FedScoop TV at FedTalks 2014 to discuss her agency's new authority to scan agency networks for cybersecurity vulnerabilities.

Glitch takes down DOD's open source IT collaboration environment
http://fedscoop.com/glitch-takes-dods-open-source-collaboration-environment/
Tue, 25 Nov 2014 16:11:57 +0000

The Defense Information Systems Agency announced Tuesday that a technical glitch has kept all of its Forge.mil collaboration and project management sites offline since Sunday, and the agency cannot “reliably predict” when the systems might be back online.

Forge.mil is a family of enterprise services, including SoftwareForge, ProjectForge and community collaboration pages that support the Defense Department’s technology community. The Forge.mil service provides for collaborative development and IT project management throughout the full application life cycle and also enables the reuse of open source and DOD community source software.

“On Sunday, 23 Nov, our hosting provider executed an automated script that inadvertently corrupted all Forge.mil systems,” DISA said in an email notification obtained by FedScoop. “We’ve been working with their SysAdmins since then to bring Forge.mil back into service. We were expecting all systems back up Monday afternoon but problems with recovering from our backup systems have prevented that. We continue to work with our hosting provider to escalate issues and apply necessary resources to achieve resolution until all systems are back up and available.”

Forge.mil is designed to improve the DOD’s ability to rapidly deliver dependable software, services and systems in support of net-centric operations, according to an official description of the site.

The Forge.mil community is a collaborative content and knowledge management site for Forge.mil users to connect and share information using social collaboration tools, such as group blogs, discussions, wikis, documents and polls.

SoftwareForge enables the collaborative development and distribution of open source software and DOD community source software.

ProjectForge provides the same application life cycle management tools to DOD projects and programs as SoftwareForge but for programs and projects that are not doing DOD community source development and need to restrict access to specific project members.

Once the main Forge.mil server has been restored, DISA said it plans to post a status message for the other sites on that main page.

NextGen components come to D.C., Texas
http://fedscoop.com/highways-sky-nextgen-components-come-d-c-texas/
Tue, 25 Nov 2014 13:56:54 +0000

The air traffic routes coming in and going out of Washington, D.C.-area airports. (Credit: FAA)

Parts of the Federal Aviation Administration’s Next Generation Air Transportation System are coming to the Washington, D.C., metro area and airports in northern Texas, according to two recent announcements from the agency.

NextGen, a decadelong effort to improve U.S. air transportation, is composed of several elements. The two new additions to the NextGen network use Optimized Profile Descent technology, which allows aircraft to descend from cruising altitude to the runway in a continuous arc.

The OPD systems opened this week and use satellite-drawn departure paths to create specific departure lanes for each airport.

Through OPD, aircraft descend at a steadier pace, instead of the previous stair-step descent system. (Credit: FAA)

Airports in the D.C. Metroplex are now officially equipped with the technology, centered on Baltimore-Washington International Thurgood Marshall Airport, Dulles International Airport and Ronald Reagan Washington National Airport. The system also accommodates and integrates with flights from Joint Base Andrews, Richmond International Airport and other small airports in the region.

“The national capital region is reaping the benefits of NextGen, and this announcement further highlights how the federal government is making a difference,” Transportation Secretary Anthony Foxx said in a statement. “These new and improved highways in the sky mean increased safety, more on-time arrivals and departures, reduced fuel consumption and reduced pollution-causing emissions.”

The FAA also established OPD systems at Dallas/Fort Worth International Airport and Dallas Love Field. In addition, the airports have also developed alternative routes to help planes navigate around inclement weather to maintain normal arrival times.

The Dallas Love Field airspace now contains a dedicated arrival route from the northwest, in addition to global positioning system-based arrival and departure paths.

“Using NextGen satellite-based technology, the FAA and its workforce have collaborated with the industry to convert the busy and complex airspace around North Texas into some of the most efficient in the nation,” Michael Huerta, the FAA administrator, said in a statement. “The result is a solution that not only benefits the National Airspace System, it benefits the aviation industry, the environment and the traveling public.”

With the OPD procedures in place in D.C. and northern Texas, airliners will be able to reduce flights by as many as 1.1 million miles annually in Texas and burn at least 2.5 million fewer gallons of fuel in the skies above D.C.

The announcement of NextGen’s arrival in the D.C. Metroplex comes just a week after the House of Representatives held its first hearing to start the 2015 FAA reauthorization process. During the hearing, the FAA’s inspector general said it would be difficult for the agency to put the full extent of NextGen in place across the country by 2020.

“FAA is in the midst of a multibillion-dollar effort to improve the efficiency of the nation’s air traffic control system through NextGen,” FAA Inspector General Calvin Scovel said at the hearing. “FAA’s acquisition reforms have fallen short in improving the delivery of new technologies and new capabilities.”

OPD is not the first NextGen-related technology to be put in place, either. According to a late October release from the FAA, the ground-based portion of the Automatic Dependent Surveillance-Broadcast technology has been installed at airports across the nation. Also in late October, the FAA and the aviation industry partnered to form Equip 2020, a group led by the nonprofit NextGen Institute, to help the parties get the process on track to “revolutionize the national airspace system,” according to Michael Whitaker, the FAA’s deputy administrator.

Commentary: Cyber threats demand executive not just IT skills
http://fedscoop.com/commentary-cyber-threats-demand-executive-skills/
Mon, 24 Nov 2014 23:00:41 +0000

It seems that every week we read about another cyber incident or data breach on the front pages of online or print news publications. While breaches of banks and retailers are now routinely part of that news, so are more worrisome threats.

Consider the latest acknowledgement from the Department of Homeland Security that Trojan software has successfully penetrated the critical infrastructure of the U.S., dating back to 2011. This is just another indicator of the scale and scope of the constant cyber threat the entire nation is under — and the fact that while business remains the lead target, hackers are actively penetrating the core of American enterprises.

What’s not getting a lot of attention is how top management at organizations continues to treat these incidents as an IT problem rather than a strategic challenge that, among other things, requires the kind of project management resources that routinely go into critical investments.

Businesses are starting to get the message, but government agencies need to as well. A Risk Based Security study of 2013 data breaches found that the business sector was the biggest target of cyber attacks, followed by the government, then health care and education.

Let’s consider two incidents in the government sector. The first incident involved hackers who breached the computer networks at the White House. The second breach occurred at the U.S. Nuclear Regulatory Commission.

While no classified documents appear to have been stolen in either of these breaches, the unsettling fact that hackers were able to penetrate systems in these organizations, which demand best-in-class security measures, speaks volumes about the ability of cyber attackers to crack even the most sophisticated defenses, and about the overall cyber threat level.

A better indicator comes from a Government Accountability Office report released in April. The GAO report disclosed security incidents involving personally identifiable information reported by federal agencies had more than doubled over the past five years to 25,566 in 2013.

Information security incidents involving personally identifiable information reported by federal agencies, fiscal years 2009-2013. (Credit: GAO, April 2014)

Many attribute the frequency of cyber incidents and data breaches to the sophistication of the cyber attacks. But what’s less apparent is the cost of the incidents to organizations and the economy as a whole.

The Ponemon Institute’s 2014 “Cost of a Data Breach” study, released in May, estimated that on a global basis, the mean annualized cost for organizations to respond to cyber attacks was $7.6 million per year. The average in the U.S. was a bit less, at $5.9 million. Those figures are for cyber incidents and data breaches involving 100,000 PII records, not the mega breaches involving tens of millions of records that have received the vast majority of media attention.

All this signals that data breaches are more than a chief information officer or even a chief financial officer issue. They have become a management issue that impacts entire organizations and their partners. As a result, cyber incidents and data breaches are increasingly landing not only on the desks of CEOs and senior management within breached organizations but also on those of their boards of directors.

It’s obvious all organizations — governmental and private sector — must improve their responses to cyber incidents and data breaches, and begin to treat them as a strategic management imperative, not just a forensics and mitigation project.

But think of it another way: How many initiatives have an annual budget of $7.6 million and go without formal management practices being applied?

Multimillion-dollar projects usually call for a dedicated program or project manager. That shouldn’t be optional when managing the host of decisions that must be made in responding to cyber incidents.

For years, the usual response to data breaches fell under the purview of the technology department. But those days are clearly over. The costs, complexity and overall consequences of these events have grown to the point where they now demand — or should elicit — the appropriate attention of the senior executives within most organizations.

Experienced professional program and project managers are beginning to be put in place to manage the complexity of these and related initiatives, and hopefully reduce the overall risk.

Given the complexity, scope and potential costs of cyber attacks, PMs will certainly have their hands full; in many ways, their challenges are far greater than for typical projects, in part because of the span of players that inevitably must respond to a cyber attack. However, their role is essential to keeping organizations focused on the right things, getting those things done correctly and making sure they’re addressed in the proper sequence.

Organizations also must begin thinking about creating a cyber incident response team that can answer sudden requests to respond to suspected or confirmed cyber incidents and data breaches. Legal, communications, public relations, operations and finance departments and, of course, the IT department have become common participants, and all play major roles in responding to the cyber incidents and data breaches that occur today.

While most incidents share some common factors, the truth is, each also has unique characteristics that influence the way the organization responds to and manages these events.

This much you can count on: Sooner or later, your organization will get attacked, it will take time to respond and recover, and it is going to cost a fair amount of money.

Cyber incidents and data breaches are a fact of the modern online, technologically sophisticated and connected world in which we live and work. Failure to enact formal response practices, including project management disciplines, in response to these costly events is clearly a material weakness that must be rectified.

As one management consultant put it, it’s not hard to understand how a $7.6 million project mysteriously becomes a $12 million disaster when proper project management is not applied.

The biggest challenge we all face at this point is the limited number of project managers who have actual experience dealing with the challenges of a cyber incident or data breach. That said, given the frequency of successful attacks, it won’t be long before organizations get the message and the shortage of cyber response project managers begins to correct itself — hopefully sooner than later.

Kevin Coleman is a senior level technology strategist, project and program manager, and cybersecurity adviser with experience leading multimillion-dollar projects across multiple industries. He is the former chief strategist of Internet pioneer Netscape and began his career as a management consultant at Deloitte.
