Taking a page for cybersecurity from the counterterror playbook

The Obama administration is taking a page out of the counterterrorism playbook to help it secure the nation against hackers, cyberspies and other online threats, a White House official said Tuesday.

Just as the National Counterterrorism Center is an all-source intelligence fusion hub for terrorist threats, so the new Cyber Threat Intelligence Integration Center, or CTIIC, is for cyber threats, Andrew Grotto, senior director for cybersecurity policy at the White House National Security Council, told FedTalks.

“From threat summaries to in-depth analysis, CTIIC has become the place to go for policymakers” needing actionable intelligence, Grotto added.

And CTIIC was just one example of the way the administration was tying together the roles of various federal agencies, he said, learning from its experience coordinating counterterrorism policy.

“One of the key lessons here is developing muscle memory between and across federal agencies,” Grotto explained, so that in a crisis, everyone knows what their role is.

That’s why President Barack Obama in July signed Presidential Policy Directive 41 on cyber incident coordination — “Improving our ability to respond [to] and recover from cyberattacks,” as Grotto put it.

He reminded a packed house at the Sidney Harman Hall in Washington that, when Obama was first campaigning to be president, the iPhone was brand new, and we lived in a very different technological world.

Over the eight years of the Obama administration, he said, “I’ve watched cybersecurity evolve from a niche subject of concern to intelligence agencies and IT specialists, into a major national issue.”

The secret to USDS, 18F hiring success

The 18F and U.S. Digital Service teams have each grown to more than 200 members — most of them young technologists early in their careers.

During a time when the federal government struggles to attract, hire and retain a young, talented IT workforce to replace those baby-boomer public servants retiring in record numbers, the leaders of these two organizations took the stage Tuesday at FedScoop’s FedTalks 2016 to pass on the secret to their hiring success over the past few years.

USDS founding member Haley Van Dyck remembers when her team was just six or seven people with a big idea to help fix government. 

“We weren’t sure we’d even be able to recruit 20 people to come and join this team,” said Van Dyck, whose team hit the 200-member mark just this week. “It was an experiment — we just wanted to see if this would be able to work.”   

Eric Hysen, another early USDS member who’s since gone on to head a team outpost within the Department of Homeland Security, said recruitment is “something that we spend a lot of time — USDS overall and our team at DHS — working on.” 

Hysen’s own family in a sense serves as a microcosm of the changing state of the federal government’s ability to bring in new tech talent. His father is a career-long General Services Administration employee who’s set to retire in the coming years, whereas Hysen is just beginning his own tour of duty.

“The path that he took and the path that I’m taking, as well as the people we are recruiting, could not be more different,” said Hysen, who participated in a FedTalks panel with 18F’s Aaron Snow.

And that’s because both the modern work environment and recruiting process have largely changed, the panelists said. Private technology firms are able to offer relatively massive salaries and sexy office spaces loaded with perks — things the U.S. government just can’t offer.

Those companies can also move much more quickly to hire available talent.

“You need to be able to actively go out and recruit. The types of people that the government needs in many cases aren’t browsing USAJobs looking for listings,” Hysen said. “They need to be found where they already are, and then be able to bring them on very quickly. You’re competing with processes from private companies that might try to bring somebody in in a week or two.”

So what does the federal government have that private companies can’t offer? 

Impact is the most common element that attracts technologists to join federal digital services teams like 18F and USDS, said Snow, who just stepped down from his post as executive director of 18F last week.

“Comfy environment [or] not comfy environment, all the right tools [or] not all the right tools — the common thread I think that brings great folks into this endeavor is: Am I going to be able to be productive and have an impact and serve?” he said. 18F’s success in recruiting has been linked to its ability to answer yes to that question, “no matter what the room looks like or what kind of computer you give them.”

“Ultimately it’s, ‘If I come, is it going to be worth it, and am I going to be able to make change?'” Snow said. “Folks in this business, they’re all impact junkies.”    

Snow stole the words out of Hysen’s mouth. 

“The process, the workspace, are what a lot of people, when they’re saying how to attract digital talent, go to first,” he said. “And it’s important, but it is so insignificant compared to giving them an environment that allows them to get things done, that allows them to question sources of authority, allows them to truly have the type of impact that a lot of tech companies like to promise but really do not deliver on anywhere close to the scale of the federal government.”

Microsoft slates new cloud centers dedicated to Defense Department

This article was updated to correct the location and timing of Microsoft’s rollout of its Azure Government cloud regions.

Microsoft plans to open two new regional Azure cloud computing facilities dedicated exclusively to the Defense Department, company officials announced Tuesday.

The new cloud facilities will be the first in the nation to operate as physically isolated data centers dedicated for the exclusive use of the DOD and able to meet its most demanding “Impact Level 5” cloud security controls, according to Microsoft officials.

The two new DOD-only facilities will be located physically near, but isolated from, existing Azure Government regional facilities in Iowa and Virginia, and are expected to be operational by the end of 2016. The facilities will host Microsoft’s Azure Government and Office 365 US Government Defense, and will also enable Defense Department agencies and qualified contractors to manage DOD National Security System data.

Microsoft also announced it will open two additional Azure Government cloud regions in Texas and Arizona slated to come online sometime in 2017, bringing to six the total number of dedicated government cloud centers operated by Microsoft.

According to Jason Zander, Microsoft’s corporate vice president for Microsoft Azure, nearly 6 million users across more than 7,000 federal, state and local government customer accounts now use Microsoft’s Government Cloud.

A key factor driving Microsoft’s growth in the government sector has been the company’s ongoing pursuit of demanding security certifications for its cloud services for federal customers and other regulated industries.

“Microsoft has signed Criminal Justice Information Services agreements in 23 states, covering more than 60 percent of the U.S. population,” Zander said. “This is nearly six times the number signed by our nearest cloud competitor.”

Another factor is Microsoft’s widening embrace of open source software platforms that work in the cloud.

“The cloud has to meet people where they are,” Zander said in an interview with FedScoop.

He pointed to Microsoft’s recent agreement with Red Hat that permits many government customers using Red Hat Enterprise Linux on premises to “lift and shift” their applications and data to Microsoft’s Azure cloud platform.

Another less recognized factor is the extent to which Microsoft is adding new features to its cloud services on a regular basis.

“We’ve been shipping new features every 32 hours over the last four months,” Zander said.

That’s especially challenging with cloud certification programs like the Federal Risk and Authorization Management Program, which require providers to go through a reauthorization process when cloud services change in a material way.

Zander, however, pointed most enthusiastically to the benefits government agencies are seeing from the speed and scale of hyperscale computing now available through the cloud, along with the emerging potential of artificial intelligence.

Microsoft now operates 38 regional cloud centers worldwide, he said. “Some are the size of 16 football fields … with hundreds of thousands of servers and zettabytes of storage.”

That makes it possible to “translate the entire contents of the Library of Congress to another language in just two seconds,” Zander said.
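For scale, a rough back-of-envelope check of that claim is possible; the corpus size below is a common public approximation, not Microsoft’s figure.

```python
# Rough sanity check of the claim, assuming a common public estimate
# that the Library of Congress's print collection is on the order of
# 20 TB of plain text (the exact figure is debated).
library_of_congress_tb = 20   # assumed corpus size, in terabytes
translation_seconds = 2       # time claimed in the quote

implied_rate = library_of_congress_tb / translation_seconds
print(f"Implied processing rate: {implied_rate:.0f} TB/s")
# ~10 TB/s of text -- a rate only reachable by fanning the work out
# across a very large number of servers in parallel.
```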

That capability, combined with “deep learning” programs, is bringing the world of computing to “what I think is a new inflection point,” Zander said.

Feds need clarity on cyber structures

The federal government needs to get its act together on cybersecurity, and there needs to be a public debate about the proper role for agencies like the Department of Homeland Security and the National Security Agency, public and private sector leaders said Tuesday.

“We really need to define what we want our government to do in cybersecurity,” former Rep. Mike Rogers told a packed auditorium in Washington at FedScoop’s annual FedTalks event. “We have lots of capability. The NSA has lots of capability.”

By giving DHS, rather than NSA, the lead in defending civilian government networks and working with the private sector to protect the nation’s vital industries, the U.S. had “take[n] our best players off the field,” complained Rogers, who chaired the House Permanent Select Committee on Intelligence.

“Candidly,” Rogers said, that decision “was politically driven and not policy driven. People were a little nervous about having NSA … dealing directly with them” and their networks — even companies that had a prior relationship with the NSA or the Pentagon were nervous.

Despite the ongoing furor over the government’s role in protecting the private sector from foreign state-sponsored cyberattacks, speakers said there wasn’t clarity about the respective roles of intelligence and law enforcement agencies in the cybersecurity space.

There is an inherent difference in — even conflict between — the missions of intelligence, law enforcement, and network defenders in the federal cybersecurity space, RSA President Amit Yoran said in a morning keynote.

Intelligence agencies “watch … and want to keep watching” malicious actors in cyberspace, Yoran said, whereas law enforcement agencies want to watch only to gather evidence to prosecute them. Meanwhile, network defenders “may not care at all who attacked them,” he said.

“We need more clarity about roles, responsibilities and authorities between agencies,” Yoran concluded.

“We have not yet fully engaged — on the public side — [in a discussion about] what we want our government to do. How engaged do you want the NSA to be in defending private sector networks?” Rogers added.

“Is DHS the agency that should be in the lead [in cyber defense] in the U.S., given the level of threats? We could probably debate that for an hour and a half,” he said.

Rogers said that while working on cyber threat information sharing legislation in the last Congress, the intelligence committee had conducted a great deal of outreach to the private sector to see whom companies would prefer to deal with in the government when it came to cybersecurity.

“Candidly … we did not find one example of someone saying, ‘Yes, I want to deal with DHS,’” he said.

Another example of an unresolved issue, speakers said, is the dual-hatted job that Adm. Michael Rogers has as director of the NSA and commander of U.S. Cyber Command.

“This structure is now over six years old,” Adm. Rogers said, joining the former Congressman of the same name for a cybersecurity chat at FedTalks.

“The reason we got this structure is, we were building Cyber Command and we wanted to harness the … significant investments the Department of Defense had already made in cyber … at the NSA,” Adm. Rogers said.

There have recently been moves, both in Congress and in the executive branch, to separate the two jobs, and give Cyber Command its own commander.

“My position has always been, this is the right thing to do at the wrong time,” said Adm. Rogers, adding, “It’s a reflection of the maturation of Cyber Command that we’re even having this discussion.”

“The challenge is: What’s the right time, what’s the right process, so that [we do it] with minimal risk,” he concluded.

“I have candidly been going back and forth on this issue,” said the former congressman. “The only thing I worry about is [if we split it up] does Adm. Rogers [of Cyber Command] have to talk to Director X of NSA to perform the same function he does today. If we can’t eliminate that question then I’m not sure I can support it.”

“We probably don’t have this right just yet,” he finished.

 Yoran called out the General Services Administration’s FedRAMP cloud security certification process as a successful effort to raise the cybersecurity bar in the federal government.

“It was painful at first, but it is driving security requirements into next generation of [IT] infrastructure,” he said.

Cyber takes center stage at FedTalks 2016


There’s a greater need than ever for cybersecurity to play a central role in discussions at the highest levels of government, federal IT officials and industry tech executives said Tuesday at FedScoop’s FedTalks 2016.

Look no further than the appointment of Greg Touhill as the first U.S. chief information security officer a month ago as the embodiment of this heightened demand for greater collaboration and focus on government IT security.

Touhill, named to the U.S. CISO role in September, keynoted the afternoon session of FedTalks, laying out his concerns with the security of the current federal IT enterprise and what initial steps must be taken to begin remediating those weaknesses — by educating the workforce, modernizing systems and collaborating with those outside of government, among other things.

“Life is full of risk,” Touhill told a packed house at Sidney Harman Hall in Washington, D.C. “You can never get to zero risk — but you can manage it.” 

Ann Dunkin, CIO of the Environmental Protection Agency, felt similarly about her agency’s ability to prepare for, fend off and mitigate cyberattacks. 

“You’ve … got to assume that you’ve been hacked,” Dunkin said in a fireside chat. “Now I’m not saying that you have been — I’m saying that you have to behave as if you have been. And you’ve got to have all of the abilities to detect and respond and mitigate those issues so that when you do have an issue, you can very quickly resolve it and you can mitigate the damage.”

Cybersecurity, she explained, is “just a constant arms race to keep up with.”

Touhill also described the importance of strong cyber hygiene on the front lines of federal systems, and how that no longer applies just to agency IT personnel.

“Frankly, the entire workforce are now part of what I consider to be the cyber front lines,” Touhill said. That workforce, he went on to say, “is our greatest asset and our weakest link.”

Cybersecurity, as it stands, is often an afterthought and not readily digestible to the average federal employee. But Touhill wants to change that, making it so simple that anyone can understand it.

“I have found that the best goals are the ones that are simple, concise and easy to understand,” he said.

Adm. Mike Rogers, the director of the National Security Agency, who spent much of his FedTalks panel talking about more technical elements of IT and national security, reiterated Touhill’s point.

“Never forget the human dynamic on all of this,” he said.

Touhill hasn’t been on the job long, but he already has ambitions to form a CISO advisory panel, which he doesn’t want to limit to just his counterparts at federal departments and agencies, he explained Tuesday. His idea is to open that forum to state, local, academic and private sector members as well.

“I think if we’re having a closed conversation within the federal community, we’re not hearing all the voices that need to be heard,” Touhill said.

And as the presidential election and transition near, the importance of federal cybersecurity — and that of the greater public — should only grow, said Rep. Gerry Connolly, D-Va.

“This is really the first political campaign where we’ve seen cyber front and center,” Connolly said.

U.S. CIO: Think beyond the org chart

U.S. CIO Tony Scott made a plea Tuesday to agency officials to build systems that reach across organizational boundaries and keep the customer in mind.

“I want you to… look at the information systems architecture of your organization, what its boundaries are, the scope of the infrastructure and the applications,” Scott said at FedScoop’s 2016 FedTalks. “If it exactly matches the org chart of your agency, you know you’re in trouble.”

Scott said this “organizational paradigm” is a telltale sign of legacy government systems — and that’s why he wants legislation to push agencies to modernize their systems.

He noted that shifting this paradigm and modernizing legacy systems is particularly important for cybersecurity, something he has mentioned in past talks.

[Read more: Tony Scott: Time to change the ‘bubble wrap’ paradigm]

“Ninety-five percent of what we do exactly mirrors the org chart of the federal government. In a digital world that doesn’t make sense any more,” Scott said in September at the Billington Cybersecurity Summit in Washington, D.C. “We were expecting the Marine Mammal Commission to do the same kind of job the [Defense Department] does for protecting its systems and networks.”

The Modernizing Government Technology Act, which passed the House of Representatives in September, would create IT working capital funds in each agency and a centralized fund from which the agencies that most need it could draw.

[Read more: IT Modernization bill passes House unanimously]

Scott said the idea to create a centralized modernization fund has “actually gotten better” as it has evolved through the legislative process from the first bill on the subject, which would have created a $3.1 billion centralized modernization fund.

“This IT Modernization Fund, or now MGT, is a better way of doing things,” Scott said. “What it allows is for agencies to come together and say, ‘here’s some common things that we can do, or we’d like to do, or we should do together,’ and build common infrastructure, and shared services upon which applications and citizen-serving capabilities can be built.”

Scott noted the bill doesn’t actually include money for the funds, but he said “that comes a little bit later.”

“But the construct and the framework of it is something that I think is quite workable,” he said.

Scott encouraged the audience to talk to senators about the bill and noted that “everyone understands what we’ve been doing hasn’t worked so well, and we need a new model or a new paradigm for this.”

An information system architecture that has a structure reflecting the organization shows an official has “not taken a customer-centric approach,” Scott said.

“You shouldn’t have to know how an enterprise is organized in order to find information or to get served or to do whatever it is you’re trying to do,” he said.

Scott added: “Let’s move on to a customer-centric model that leverages broad cloud infrastructure, broad cloud shared services, and then let’s focus the precious resources that we have in individual agencies on doing the mission-specific, mission-critical things that only those agencies can do.”

Federal CIOs prepare for the transition

Many federal chief information officers are preparing for the inevitable on Jan. 20, when those who are political appointees will more than likely step away from their leadership roles in federal IT.

“There’s actually a relatively small number of CIOs who are political,” Environmental Protection Agency CIO Ann Dunkin said at FedScoop’s FedTalks 2016. “So there seems to be a lot of continuity.”

Despite that, Dunkin noted that agencies such as hers with a politically appointed CIO “are going to be more impacted.”

“I think we’ve all been trying to prepare our agencies,” she said. “I know I have been trying to prepare my own.”

One move Dunkin made was hiring a permanent deputy CIO a few months ago.

“You know I hired him with the transition in mind, you know, someone who can carry the agency forward for potentially two years,” she said.

Department of Veterans Affairs CIO LaVerne Council has been instilling confidence in her staff prior to her departure next year, particularly by building an IT framework that has all the pieces necessary for success in her absence.

“What’s going to happen to us?” she said her staff asks. “Nothing’s going to happen. We have a plan. You navigated it. You’ve seen that you can. You understand what’s next.”

She added: “If that plan existed when I arrived, I would have gotten on board with you and helped make it even better. Now it’s here — you go.”

Dunkin said her office has worked to build structures across the EPA that will keep the momentum going.

In particular, she said she doesn’t want modernization work “to fall by the wayside,” in a new administration, but she said she doesn’t think that will happen.

“I think that any administration that comes in is going to recognize the importance of IT modernization,” she said, noting it was a nonpartisan issue.

“No matter who’s in the White House, we’ll be able to successfully carry those things forward,” she said.

Council said the transition “is no big deal,” and she expects her successor to lead similarly when the time comes.

“You don’t restart — you just regain that energy and help them and guide them and move them forward,” Council said.

Ever since Council took over as CIO last May, she’s had the January end date in mind with no intention of leaving any sooner, despite the mountain of adversity she’s seen while in office.

“I’ve been saying from the beginning, I’m going to be here until Jan. 20,” Council said. “That was the commitment that I made — that was the tour of duty, and I take that tour and my responsibility seriously.”

In fact, she’s endured 15 hearings in a year’s time in Congress’ endless hunt for answers to VA’s history of IT and cybersecurity weaknesses, which predate Council’s time at the department.

“Would I have ever thought that that was even probable? No, I did not. Did I walk into any of those hearings thinking I wouldn’t be successful? No I did not. The fact is I had a team of people prepping me and telling me what I need to know, making sure I was OK.”

Council said, “For us the veterans are everything, and that mission propelled me to do any hearing, any adversity, any innovation.”

Billy Mitchell contributed to this report. 

ISACA urges centralized cyber regulation for next president

The next president should move urgently to centralize cybersecurity regulation of key industries and modernize federal IT to help secure it from online threats, one of the largest associations for information security professionals said Monday.

ISACA, previously known as the Information Systems Audit and Control Association, is out with a list of five “top critical cybersecurity priorities” that the incoming president needs to focus on within the first 100 days — a common benchmark for urgent achievements in a new presidency.

ISACA, which boasts 140,000 members throughout the world across various IT governance disciplines, highlights five broad issue areas — among them centralized regulation of key industries, federal IT modernization and workforce development — but provides little by way of specifics.

The priorities line up in some measure with those outlined by U.S. Chief Information Security Officer Greg Touhill, who spoke about his forthcoming plans at the AFCEA Cyber Summit in Washington, D.C., this month.

He too stressed the importance of human capital issues, and has been working with the Department of Education to help build the future cybersecurity workforce.

But the other measures he outlined — including “new capabilities that have not been there before; such as actively looking with hunt teams through .gov for hackers, … improv[ing] our pen testing, … incorporating software assurance and perhaps a bug bounty across the federal government” — don’t appear anywhere in the ISACA list.

Meet the U.S. Data Federation: A new hub for standardized, coordinated open data

The General Services Administration is working to create a place where data providers can go to see if their data fits into a set of standards others might be already using.

This new effort, the recently announced U.S. Data Federation, is a step forward in the open data movement toward not just publishing data on Data.gov but also coordinating it around specific topics so that it is interoperable and standardized, experts say.

Philip Ashlock, chief architect of Data.gov, talked to FedScoop about that goal and the overall vision for the federation, which launched in late September as a place data publishers can look for examples of successful standardized multi-agency data initiatives.

Future plans for the federation include tools to help publishers coordinate their efforts and use a preferred data standard, and a maturity model to monitor the progress of some of these initiatives, Ashlock said.

“Part of the challenges within government is just knowing that these initiatives exist, knowing what the technical details are as far as data specifications or standards around them,” Ashlock told FedScoop. “It’s increasingly more of a coordination challenge when you’re not just talking about federal agencies but potentially working with state and local governments as well.”

Data.gov is not only a catalogue of federal agency data — it has recently been getting more data from the state and local levels as well.

“The main concept is it’s a catalogue to explore open data resources from across government,” Ashlock said of Data.gov. “The U.S. Data Federation, on the other hand, is actually to identify and highlight initiatives that are focused on specific problems using data from multiple sources.”
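For a sense of what “a catalogue to explore open data resources” means in practice, Data.gov’s catalog is built on the open source CKAN platform, which exposes a public search API. A minimal sketch, assuming the standard CKAN endpoint at catalog.data.gov:

```python
import requests

# Search the Data.gov catalog via CKAN's standard package_search action.
CKAN_SEARCH = "https://catalog.data.gov/api/3/action/package_search"

def search_datasets(keyword, rows=5):
    """Print the first few datasets matching a keyword, with publishers."""
    resp = requests.get(CKAN_SEARCH, params={"q": keyword, "rows": rows})
    resp.raise_for_status()
    result = resp.json()["result"]
    print(f"{result['count']} datasets match '{keyword}'")
    for dataset in result["results"]:
        org = dataset.get("organization") or {}
        print(f"- {dataset['title']} ({org.get('title', 'unknown publisher')})")

if __name__ == "__main__":
    search_datasets("air quality")
```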

The U.S. Data Federation was launched in conjunction with the White House’s first Open Data Innovation Summit, and Ashlock said it will help officials “contextualize [their] data publishing into these broader initiatives but also to show that there’s a particular way to do it that’s kind of considered a best practice, or that there may even be some requirements around.”

A long-term goal, Ashlock said, is to develop a “maturity model” to show where initiatives are in achieving their goals and what “the next phase … should be for those involved.”

“The concept … of data federation is basically how do you coordinate among multiple data publishers so that you can pull all the data together in one place so that it’s sort of one cohesive whole?” Ashlock said. “So this gets around sort of the concept of data standardization, or just the basic coordination of how information is published.”

One of the biggest lingering obstacles in the open data movement is siloed data, Socrata CEO Kevin Merritt said.

In an interview with FedScoop, Merritt explained how that problem is caused in part by the way government program funding works — that it often doesn’t include money for interoperability efforts.

“There’s an enormous data silo problem and it’s real and it exists in every government,” Merritt said. “And there’s no silver bullet for getting the data out of those systems; it takes work, it takes effort, it takes people to go in there and connect to those underlying systems and build conduits to get the data from those data silos into an environment where the data can be shared externally.”

When a new program gets funded in government, often new systems are created to support it, Merritt said.

“Those systems were never designed to talk to each other,” he said. “And if you want to do some analysis that has data from three or four different programs, it’s actually really hard to do unless you have got some sort of way to kind of stage the data and pull it together.”

Hudson Hollister, founder and executive director of the Data Coalition, said it is exciting to see the GSA put an emphasis on standardization.

“Data.gov up until now has been about publishing data sets, but not about standardizing them. Without data standards, published data might not be useful because it’s got to be extensively translated and transformed to be comparable across different agencies,” Hollister said. “It looks as though with the announcement of the U.S. Data Federation, GSA is recognizing that.”

When publishing data sets, agencies should look to see if a standard structure or format exists that they can use, Hollister said, “because that makes the data sets more likely to be comparable with things other agencies or other offices have published.”
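As a hypothetical illustration of why that matters, consider two agencies publishing the same measurements under different column names; a shared standard is just an agreed mapping into one schema (all names and values below are invented):

```python
import csv
import io

# Two agencies publish the same facts under different column names.
AGENCY_A = "facility,city,pm25\nPlant 1,Denver,12.1\n"
AGENCY_B = "site_name,municipality,pm2_5_ugm3\nPlant 2,Boise,9.4\n"

# Per-publisher mappings onto an assumed common standard.
MAPPINGS = {
    "agency_a": {"facility": "site", "city": "city", "pm25": "pm25_ugm3"},
    "agency_b": {"site_name": "site", "municipality": "city",
                 "pm2_5_ugm3": "pm25_ugm3"},
}

def standardize(raw_csv, mapping):
    """Rename each record's fields into the shared schema."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    return [{mapping[col]: val for col, val in row.items()} for row in rows]

combined = (standardize(AGENCY_A, MAPPINGS["agency_a"]) +
            standardize(AGENCY_B, MAPPINGS["agency_b"]))
for record in combined:
    print(record)  # every record now uses the same field names
```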

Data.gov’s focus, Hollister said, was “let’s get as much stuff published as we can, and at least start people thinking about data publication and using the data, or trying to.”

“Moving towards standardization is really the next stage,” he said. “By highlighting standardization projects in specific verticals, the U.S. Data Federation encourages agencies that are publishing open data sets to pay attention to those standardization efforts and maybe put in place a preference for following whatever data standardization effort is in their vertical.”

The federation’s focus on getting initiatives with standards that reach across more than one agency is “the right framing,” Hollister said, particularly in the context of making data more useful to agencies for their own use.  

Hollister said government data has the most value to its own internal users to help them make better decisions. And so the next phase of the open data movement will be where agencies are using open, standardized data from across government to make decisions.

When thinking about the federation’s audience, Alex Howard, senior analyst for the Sunlight Foundation, told FedScoop it is important that the federation is aimed at third parties who might reuse the data.

“The third-party reuse is where the greatest amplification and opportunity to inform people comes from,” Howard said. “That’s why it matters to get the people who build products, who do data science… the ones who know the data that they need and can put it to use.”

“Orienting the federation at those people is critical, just like orienting any data portal at those people is critical because you want to make sure the reuse happens so that there’s an exposure to what the data can provide at the point of decision,” Howard said.

Howard said having something like the federation that is provider-focused and emphasizes standardization is “critical.”

The goal for the federation, Ashlock said, is that government information feeds into a “national strategy that allows tools and applications to be developed that work nationally, as opposed to just for that one agency, or just for that one local government.”

“I think it’s kind of a point of maturity in the open data space where we’re not just talking about publishing data, but being a little bit more coordinated and thoughtful about how we do that at a national scale,” Ashlock said.

The DevOps and security tribes need to come together

Security professionals need to “stop resisting the empathy that comes with teamwork” and embrace their colleagues and partners from the DevOps community, argues white-hat hacker Josh Corman.

“The DevOps tribe is willing to give us a big gushy hug,” Corman said Friday at AppSecUSA 2016, the annual gathering of the Open Web Application Security Project — an online community of developers devoted to building more secure software.

DevOps is the management philosophy that combines IT development and IT operations, and typically employs agile design methods to deploy new software iteratively in what critics deride as a “permanent beta.” Generally, DevOps is seen as prioritizing flexibility, speed and time to market, and as being opposed to, or derisive of, security.

“The typical approach,” said one participant in a recent federal IT forum, to laughs of recognition, “is ‘We don’t need to have all this security risk management stuff, we don’t need to have cybersecurity, we need a solution now.’”

Corman made what was effectively a pitch to the security tribe to make peace with DevOps. OWASP, one of the oldest and most established volunteer security organizations, produces consensus open standards and develops best practices.

Corman said mutual misunderstanding between the two tribes was a matter of language as much as anything else: “You call it mitigation and patching; they call it unscheduled critical work.”

Either way, “It is time and effort doing something that adds nothing to the bottom line,” he said, referring to the time spent mitigating major vulnerabilities like Heartbleed.

Maintaining good security hygiene and following security best practices reduces the amount of time and effort required, Corman said, because it makes the whole application ecosystem more secure.

He said an average of 106 open source programs or software libraries are incorporated into a typical app, and pointed out that a single vulnerability in one of them might be present throughout a whole sector via a piece of industry-standard software.

“Just one vulnerability in JBOSS [software allowed hackers to] shut down Hollywood Presbyterian Hospital. One vulnerability can do that and because our [software] hygiene is so poor, we have hundreds and hundreds more out there,” he said.
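To make that point concrete, here is a minimal sketch of the kind of hygiene check Corman is calling for — scanning a portfolio of projects for a component pinned below its first patched release. The package name and version are placeholders, and real tools such as OWASP’s Dependency-Check do this far more thoroughly:

```python
import os
import re

# Placeholder values: a hypothetical vulnerable library and the first
# release that fixed the flaw.
VULNERABLE_PACKAGE = "examplelib"
FIRST_FIXED_VERSION = (2, 4, 1)

VERSION_RE = re.compile(
    rf"^{VULNERABLE_PACKAGE}==(\d+(?:\.\d+)*)", re.MULTILINE)

def parse_version(text):
    return tuple(int(part) for part in text.split("."))

def scan_projects(root):
    """Walk a directory of projects and flag unpatched dependency pins."""
    for dirpath, _, filenames in os.walk(root):
        if "requirements.txt" not in filenames:
            continue
        with open(os.path.join(dirpath, "requirements.txt")) as f:
            match = VERSION_RE.search(f.read())
        if match and parse_version(match.group(1)) < FIRST_FIXED_VERSION:
            print(f"{dirpath}: pins {VULNERABLE_PACKAGE} "
                  f"{match.group(1)} (older than the fix)")

if __name__ == "__main__":
    scan_projects(".")
```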

Only by improving software production practices and securing the software supply chain could DevOps teams achieve their goals, he said. And they were now seeing that clearly — hence the hug.

The security tribe had to stay true to its principles, though. “We have to raise the alarm without being alarmist,” he said.