DHS, DISA cyber chiefs: Network monitoring is still ‘a challenge’

Maintaining continuous monitoring of large computer networks remains a challenge for government departments despite a Homeland Security program to supply commercial, off-the-shelf cybersecurity tools to federal agencies at no cost, two cybersecurity officials said.
Speaking on a panel at the Security through Innovation Summit, sponsored by Intel Security, Roger Greenwell, chief of cybersecurity for the Defense Information Systems Agency, underscored the logistical difficulty of keeping a constant eye on a diverse enterprise.
The Department of Defense “is an enterprise of enterprises. Every service has their own unique way of doing things,” Greenwell said.
The proliferation of different software and legacy systems across the DoD makes traditional Continuous Diagnostics and Mitigation, or CDM, approaches such as automation difficult to apply, he added.
“What we’ve really tried to do is leverage automation, and to be honest, it’s a challenge… [DoD] is such a heterogeneous environment. We have one of everything. Someone made it, DoD bought it, and we might still be running it today.”
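To make that challenge concrete: before anything can be monitored continuously, automation has to normalize asset data arriving from many dissimilar sources. Here is a minimal, hypothetical sketch of that normalization step in Python; the feed names, field layouts and records below are invented for illustration, not drawn from any DoD system.

```python
from dataclasses import dataclass

@dataclass
class Asset:
    """Common schema every inventory feed is normalized into."""
    hostname: str
    os: str
    last_seen: str  # ISO-8601 timestamp

def normalize(record: dict, source: str) -> Asset:
    # Each (hypothetical) feed names the same fields differently;
    # writing per-source adapters is exactly the automation burden
    # Greenwell describes in a "one of everything" environment.
    if source == "legacy_scanner":
        return Asset(record["host"], record["platform"], record["seen_at"])
    if source == "endpoint_agent":
        return Asset(record["fqdn"], record["os_name"], record["checkin_time"])
    raise ValueError(f"no adapter for source {source!r}")

feeds = [
    ({"host": "app01", "platform": "Solaris 10",
      "seen_at": "2016-04-14T09:00:00Z"}, "legacy_scanner"),
    ({"fqdn": "ws42.example.mil", "os_name": "Windows 7",
      "checkin_time": "2016-04-14T09:05:00Z"}, "endpoint_agent"),
]
inventory = [normalize(rec, src) for rec, src in feeds]
print(inventory)
```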
One solution is a return to basics. The DoD Scorecard initiative, under which each service or agency within the sprawling department receives a cybersecurity grade based on its fulfillment of core security tenets, has made it easier for DISA to gauge performance and has helped hold individual services accountable for simple, effective defenses, he said.
“[The scorecard] ties back to the Cybersecurity Discipline Implementation Plan, to get back to some of these basic principles which, if they’re taken care of, we can mitigate many of our systems from the attack surface,” Greenwell said.
But internal hurdles are only the beginning. Shaun Khalfan, chief systems security officer at DHS’ Customs and Border Protection, said that the growth of third-party threats puts a heavy strain on CDM programs.
“We have to screen massive data: airplane manifests, millions of passengers, deliveries. We have a lot of third party customers, and breaches are now coming from third party entities,” Khalfan said. “With the advanced threats coming out there, how do you start to tease that out from the noise? You’re not looking for a needle in a haystack, you’re looking for a needle in a stack of needles.”
CDM is further complicated by the sheer quantity of potential security controls available, making it difficult to determine which areas to prioritize.
“We’re looking at the NIST 800-53 family and saying, what should we be monitoring? Which of these should we be focusing [on] from a CDM perspective?” said Greenwell. “With 900 controls as part of the catalog, we have to select the controls that are most applicable — try to figure out what data is out there that can come back and give you some level of assurance that your data is under protection.”
“There’s definitely room for improvement” in the process, he added.
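For a sense of the triage Greenwell describes: NIST SP 800-53 groups its controls into families (access control, configuration management and so on), and a CDM program has to work out which controls automated telemetry can actually speak to. Below is a toy sketch of that selection in Python, using a three-entry stand-in for the real catalog; the `machine_checkable` flag is an invented simplification, though the control IDs and families are real 800-53 entries.

```python
# Hypothetical miniature of the NIST SP 800-53 catalog; the real catalog
# runs to hundreds of controls spread across ~18 families.
catalog = [
    {"id": "AC-2", "family": "Access Control",           "machine_checkable": True},
    {"id": "CM-8", "family": "Configuration Management", "machine_checkable": True},
    {"id": "PL-4", "family": "Planning",                 "machine_checkable": False},
]

# CDM prioritization in miniature: keep only the controls for which
# automated telemetry can actually provide some level of assurance.
monitorable = [c["id"] for c in catalog if c["machine_checkable"]]
print(monitorable)  # ['AC-2', 'CM-8']
```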
House hearing will focus on cyber-incident response at State, Treasury
The Departments of State and Treasury will be in the hot seat next week during a planned House Committee on Oversight and Government Reform subcommittee hearing on federal cybersecurity practices.
In a memo obtained by FedScoop, full committee Chairman Rep. Jason Chaffetz, R-Utah, says the hearing will focus on detection and mitigation efforts related to two high-profile security incidents: the gaps found in the security of the State Department’s Consular Consolidated Database and the vulnerabilities revealed in December in Juniper Networks’ firewall software. Treasury is one of many federal agencies that were using Juniper’s software, known as ScreenOS, at the time.
Among the witnesses expected at the hearing are Treasury Chief Information Officer Sonny Bhagowalia, State CIO Steven Taylor, and Homeland Security Department’s Assistant Secretary for Cybersecurity and Communications Andy Ozment.
Last month, ABC News published a report stating that experts found security gaps in the State Department database that could have allowed hackers to doctor visa applications or pilfer sensitive data like photographs, fingerprints and Social Security numbers. That database — which holds biometric data from almost everyone who has applied for a U.S. passport or visa in the past two decades — has had technical issues in the recent past, going offline for significant periods in 2014 and 2015.
[Read more: Congress demands info on Juniper backdoor]
In the wake of the Juniper vulnerability, the House Oversight committee sent letters to various government agencies asking whether their IT security teams had patched systems running ScreenOS. Last December, the company discovered “unauthorized code” that would allow sophisticated hackers to control the firewalls of unpatched Juniper products and decrypt network traffic.
The company’s products are used by a number of government agencies, including the departments of Defense, Justice and Treasury.
Also scheduled to appear at the hearing are Richard Barger, chief intelligence officer for ThreatConnect, and Charles Carmakal, vice president at Mandiant. Barger previously served as a U.S. Army intelligence analyst, while Carmakal’s expertise is in large and complex security incident response.
The hearing is scheduled for Wednesday, April 20 at 9:30 a.m.
VA teases plans for new ‘state-of-the-art’ digital health platform
The Department of Veterans Affairs hopes to unveil plans for its agile, “state-of-the-art” digital health platform later this year, CIO LaVerne Council said in testimony.
While the hearing before the House Committee on Veterans’ Affairs’ Subcommittee on Health focused largely on VA’s planned technical improvements to the way veterans schedule medical appointments, Council took time to explain why it’s important now — in the middle of several contentious, ongoing major health IT projects, including driving interoperability with the Defense Department — to plan and build a veteran-centric digital health platform for the next 25 years and beyond.
“The EHR [electronic health record] today is really just the heartbeat of the organism, and it does not have everything that is needed to mandate and manage care in the community, to deal with the needs of the female veteran, and also to support the overall veteran experience and the clinical management,” Council said at the hearing Thursday.
“We decided what the right thing to do was to lay out the new digital health platform, because that’s really what’s needed for the future. If [the Veterans Health Administration] is going to provide health care in the future and today, it needs to move into a digital platform, and that’s what we laid out,” she said.
She and VA Undersecretary for Health David Shulkin, after producing a business case last fall, decided to move forward with building a modern and integrated health care system that would incorporate best-in-class technologies and standards to give it the look, feel and capabilities users have come to expect in the private sector. Council plans to deliver to Shulkin later this summer a working prototype, which he will test from a clinical perspective.
“What I can tell you is it is incredibly responsive,” Council testified. “It is aligned with the world-class technology everyone’s seen today and using in things like Facebook and Google and other capabilities. But it also is agile and it leverages what is called a FHIR [Fast Healthcare Interoperability Resources] capability … which means we can bring things in, we can use them, we can change them, we can respond.”
FHIR is an international open-source standard that is taking the health care interoperability world by storm. Created by the nonprofit standards group HL7, the FHIR API and structured data format, which allow for secure and trusted information flow, have been promoted by the Department of Health and Human Services, the Office of the National Coordinator for Health IT and numerous medical providers. Council said she wants the VA, as the largest health care integrator in the U.S., to drive that platform forward.
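Part of FHIR’s appeal is how little ceremony an exchange requires: resources such as Patient are plain REST endpoints returning structured JSON. Below is a minimal sketch assuming a hypothetical FHIR server; the base URL and patient ID are invented, while the resource path and the `application/fhir+json` media type follow the published spec.

```python
import requests

BASE = "https://fhir.example.org"  # hypothetical FHIR server

# Fetch a single Patient resource as FHIR JSON.
resp = requests.get(
    f"{BASE}/Patient/12345",
    headers={"Accept": "application/fhir+json"},
)
resp.raise_for_status()
patient = resp.json()

# The standardized structure means any conforming client can read
# the same fields, which is the interoperability point Council makes.
print(patient["resourceType"])        # "Patient"
print(patient.get("name", [{}])[0])   # first recorded name, if present
```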
Perhaps the biggest change in the new platform would be its all-in-one integration in the cloud, whereas VA’s current EHR platform, VistA (short for the Veterans Health Information Systems and Technology Architecture), is a collection of 130 separate systems.
“We have 365 data centers within the VA, 130 instances of VistA, and 834 custom systems,” Council said. “In addition, they’re spread out … And it was built as it went. It looks like nothing that you’d see in private industry, and fundamentally, what we’re laying out is a digital health platform that will get us there.”
She added that the problem with VistA was “an integration issue … Everything’s separate, and that is the root of the issue.”
“That’s why we need to get to a new digital health platform,” she explained.
Citing VA’s history of building health IT systems only to abandon them after investing millions of tax dollars, lawmakers balked at the idea of VA developing another major platform. They seemed especially concerned that, like the current inflexible VistA system, the new platform would be left behind by future changes in health care technology.
“The frustration we have is we keep hearing about great systems, and then we get halfway into it, and then it didn’t work, so then we gave that project up,” said Rep. Ann McLane Kuster, D-N.H.
Rep. Raul Ruiz, D-Calif., said flexibility was a requirement of any system VA builds or buys — the system used today shouldn’t be the same exact one used 15 years later, he said. “We don’t want it to last 15 years. We want it to change with the needs of the patients and the community.”
The VA senior officials agreed, but Council maintained the need to start out with a unified vision for a standard platform.
“You don’t have to get 100 percent of the solution in place to start,” she said. “You can start with 20, 30, 40 percent and just get better over time. But that requires a standard platform, it requires one instance, one solution, and a process that everybody uses, and that was not how the VA was built.”
The cost of the new system wasn’t discussed during the hearing, but Council said the cost-benefit analysis checked out.
Her team looked at “the overall long-term ability to maintain it, what it would take if we wanted to change it, how could we get it on to an architecture that is more agreeable and agile so it can move and change as health care is changing, and also how would we work with the care in the community,” Council said.
“Fundamentally,” she added, “four things need to be in a health care system: You need to have clinical management, you need to have hospital operations capability, you also need to have the veteran experience core to what we offer, and you have to have predictive analytics. We do not have that today with VistA. So we decided to pull and build the new digital health platform to address it after reviewing that business case last fall.”
With little time left in the current administration, and likely in their respective positions, Shulkin and Council felt this was the right leap to take for the long-term benefit of the VA, a department whose health IT had been plagued by problems long before their arrival.
“We believe that VA is an innovator,” Council said. “We believe that VHA and health care should continue to be one, and we have provided innovative solutions based on industry experts coming back and assessing it as that.”
The Internet of Me
The notion of identity is changing. Once considered fixed, with the self a static entity, “who I am” is increasingly seen as the sum of a constantly shifting set of data that, taken as a whole, makes up who, and how, we are.
Of course, this may always have been true, to some extent. Early humans depicted their hunts and other experiences on cave walls. Journals and letters dating back thousands of years attest to our longstanding need to document the events of our daily lives as well as our hopes, dreams, fears, passions and angsts, and to learn about ourselves in the process.
But the “self-quantification” movement — also called “lifelogging” — uses technology to provide us with a more complete self-portrait than ever before, one that promises to become increasingly insightful as more devices record more details of our daily lives.
Already, wearables are working in sync with our mobile devices to take careful note of where we’ve been, where we are, and where we might be going. Apps on phones and watches keep track of our every step. Social media sites tell us where we were one year ago and what we were doing, and with whom. Other tools record our sleep, work, dietary, health, recreational and personal activities as well as our feelings, thoughts and moods.
These apps generate data not only about us, but for us. How much, and how well, did we sleep last night? How many flights of stairs did we climb? How many glasses of water did we drink? Where did we go, and how did we get there? How much money did we spend today, and what did we buy? Orwellian though it may sound, all this information offers vast potential to improve and even extend our lives.
But one of the paradoxes of knowledge is that the more we know, the more we realize how little we know. Is tracking our activities and interactions enough to truly comprehend how we are living, and to fine-tune accordingly? So many questions remain unanswered, or even unasked.
A new level
Now, though, a new kind of data gathering stands poised to fill in the gaps: sensorization, or the Internet of Things.
Sensors, chips and other forms of digitization already are turning commonplace objects into data collectors. Thermostats know how warm we keep our homes. Cars know how fast we drive, and where we go. Televisions know how many hours we watch, and when, and which shows we like.
Soon, it’s said, our homes, work spaces, and even our clothing will gather data about us, enabling us to examine our lives in minute detail.

Trying to cut down on coffee consumption? You might not remember to enter every cup of coffee you drink into your self-quantification app, but that’s OK: Your coffeemaker may do it for you, telling not only how many cups it brewed for you on a particular day but also how strong the coffee was and when you drank it.
Not as productive as you want to be? Your devices, social media sites, and “consciousness-hacking” tools such as brainwave-measuring headbands may be able to help, working together to show how often you get distracted and what you’re doing, how much and how well you’re sleeping, and more.
When our objects communicate with one another and with us, they become extensions of ourselves, like another set of hands or a second brain that performs the mundane tasks, leaving our minds free to imagine, to create and to dream.
“The unexamined life is not worth living,” Socrates famously said. How, then, might the data our connected things collect and analyze add value and meaning to our lives, today and in the future?
JR Reagan is the global chief information security officer of Deloitte. He also serves as professional faculty at Johns Hopkins, Cornell and Columbia universities. Follow him @IdeaXplorer. Read more from JR Reagan
FBI appoints new CIO from within
Nine months after the last person in the role departed, the FBI has filled its chief information officer position.
The bureau confirmed to FedScoop that Gordon Bitko will assume the role, after former CIO Jerry Pender left last September for a role with Z Capital Partners, a New York-based private equity firm.
Bitko has worked with the FBI for eight years and is currently a RAND scholar. While at RAND, he has published multiple papers on radio frequency identification.
The bureau has been recruiting for the position since January. Bitko comes into the role as the bureau faces daunting technology challenges on several fronts. The G-men are at the center of the encryption debate in which FBI Director James Comey continues to press private technology companies for backdoors into their systems to locate and stop criminals and terrorists.
The FBI has asked for an additional $38.3 million in its fiscal 2017 budget to fund anti-encryption technology and research — more than doubling last year’s $31 million request, to nearly $70 million. The bureau is also asking for $85 million for its overall cybersecurity program.
Federal News Radio was the first to report Bitko’s hiring.
Feds weigh economics, security of cloud computing

Panelists at Security Through Innovation Summit April 14, 2016. (L-R) Robert Klopp, SSA; Joseph Ronzio, Department of Veteran Affairs; Paul Stephenson, VMware; Claudio Belloli, FedRAMP. (FedScoop photo)
Federal IT and cybersecurity officials explored the cost-benefit equations inherent in cloud computing last week, weighing savings, security and flexibility against one another as a major security vendor urged attendees to get past “emotional” assessments of the cloud.
“A few years ago, when cloud first came out, there was a very emotional response to it,” said Brian Dye, corporate vice president of global products at Intel Security. “And the emotional response was: The things I control have better security.”
Dye, who spoke at a press lunch on the sidelines of the 2016 Security Through Innovation Summit sponsored by Intel Security, explained that in reality cloud environments tended to be more secure than corporate networks “because they’re incredibly more standardized.” Heterogeneity, such as multiple kinds of endpoints with individualized configurations, has long been recognized as making enterprises more vulnerable to cyberattacks.
In addition to a lingering fear of the unknown, speakers at a summit cloud adoption panel addressed common misunderstandings about the economics of the cloud, warning that agencies banking on savings from on-premises cloud computing are making a costly bet, and risk missing out on larger savings and agility available from large public providers.
“There is a false premise in the private cloud that you can get the same economic advantage if you take 200 racks of servers and create your own cloud,” said Robert Klopp, CIO and deputy commissioner for systems at the Social Security Administration.
Klopp acknowledged that private clouds still make sense for managing and safeguarding sensitive information. But they will never achieve the increasingly favorable economics of the large public and hybrid cloud service providers, such as Amazon, Microsoft and Google, that maintain hundreds of thousands of racks of servers, he said.
Even the best-managed private clouds typically utilize only 60 percent of their CPU capacity, Klopp said. “Imagine how much CPU utilization is sitting there unused,” he said. “Amazon says, ‘I can go sell that unused CPU utilization to people who just use tiny chunks.’”
But that efficiency, in addition to driving down costs, can create vulnerabilities, warned Steve Grobman, chief technology officer at Intel Security.
“Cloud is a great asset to us, it’s a great asset to our customers, but it’s also a great asset to the attackers,” he said.
Grobman said he was mainly talking about how the large public cloud providers, by allowing small, less security-conscious businesses to have a public-facing internet presence, create opportunities for hackers and online crime gangs to hijack cloud infrastructure and use its computing power for distributed denial-of-service attacks or other nefarious purposes.
Those kinds of attacks are distinct from what he referred to as “the next generation of privilege escalation attacks,” or “a virtual machine escape, where you have one cloud occupant essentially break out of their environment to infect the cloud provider” and potentially gain access to other cloud tenants’ data. Although vulnerabilities have been discovered in some cloud software that might in theory allow such an attack, he told FedScoop afterward that no example of one had ever actually been found.
“I’m a lot less worried about that, at least in the near term,” he told the press lunch.
In the meantime, panelists at the cloud discussion said, several factors are altering the economics of cloud computing.
Advances in application containerization and computing microservices are making it easier to break computing work into smaller payloads that can move in a “frictionless” way between hybrid and private cloud environments, Klopp said.
At the same time, the rise of dynamic pricing systems in the marketplace is making it easier to discover where and when unused computing capacity — and rock-bottom pricing — are available. When big cloud providers want to squeeze extra utilization out of their systems, the price of that spare computing power can fall to virtually zero for savvy users, Klopp explained.
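AWS’s spot market is one concrete instance of the dynamic pricing Klopp describes: spare capacity is sold off at prices that move with utilization. Here is a minimal sketch using boto3, assuming AWS credentials are already configured; the region, instance type and time window are arbitrary choices for illustration.

```python
from datetime import datetime, timedelta

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Pull the last hour of spot prices for one instance type. When a provider
# wants to squeeze extra utilization out of its racks, these prices can
# fall far below on-demand rates for savvy users.
history = ec2.describe_spot_price_history(
    InstanceTypes=["m4.large"],
    ProductDescriptions=["Linux/UNIX"],
    StartTime=datetime.utcnow() - timedelta(hours=1),
    MaxResults=10,
)

for point in history["SpotPriceHistory"]:
    print(point["AvailabilityZone"], point["SpotPrice"], point["Timestamp"])
```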
In spite of the economic advantages public cloud service providers can offer, Paul Stephenson, public sector field CTO at VMware, suggested government tech leaders think of a multi-cloud approach as the most practical way forward.
“We see that you need choice. Some people need to move in and back out,” Stephenson said. “How easy is it to get back [your data and applications] from Amazon right now? It’s not terribly easy.”
Stephenson, who previously worked with the Navy’s Space and Naval Warfare Systems Command, recalled trying to move about 200 applications out of a cloud service. Based on the time it took to move one application, the whole job would have taken five or six years.
While that job is getting easier as cloud computing matures, he urged agencies to create virtual, insulated operating layers in their stacks to more effectively deploy and manage the movement of their applications to and from cloud providers.
Joseph Ronzio, special assistant to the chief health technology officer at the Department of Veterans Affairs, added that in terms of system architecture, clouds are more effective as “the endpoint for millions of devices versus the endpoint of one.”
For doctors at the VA, he said it’s often more important to have data and computing power on a doctor’s device rather than in the cloud — and with iPhones now having the computing power of Cray supercomputers of 20 years ago, that’s both possible and practical.
Ronzio suggested starting out with virtual sandboxes in the cloud and making sure vendors can operate in them before moving to larger scale production.
“In health care, vendors promise their solutions will work,” he said. “I tell them, here’s the cloud, let’s see if it actually works.”
The panelists urged government IT attendees in the audience to get past cultural barriers, security concerns and fear of the unknown that have held up adoption of the cloud, and to begin initiating projects.
“I would dare to say that the cloud today is often as secure as what you’re doing in a data center,” Stephenson said.
“We need to think about security differently,” added Intel Security’s Grobman, comparing the shift to the 1990s when businesses moved from mainframe to client-server architectures. “They needed to think about the way they secured their systems in a fundamentally different way.”
Wyatt Kash contributed to this report.
Federal IT officials weigh agility, security
Agile development can lead to more effective and efficient software delivery, but if security isn’t built into the mix at all stages, the process could leave applications and systems vulnerable, a panel of senior federal IT security officials warned Thursday.
“One of the buzzwords today is continuous integration, or the continuous evolution of the software, where you can actually make a change on the fly,” Rod Turk, chief information security officer for the Commerce Department, said at the Security through Innovation Summit, sponsored by Intel Security and produced by FedScoop.
“From a security perspective, that’s a concern. How do you make sure that that doesn’t change something else substantively down the road, down in the software?”
Rapid and continuous iterative development — what many in the IT space refer to as DevOps, a compound abbreviation for “development and operations” — can leave software vulnerable to bugs and possible intruders if security is not baked in, Turk said.
“Ponder what it is that happens once you make that continuous code change in the cloud to respond to the emerging requirements,” he said.
Despite that risk, Michael Hermus, chief technology officer of the Department of Homeland Security and a former software developer, said that shouldn’t be something to scare federal IT teams away from continuous development.
As DHS CTO, Hermus’ foremost focus is “how can we most effectively get products out, get capabilities out?” he said. “And security is now something that we have to make sure is taken into consideration.”
The key to improved security is creating a shift in culture. “Not that long ago, it was like, ‘Ah, I’ve got to deal with the security guys. Those guys are going to come in and make trouble for me,’” Hermus said.
“Now I think we all recognize that it is absolutely critical that that is factored in as early as possible. The way we look at DevOps and continuous integration is it’s an opportunity to do that — an opportunity to take current processes that are in many cases manual, that are sometimes document driven and compliance driven, and actually build security into the process, into the continuous integration development process.”
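In the small, building security into the continuous integration process often looks like a pipeline step that runs a static analyzer and fails the build on findings, turning the manual, document-driven review Hermus mentions into an automated gate. Here is a hedged sketch using bandit, an open-source static analyzer for Python code; the source path and severity threshold are illustrative choices, not any agency’s actual pipeline.

```python
import subprocess
import sys

# Run bandit recursively over the (hypothetical) src/ tree. The -ll flag
# limits the report to medium-severity findings and above, and bandit
# exits nonzero when it finds issues at that level.
result = subprocess.run(["bandit", "-r", "src/", "-ll"])

if result.returncode != 0:
    print("security scan failed; blocking the build")
    sys.exit(result.returncode)

print("security scan clean; continuing pipeline")
```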
Security is so important now, Hermus said, that it deserves a spot alongside development and operations in the DevOps name. Should it be called “DevOpsSec” or “SecOpsDev”? he joked.
“There are lots of advantages that modern technology gives us to actually address security,” Hermus said. “The key is you have to actually consider security at every stage of the life cycle.”
DHS’ Schneck: Einstein more a platform than a tool
The Homeland Security Department’s top cybersecurity official views the system used to block attacks on federal civilian government networks as something the private sector can’t match — despite its being built on 25-year-old technology.
Phyllis Schneck, deputy undersecretary for cybersecurity and communications for DHS, said the Einstein system should be viewed more as a platform than a single tool. Schneck, the most senior department official with solely cybersecurity responsibilities, said the department was working to build on top of it with the best technology the private sector has to offer.
Schneck, delivering the closing keynote at the Security through Innovation Summit, sponsored by Intel Security, said Einstein has given the department broad situational awareness it uses to protect agencies — while the National Protection and Programs Directorate works to improve the system’s intrusion detection and prevention capabilities.
“We are working very closely with our colleagues in the intelligence community to look at what additional information they can give us,” Schneck said, “what can they give us faster and how we can be better at customer service.”
The effectiveness of the program, officially known as the National Cybersecurity Protection System, has been heavily debated in the past few months. Homeland Security Secretary Jeh Johnson defended Einstein in February after a GAO audit found the system fell well short of its objectives.
Aside from Einstein, Schneck also highlighted the recently launched Automated Indicator Sharing, or AIS, program, which allows private companies to share threat data with DHS; the department then pushes that data out to its information sharing partners at machine speed.
“Now we are getting from a vaccine-based system, which comes from signatures and intrusion detection, to pushing antibodies through the internet and creating an immune system,” she said.
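For context, AIS exchanges indicators in the STIX data format over the TAXII transport protocol. The sketch below is a deliberately simplified, hypothetical stand-in (plain JSON posted to an invented endpoint) meant only to show the machine-speed, no-human-in-the-loop shape of the exchange Schneck describes; it is not the AIS wire format.

```python
import requests

# Hypothetical sharing endpoint; the real AIS program speaks STIX over TAXII.
ENDPOINT = "https://sharing.example.gov/indicators"

indicator = {
    "type": "ip-watch",                  # invented field names throughout
    "value": "203.0.113.7",              # documentation-reserved IP range
    "observed": "2016-04-14T12:00:00Z",
    "confidence": "medium",
}

# One POST per observed indicator: the "antibody" is pushed out to partners
# as soon as it is seen, with no human in the loop.
resp = requests.post(ENDPOINT, json=indicator, timeout=5)
resp.raise_for_status()
```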
Schneck noted that these efforts were just the operational piece of the department’s cybersecurity portfolio — with much more coming from investigators at Immigration and Customs Enforcement and the U.S. Secret Service and even from the department’s CIO office.
“If you think about dhs.gov, that’s the world’s best petri dish,” she said. “The things that they see help all of us get smarter.”
Overall, Schneck’s message to the private sector was one of trust: getting companies to buy into the department’s new information sharing programs while also asking them to help DHS build new tools on top of Einstein.
It echoed what Schneck told FedScoop earlier this year: that “trust is awarded. The way you earn trust is to demonstrate your capability to be trusted.”
“Trust is our new currency,” she said.
NSA, other feds using innovation to improve security
One of the biggest problems federal cybersecurity officials face is moving fast enough to keep up with hackers — adversaries famed for their agility and flexibility.
At the National Security Agency, they’re trying an Innovation Corps approach: More than a dozen teams of five to 10 people are bootstrapping new capabilities or products over a six-week period, then pitching the results to agency leaders — who choose a handful to take further.
NSA Information Assurance Director Curtis Dukes on Thursday compared the agency’s “I-Corps” to the popular television show Shark Tank, in which entrepreneurs pitch business ideas to skeptical venture capitalists for seed funding. Dukes said the corps has created capabilities now used inside the agency’s nuclear command and control missions.
Dukes was one of a number of top federal government information security officers who outlined how they are driving new ways of thinking, novel techniques for cyber defense and better ways of filling the talent gaps in their ranks at the Security Through Innovation Summit, sponsored by Intel Security and produced by FedScoop.
Not all the innovations highlighted were from the bleeding edge of emerging tech. Marianne Bailey, principal director and deputy CIO for the Defense Department, said one of the biggest drivers of better cybersecurity has been a focus on getting the basics right, citing the department’s new cybersecurity scorecard. The initiative, first issued in October, leans on stronger authentication practices for employees and on reducing attack surfaces both internally and externally.
“If you look at all the intrusions over the last decade, you will find the overwhelming majority of those are due to poorly implemented cyber basics,” Bailey said.
Cyber practitioners say basic cyber hygiene eliminates many of the commodity attacks that would otherwise suck up the time of security responders and specialists.
“It may seem like a simple thing, but it’s actually been pretty amazing to see the impact it has had,” Bailey concluded.
Yet even as these agencies are driving new ways of thinking, they are struggling with recruiting and retaining the talent needed to keep up with malicious threats.
Sherrill Nicely, chief information security officer for the CIA, said that while the agency recruits constantly, new technologies mean even seasoned cybersecurity professionals must keep acquiring new skills. She described the CIA’s efforts as a “combination of training and support,” including purchasing vouchers for classes through an outside training company.
When staffing up a cybersecurity organization, added Rod Turk, CISO of the Commerce Department, in addition to technical specialists, “I also need people who understand budgets, people who know how to communicate, people who can write.”
“At the end of the day, I want people working for me to be able to put that [IT proposal] in a business case … to explain that in layman’s terms to financial people, to senior executives. And if I’m unable to do that because I’m too far in the weeds in the technology, the tendency is you don’t get the money,” Turk said.
Emery Csulak, CISO for the Centers for Medicare &amp; Medicaid Services, said that while he is always searching for security talent, he also needs people who can bridge the knowledge gap between various offices.
“What I do have is a shortage of people who can bring pieces together. That’s what I put a lot more energy into,” he said. “You’ve left the burden of integrating these various acquisitions in these stovepipes. What we are doing is focusing on operations and security together and saying, ‘Let’s get out of the hands of programs and bring the right people together,’ so you don’t worry about the architecture and infrastructure; you are focused on turning [technology] into a tangible product.”
Whether it’s inside or outside an agency, officials are open to anything that helps them keep up with a rapidly growing problem.
“I don’t care what rank you are, if you are civilian or military, if you have a good idea, bring it forward,” Dukes said.
Former NSA chief, MasterCard CEO among those named to cybersecurity commission
The former head of the National Security Agency, the CEO of MasterCard and the chief security officer of ride-hailing giant Uber are among those named to President Barack Obama’s Commission on Enhancing National Cybersecurity.
The 10 appointments fill out the membership of the group, which meets for the first time Thursday. The commission is tasked with investigating ways the nation can strengthen cybersecurity in both the private and public sectors, researching topics such as new technology, best practices for identity assurance, federal IT governance, threat information sharing and education.
The commission was created as part of Obama’s $19 billion Cybersecurity National Action Plan earlier this year.
Gen. Keith Alexander’s long career in military intelligence culminated in his tenure as director of the National Security Agency from 2005 to 2014; he was also the first commander of U.S. Cyber Command, from 2010 to 2014. Alexander now serves as chairman and CEO of IronNet, a firm he founded after leaving government.
MasterCard CEO Ajay Banga has led the credit card company since 2010. He took part in the White House’s cybersecurity summit at Stanford University in 2015, where he praised the NIST cybersecurity framework for raising the level of security discussion in boardrooms across the globe.
Joe Sullivan joined Uber as chief security officer earlier this month after a five-year stint at Facebook. He spent eight years at the Department of Justice, working in the Computer Hacking and IP Unit of the Northern District of California.
Steven Chabinsky is general counsel and chief risk officer for cybersecurity firm CrowdStrike, which he joined in 2012 after a 17-year career at the FBI, where he held positions including deputy assistant director for cyber.
In February, Obama named his former national security adviser, Tom Donilon, to chair the commission. Former IBM CEO Sam Palmisano will serve as the vice chair, while former Clinton-era White House official Kiersten Todt will serve as executive director.
The commission will meet Thursday at the Commerce Department building in Washington, D.C. A report featuring security recommendations is due to the president by Dec. 1 and will be published within 45 days after that.
The full list of members is:
- Keith Alexander – former Director of NSA, Chairman and CEO of IronNet
- Annie Anton – Professor and Chair of the School of Interactive Computing at the Georgia Institute of Technology
- Ajay Banga – CEO of MasterCard
- Steven Chabinsky – General Counsel and Chief Risk Officer for CrowdStrike
- Patrick Gallagher – Chancellor and CEO of the University of Pittsburgh, former director of NIST
- Peter Lee – Corporate Vice President of Microsoft Research, former vice chair of DARPA’s Information Science and Technology study group
- Herbert Lin – Senior Research Scholar for Cyber Policy and Security at Stanford University’s Center for International Security and Cooperation
- Heather Murren – private investor, member of Johns Hopkins University Board of Trustees
- Joe Sullivan – Chief Security Officer at Uber
- Maggie Wilderotter – Former CEO of Frontier Communications