GSA solar contract estimates $5M savings

The General Services Administration says it will save $5 million by installing solar power systems on the rooftops of 18 federal buildings next year using a form of private-sector partnership known as a power purchase agreement.

GSA awarded a contract this week to Washington, D.C.-based public utility company WGL Holdings Inc. to construct the solar panels on buildings in the capital that house “federal and quasi federal agencies,” according to an agency announcement by Ron Allard, energy branch chief for GSA’s National Capital Region. At a price of less than $0.04 per kilowatt-hour, a significant reduction from the traditional grid power price of $0.11 per kWh, GSA estimates it will save participating agencies more than $5 million over the term of the contract.

Construction will begin on the systems by spring 2016, with expectations that they’ll be operational by the year’s end, producing 3.5 million kWh of electricity per year and reducing carbon dioxide emissions by more than 2,400 metric tons, according to Allard’s post.
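The announcement’s figures can be sanity-checked with a bit of arithmetic. A rough sketch follows; the per-kWh rates and annual output come from the announcement, but the contract term is not stated, so the last line merely infers what term a $5 million total would imply:

```python
# Rough check of GSA's savings estimate, using figures from the announcement.
solar_rate = 0.04              # $/kWh under the power purchase agreement ("less than")
grid_rate = 0.11               # $/kWh for traditional grid power
annual_output_kwh = 3_500_000  # expected yearly production across the 18 buildings

annual_savings = (grid_rate - solar_rate) * annual_output_kwh
print(f"Annual savings: ${annual_savings:,.0f}")           # $245,000

# The announcement does not state the contract term; a $5M total implies roughly:
implied_term = 5_000_000 / annual_savings
print(f"Implied contract term: {implied_term:.1f} years")  # ~20 years
```

Since the solar rate is quoted as “less than” $0.04, actual annual savings would be somewhat higher and the implied term somewhat shorter.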

Such power purchase agreements, the announcement says, can be a “win-win-win” for the federal government.

“Not only are they completely financed by the private sector, they also generate two sources of revenue for the developers: a fee for electric service and payments for Solar Renewable Energy Certificates (SRECs) from public utilities,” Allard says.

This project was spurred by President Barack Obama’s 2014 Capital Solar Challenge, which urged GSA and the Energy Department to assist federal agencies and military services in the nation’s capital to identify opportunities to deploy renewable energy sources.

Reach the reporter on this story at billy.mitchell@fedscoop.com. Follow him on Twitter @BillyMitchell89.

NSA chief promises biggest shake-up in 20 years

The National Security Agency, the sprawling surveillance enterprise that epitomizes both the promise and the threat of cutting-edge technology in the hands of the government, will next month begin its largest reorganization in 20 years, aimed at making it more innovative and agile, Director Adm. Michael Rogers said.

“In January, you’ll see us rolling out something we’re gonna call NSA21, which is NSA trying to position ourselves for the 21st century, which’ll be a pretty comprehensive set of changes we’re going to make,” he told an audience of national security and intelligence contractors Tuesday evening. The changes were in four areas, he said at the Intelligence and National Security Alliance annual dinner: “How we develop our workforce, how we develop collaboration and integration, how we innovate, [and] what’s the organizational structure we need to inculcate those things.”

He said the overhaul would be “among the most comprehensive set of changes NSA has undergone probably since the late 1990s.”

He declined to give further details, citing a promise he made to the workforce that they would be the first to know about the changes, but he said the planning for the reorganization had taken “literally 10 months” and had been driven by a newly straitened fiscal climate. “We’re going to reallocate resources internally for what I think are the problem sets of today and tomorrow,” he said, adding that involved “tradeoffs … what you lose by pulling it out versus what you gain by investing it somewhere else.”

The work had exposed some gaps at the agency, he said, including that it lacked a “formal and repeatable process for assessing risk.”

He had asked senior officials to address 12 questions in the four areas, he added, so the agency could “position ourselves for success, not just today, but five or 10 years from now.”

VA, DOD show Congress progress in e-health records interoperability

Weeks after a tongue-lashing by a pair of House subcommittees, the officials working on electronic health record interoperability between the departments of Defense and Veterans Affairs say they’ve finally opened lawmakers’ eyes to the progress they’ve been making.

Last week, IT staff from the VA and DOD, as well as from the departments’ joint Interoperability Program Office, visited members of Congress to showcase the capabilities of the Joint Legacy Viewer, an interoperable EHR platform that lets the DOD send soldiers’ records to the VA as they move on to veteran status, according to Elaine Hunolt, co-director of the Interoperability Office for the Veterans Health Administration.

In earlier hearings, the two departments described to Congress the progress they’d made on JLV, and lawmakers mostly met the explanations with ridicule. But the lawmakers never really grasped the system the agencies’ officials were describing, Hunolt said.

“They all admitted that they hadn’t actually seen the actual successful exchanges we have going on,” she explained Tuesday at the 2015 AFCEA Health IT Day.

It’s the job of officials to make the systems behind what are essentially the nation’s two largest health care providers interoperable — a requirement they failed to meet by the Oct. 1, 2015, deadline established in the National Defense Authorization Act for 2014. But it’s crucial that they also “try to continue to educate members of Congress on what we’ve actually achieved and what these capabilities are,” she said.

“We think even if one person sees that, one individual acknowledges it, that’s a step in the right direction, ’cause we have done a lot of work,” Hunolt said.

Lauren Thompson, director of the Interoperability Program Office, said “we spend a lot of time and effort trying to educate our congressional colleagues on what exists.”

“When they see the capabilities, the light goes on for them,” Thompson said. “We’re trying to really help them understand what it means to be clinicians in that environment and have access to all of that data.

“[JLV] really does enhance the patient experience,” she said.

Despite the recent progress the VA and DOD have made convincing Congress that the JLV is an in-progress, working solution — and even though Frank Kendall, undersecretary of defense for acquisition, technology and logistics, confirmed last month that the two departments have met the requirements of the 2014 NDAA — the battle is anything but finished.

Both behemoth departments are moving to modernized electronic health record-keeping systems — the Defense Healthcare Management Systems Modernization and the Veterans Health Information Systems and Technology Architecture Evolution — that will also have to be interoperable. That challenge, which drew the bulk of the concern from lawmakers during the hearing earlier this fall, must still be overcome.

“Two separate modernizations are a mistake,” Rep. Ted Lieu, D-Calif., said in October.

“We’ve had a lot of concerns about the starts and stops of this,” said Valerie Melvin, director of the Government Accountability Office’s Office of Information Management and Technology Resources Issues, which developed a report in August criticizing the departments’ inability to set strong goals for achieving interoperability. At the same October hearing, she told lawmakers, “It has been a history of the two departments going down particular paths that they wanted to pursue for this, changing at certain points, and there has not in our view been the accountability for them doing it.”

Tech thinkers: U.S. needs national IoT strategy

The U.S. public sector must set a national strategy for developing the Internet of Things if the technology is to reach its fullest potential, a think tank said Wednesday.

The Information Technology and Innovation Foundation’s Center for Data Innovation in a new report urges policymakers around the world, and particularly those in the United States, to craft national strategies for the emerging IoT landscape. Likening the moment to the public sector’s involvement in the development of the Internet, CDI says IoT technologies will not have as transformative an impact if they are left to private industry to cultivate.

“Our report makes the case that no country will successfully capture the benefits of the Internet of Things by leaving its development solely up to the market,” said Joshua New, policy analyst for CDI and the report’s co-author. He spoke during a panel discussion about the report Wednesday on Capitol Hill. “Just as no country can capture the benefits of the Internet of Things without a robust private sector unencumbered by restrictive regulations that is free to innovate.”

While private sector innovation in the U.S. has led to many of the apps and devices — like FitBits and Internet-connected refrigerators — that drive the prevalence of IoT, there are higher-order benefits that the market alone cannot capture, New argued. For instance, smart thermostats can help energy consumers save on their power bills, but if the federal government and the Energy Department build a strategy around the use of the devices, it could produce larger-scale effects, like less stress on the power grid and fewer emissions.

“That is a value the market cannot capture,” he said.

ITIF President and CEO Rob Atkinson has no doubt that the Internet of Things will play a critical role in federal agencies’ operations in the future, and they must coordinate or risk missing out on the hyper-connected benefits of IoT.

“IoT is different…because it is a technology that not just the private sector is going to adopt, but the government is going to adopt,” Atkinson said. “Governments of all levels are going to be involved in it. IoT is something that each agency is going to have to do, and without a strategy, agencies may sub-optimize and think about only what they’re doing.”

Rep. Darrell Issa, R-Calif., who co-founded the Internet of Things Caucus with Rep. Suzan DelBene, D-Wash., in early 2015 because “there’s so little information in Congress in this area,” applauded CDI’s report.

“We have to find a way to create strong, safe and reliable connectivity,” Issa said in his opening remarks, speaking broadly of his caucus’ efforts to educate Congress and drive toward a national strategy on IoT. “And if we do, then there’s an almost unlimited potential for efficiencies and a better life for our families.”

Industry representatives on the panel were unanimous in their support of a national strategy for IoT.

“There are bits and pieces being done across the federal government, which is great, but there really is a need to coordinate all of these activities under one national plan,” said Steve Crout, vice president of government affairs for Qualcomm Inc.

Jeff Brueggeman, vice president of global public policy at AT&T, said “the time is right to do this.”

“Particularly on some of the large projects like transportation, public safety and smart cities, you really do need that network effect to pull together,” Brueggeman added. “In the absence of a strategy, we run the risk of a patchwork quilt” of regulation and standardization.

At this point, New said, no federal agencies have really taken formal steps to involve IoT in their long-term planning. Some agencies are adopting it with a piecemeal approach, he added, and the U.S. Postal Service was the only government body “that has formally analyzed how the Internet of Things can improve their operations.” “But that is simply not good enough,” New said.

Contact the reporter on this story at Billy.Mitchell@fedscoop.com. Follow him on Twitter @BillyMitchell89.

Massive tax and spending bill passes Congress with cyber, other riders

This story was updated Friday to reflect the passage of the bill.

The omnibus spending bill and tax package passed by the House and Senate Friday and heading to the president’s desk will include a new version of the Cybersecurity Information Sharing Act, plus a host of other policy riders and legislation.

Lawmakers reached a deal late Tuesday night on a 2,000-plus-page, $1.1 trillion bill that will fund the government through the end of the current fiscal year on Sept. 30, 2016. Alongside it is a $680 billion tax package.

The massive bill is filled with a litany of policy riders, which dictate new policies on matters from oil exports to alternative-energy credits, as well as laws related to health programs for 9/11 responders and meat labeling, among others.

Tucked into the legislation is a version of CISA that was hammered out by a small group of lawmakers from three separate cybersecurity information sharing bills that passed the House and Senate earlier this year.

The final bill devotes 135 pages to CISA. Just before its release early Wednesday morning, a bipartisan group of four lawmakers circulated a letter complaining that the bill’s provisions were being finalized behind closed doors.

“Legislation encouraging cybersecurity information sharing between industry and government is complicated and will have hugely negative ramifications on user privacy if done improperly,” reads a letter from Reps. Justin Amash, R-Mich., Zoe Lofgren, D-Calif., Ted Poe, R-Texas, and Jared Polis, D-Colo.

“Reports indicate a new bill is being negotiated by just a handful of members for inclusion in the omnibus…We cannot cast such a consequential vote with no input.”

Among the provisions in the bill is complete liability protection for companies that share threat indicators, even if they fail to scrub personally identifiable information before turning the indicators over to the government.

DHS is also allowed to share indicators with other government agencies, including the FBI and National Security Agency, provided that PII is scrubbed from that information. The bill additionally grants the president the ability to create data portals at other agencies if the DHS portal is found to be flawed.

Privacy groups came out against the provisions in concert, with groups like the ACLU, Electronic Privacy Information Center, Electronic Frontier Foundation and Access Now blasting the bill.

Robyn Greene, Policy Counsel at New America’s Open Technology Institute, said the strong-arm actions behind closed doors led to “a race to the bottom on privacy and operational effectiveness.”

“On several fronts, this bill is significantly worse than the two House-passed bills,” Greene said in a statement. “Representatives should demand that it be stripped from the omnibus so that they can debate it and vote on the record, to reject this deeply flawed bill.”

The bill also codifies a number of efforts taken up by both DHS and the White House’s Office of Management and Budget to protect federal IT systems. Similar to directives put forth in OMB’s Cybersecurity Implementation Plan, the bill calls for agencies to identify mission critical data, encrypt it while in transit and at rest, and assess the access controls related to that data.

The bill also asks OMB to prepare a report on the Einstein intrusion detection system, detailing which agencies are using the tool and how many intrusions the system has detected and turned away.

There is also a significant portion of the bill dedicated to improving the cyber workforce within the federal government. The Director of the Office of Personnel Management, the Secretary of Homeland Security, the Director of NIST and the Director of National Intelligence are tasked with implementing the National Initiative for Cybersecurity Education, created under the Cybersecurity Enhancement Act of 2014, which will help the federal government streamline the hiring of cybersecurity professionals.

Elsewhere, the bill extends the identity theft and fraud protection OPM is offering to victims of its data breaches from three years to a full decade, covering up to $5 million in damages. The National Treasury Employees Union, which represents federal employees, called the measure “a significant improvement” over the current offering of three years of coverage up to $1 million in damages.

To contact the reporter on this story, email him at greg.otto@fedscoop.com or follow him on Twitter at @gregotto.

5 cybersecurity trends to watch for in 2016, part 1

This commentary is the first of a three-part series featuring what cybersecurity thought leaders expect to see in the coming year.

This past year marked a strategic shift away from a maniacal focus on prevention toward greater balance across monitoring, detection and response capabilities. Indeed, it became a cliché to say that breaches are inevitable and that faster detection and more accurate incident scoping are the way forward.

2015 also saw continued acceleration of threat evolution. What was considered an “advanced” threat in years past has become a commodity today, with sophisticated malware and exploits available for the price of a movie ticket.

As troublesome as these observations seem, the most impactful kind of evolution has gone almost entirely unreported and misunderstood: today’s pervasive threat actors are now conducting attack campaigns composed of multiple exploit methods and multiple backdoors to assure persistence. Incomplete incident scoping has become a critical mistake that security teams make repeatedly.

This year was also notably characterized by security vendors claiming to be able to prevent advanced threat breaches when the reality is, they can’t. It was characterized by organizations recognizing the need to monitor and defend their digital environments differently, but continuing to center their security programs on the same technologies and approaches they have been using — not acting differently, but hoping for a different outcome, nonetheless.

Here are some of the emerging trends that our industry and organizations need to be ready for in 2016:

Strategic data manipulation and disruption

Organizations will begin to realize that not only is their data being accessed inappropriately, but that it is being tampered with. Data drives decision-making for people and computer systems. When that data is unknowingly manipulated, those decisions will be made based on false data. Consider the potentially devastating consequences of misrepresented data on the mixing of compounds, control systems and manufacturing processes.

[Read more: 5 cybersecurity trends to watch for in 2016, part 2]

Increasing attacks on application service providers

As organizations become more comfortable with the “as a service” model, more of their most sensitive applications and data reside in the cloud. The aggregation of this valuable data from many companies is creating an incredibly lucrative target for cybercriminals and cyber espionage. A deeper appreciation of third-party risk is needed.

Hacktivism and the attack surface

As cyberattack tools and services become increasingly commoditized, the cost of attacking an organization is dropping dramatically, enabling more attacks that do not have financial gain as the primary focus. Sophisticated hacktivist collectives like Anonymous have been joined by relatively unsophisticated groups, or even individual cyber vigilantes. Organizations need to realize that financial gain is no longer the only, nor perhaps even the biggest, driver of their adversaries. Security operations and risk managers should evolve their understanding not only of the threat, but also of what, why, where and how they are being targeted.

Industrial control systems pushed to the breaking point

Intrusions into systems that control operations in the chemical, electrical, water and transport sectors have increased 17-fold over the last three years. The advent of connected and automated sensors aggressively exacerbates these issues. The growth in the use of cyber technology by terrorists, hacktivists and other actors, combined with generally weak ICS security and the potential impact of bringing down a power facility or water treatment plant (hello, California), makes a critical breach of an ICS in 2016 extremely concerning and increasingly likely.

Shake-out of the security industry

Our industry has been awash in venture capital and as a result, foolish investments have been made in strategies and technologies that are little more than snake oil. As organizations’ security programs continue to mature, they are learning that claims of being able to prevent advanced threat breaches are nothing more than fantasy. Expect to see a shake-out in the security industry as organizations’ maturing understanding of advanced threats increasingly drives their investment decisions in this area.

Amit Yoran is the president of RSA. Follow him on Twitter at @ayoran and the company at @RSASecurity.

OMB hiring analyst to run PortfolioStat meetings

The White House’s Office of Management and Budget is staffing up as it gets ready to follow through on its IT management directives.

OMB is looking for a policy analyst to work in the office of Federal Chief Information Officer Tony Scott, poring over federal agency IT budgets and crafting future policy that will shape how the government uses its IT assets.

The analyst, who will serve for a term of at least two years, will be part of an oversight team that reviews budget requests from several agencies and will be tasked with leading PortfolioStat, TechStat and FedStat sessions with agency personnel.

Sessions related to PortfolioStat and TechStat are a key portion of the Federal IT Acquisition Reform Act. Part of the reason the Government Accountability Office has considered federal IT “high risk” since February is the lack of TechStat meetings being held: only one TechStat session — a face-to-face meeting held to terminate or turn around a troubled agency IT investment — took place between March 2013 and October 2015.

The candidate must have managed one or more “major” IT investments and have experience operating within the Capital Planning and Investment Control process at one or more agencies. The job listing suggests the ideal candidate will “be a seasoned Project/Program Manager with both a Federal Acquisition Certification for Program and Project Management (FAC-P/PM) as well as certified as a Project Management Professional (PMP) from the Project Management Institute,” though those are not requirements.

Candidates must be at a minimum level of GS-11, or hold either a Ph.D. or a Master of Laws (LL.M.) degree.

The salary ranges from $63,722 to $158,700 per year.

Applicants have until December 22 to apply.

To contact the reporter on this story, email him at greg.otto@fedscoop.com or follow him on Twitter at @gregotto.

FDA unveils open beta of precisionFDA

Food and Drug Administration officials Tuesday launched the open beta version of a Web portal that they hope will eventually help make possible personalized treatments based on a patient’s genes, environment and lifestyle.

Called precisionFDA, the portal provides a space where researchers can collaborate to figure out best practices for compiling massive amounts of data from a patient’s DNA, a process known as next-generation sequencing. The agency launched the portal in “closed beta” last month.

“Through such collaboration we hope to improve the quality and accuracy of genomic tests – work that will ultimately benefit patients,” Taha Kass-Hout, FDA’s chief health informatics officer, and Elaine Johanson, precisionFDA project manager, wrote in a blog post announcing the launch.

According to FDA spokeswoman Jennifer Dooren, precisionFDA users will have access to previously sequenced genomes — such as Genome in a Bottle, developed by the National Institute of Standards and Technology — against which they can compare other samples. She also said the platform will offer genome-sequencing bioinformatics tools and other software apps.

“Users will also be able to compare their results to previously validated reference results as well as share their results with each other, track changes and obtain feedback,” she said.

PrecisionFDA is part of a larger push from the White House to promote its Precision Medicine Initiative.

EPA defiant on social media ‘propaganda’ ruling

The Environmental Protection Agency is pushing back hard against an opinion from government auditors that found it broke the law in a social media campaign fostering support for a controversial new rule on American waterways.

In a blog post Thursday, the agency’s Director of Public Affairs Liz Purchia accused a “small but vocal group” of critics of stoking the row in an effort to “distract from and derail” EPA’s work on President Obama’s climate change policy and in particular the recent Paris deal on limiting carbon emissions.

Earlier this week, auditors from the Government Accountability Office issued an opinion on the EPA’s public awareness campaign about its Waters of the U.S. rule, in response to complaints over the summer from conservative lawmakers, headed by Sen. James Inhofe, R-Okla.

Even without the sharp political edge it gets from GOP attacks on climate science, the dispute highlights how new social media tools threaten to outrun the rules developed in an earlier era, creating a potential legal minefield for agencies as they seek to engage the public during the bitterly fought run-up to next year’s presidential election.

The GAO found most aspects of the EPA’s campaign were lawful but came down on the agency for engaging in “covert propaganda” because it failed to identify itself as the author of an automated social media message launched by an application called Thunderclap.

Thunderclap allows an organization to recruit social media supporters for a campaign, and once their numbers reach a certain threshold, Thunderclap sends out a single message from all supporters’ Facebook, Twitter and Tumblr accounts. But that message did not identify EPA as its author. “While EPA’s role was transparent to supporters who joined the campaign, this does not constitute disclosure to the 1.8 million people potentially reached by the Thunderclap,” the GAO said.
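The attribution gap GAO flagged is easiest to see in miniature. Here is a minimal sketch of that threshold-then-blast model, using hypothetical class and method names rather than Thunderclap’s actual API:

```python
# Minimal sketch of a Thunderclap-style campaign (hypothetical names, not
# Thunderclap's real API): supporters pledge their accounts, and one identical
# message is sent from every account only once the support goal is reached.
class Campaign:
    def __init__(self, message, goal):
        self.message = message
        self.goal = goal
        self.supporters = []

    def pledge(self, account):
        self.supporters.append(account)

    def fire(self):
        """Blast the message from every supporter's account if the goal is met."""
        if len(self.supporters) < self.goal:
            return []
        # Each post goes out under the supporter's own name; nothing in the
        # message identifies the campaign's original author -- the attribution
        # gap at the heart of GAO's "covert propaganda" finding.
        return [(account, self.message) for account in self.supporters]

camp = Campaign("I choose clean water! #CleanWaterRules", goal=3)
for user in ["@alice", "@bob", "@carol"]:
    camp.pledge(user)
print(camp.fire())  # three identical posts, none naming the agency
```

The supporters who pledged saw EPA’s role; the 1.8 million people who merely received the posts did not, which is the distinction GAO drew.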

Auditors also hammered the agency for “grassroots lobbying” by hyperlinking one of its blog postings to two advocacy group webpages which displayed or linked directly to messages urging visitors to lobby Congress in support of clean water rules.

“When EPA hyperlinked to the NRDC and Surfrider Foundation webpages using an official communication channel belonging to EPA and visually encouraged its readers to visit these external websites, EPA associated itself with the messages conveyed by these self-described action groups,” GAO found.

“It is this association combined with the clear appeals actually contained in the webpages that form the prohibited conduct.”

But EPA’s Purchia insists the agency was within the law: “At no point did the EPA encourage the public to contact Congress or any state legislature about the Clean Water Rule. Plain and simple. The rule is an agency action, promulgated by EPA. It’s not even about congressional legislation.”

[Read more: EPA social media post was ‘covert propaganda’ — GAO]

“It’s almost 2016. One of the most effective ways to share information is via the Internet and social media. Though backward-thinkers might prefer it, we won’t operate as if we live in the Stone Age,” she declared.

“We will continue to work with GAO and members of Congress to explain what this is and isn’t,” she concluded.

But it is indeed almost 2016, and this is Washington, D.C. And given the increasingly stormy political weather expected here in the runup to next year’s presidential election, the GAO decision could make social media managers at other agencies a little gun-shy, experts including a former official told FedScoop.

‘A highly charged political environment’

“It’s a highly charged political environment in which any potential infraction is magnified,” said Beverly Macy, an instructor at the UCLA Anderson School of Management who specializes in social media marketing.

The repercussions of the GAO opinion could ripple through the executive branch, said Nathaniel Lubin, a former White House digital strategy director.

“I think any time there’s negative feedback, that could impact people’s decisions to use something. They could be worried their managers are more hesitant to give people who want to push the envelope the space to try new things.”

But he added: “If they’re not willing to push the envelope, they’re not going to be effective. Just because there’s the negative feedback in this particular case doesn’t mean that not taking advantage of digital challenges is an option.”

Lubin also suggested that the latest social media tools might have outrun the rules written for an earlier generation of technology.

“Regardless of the decision in this particular case, these platforms are dynamic and beyond [the technology that] was considered when these initial rules were thought about — and [they] probably need to be updated,” he said.

Tools and training

Uncertainty about exactly how old rules and principles should apply to new technology was famously the reason so many officials were hesitant about adopting social media in the first place. Ines Mergel, an associate professor of public administration at the Maxwell School at Syracuse University who has written about social media in the Obama administration, said she was surprised to learn how EPA used Thunderclap for its campaign, because in her view a regulatory agency should not be experimenting with lobbying and campaigning social media tools.

Mergel said the General Services Administration negotiates terms of use for social media platforms across the U.S. government — and then offers training webinars for agency staff.

GSA’s spokeswoman did not respond when asked whether the agency’s webinar covered the issue that put EPA in hot water. But a YouTube video of the nearly 50-minute talk, which was run by Thunderclap staff, does not mention the need to identify virally proliferating social media messages as products of a U.S. agency, to avoid falling afoul of rules designed to prevent deceptive media campaigns that mask the government’s hand.

The EPA also held its own webinar on best practices, but that also didn’t deal with the issue of identifying messages as coming originally from the agency.

“The use part is the interesting piece here — and what you do with the tool,” Mergel said. That’s not to say that the government shouldn’t be using Thunderclap. Even so, she said other agencies may be scared away from using Thunderclap in the future or bring in a legal team to help with their strategy.

“While we’re not familiar with the specific legal details of this particular case, we are approved for use by federal agencies, and we’ve worked with numerous government agencies in the past,” Chelsea Orcutt, head of strategy and outreach at Thunderclap, said in an email.

[Read more: Trying to archive that tweet? One startup has a way]

And even the GAO auditors said the EPA’s hashtag campaigns #DitchtheMyth and #CleanWaterRules were lawful, noted Rob Enderle, an Oregon-based technology analyst. As that shows, federal agencies can advocate for their policies but social media managers there need to be cognizant of additional legal restrictions, he said.

“I think they just need to be aware of the laws that restrict them more than if they were a public company or an individual,” he said. “Like anything else a public agency does, there are rules.”

For the most part, he said, auditors found the EPA had followed those rules and run a legal campaign. The violations, he added, were more likely an oversight on EPA’s part than part of a “despicable plot to promote clean water.”

“I think, generally, the EPA did it right,” he said. “Just in this one instance, they did not.”

Reach the reporter at whitney.wyckoff@fedscoop.com, follow her on Twitter @WhitneyWyckoff. For stories like this every day, subscribe to the Daily Scoop at fdscp.com/sign-me-on.

Editor’s note: This story has been corrected to fix a mischaracterization of one interviewee’s view of the EPA’s use of Thunderclap.

5 cybersecurity trends to watch for in 2016, part 2

This commentary is the second of a three-part series featuring what cybersecurity thought leaders expect to see in the coming year.

Remember when weather forecasts were often wrong? Rain fell on many a picnic before technological advances finally enabled meteorologists to make accurate predictions.

Predicting cybersecurity’s future is likely to be at least as humbling as weather forecasting was in the old days. Everything could change in the twinkling of a virtual eye: a new hacking technique catching us unawares, or a disruptive innovation altering the landscape permanently (or until the next innovation comes along).

As with weather forecasters before and now, though, we don’t need a crystal ball to reasonably guess where cybersecurity is headed in the near future. Paying close attention to what’s happening now can give us a good idea of trends to follow, and where they’re most likely headed.

Here are some important developments I see occurring in 2016:

Cell phones as ‘pass phones’

In the never-ending quest to replace the password, we’ll likely see more opportunities to authenticate using our cell phones instead — adding convenience and increased security to the user experience. Here are some “pass phone” technologies that may become more prevalent in 2016:

A connected new world

Chances are greater than ever that many of you will adjust your thermostats using your phone in 2016, or keep an eye on your home via remote-access security cameras. You may have a refrigerator that alerts you when you run out of milk, or an app that lets you start your car remotely for a warm-up. Driverless cars almost certainly won’t become commonplace — yet — but your new vehicle should be able to talk to you, helping you to avoid traffic jams, perhaps, with alerts and alternate-route suggestions.

JR Reagan writes regularly for FedScoop on technology, innovation and cybersecurity issues.

As many as 5.5 million additional objects will be connected to the Internet every day in 2016, one security firm predicts — an increase of 30 percent over 2015 connections. The new horizon in the coming year: the workplace. As connectivity extends into our offices and industrial plants, we can expect not only more efficient lighting and climate control but also safer work environments, with sensors detecting when equipment needs repair or replacement, for instance. And, with companies racing to develop technologies to support this connected new world, we should see major advances in data processing and visualization, communication, and, yes, security.

[Read more: 5 cybersecurity trends to watch for in 2016, part 1]

‘Islands’ of data

Data, data everywhere: The staggering proliferation of data should continue, especially as devices on the “Internet of Things” collect and disseminate it for private and public benefit. Properly analyzed, all this data could be incredibly useful. Why do women live longer in Japan than in any other country in the world? Gleaning information on eating and exercise habits in that country, as well as incomes, environmental factors and more, could help the rest of the world understand what the Japanese are doing right — and adjust our own behaviors accordingly.

International privacy laws, however, will continue to limit our ability to share data across borders. Already more than 100 nations have adopted laws governing the transfer of citizens’ personal information. With the E.U. set to issue its Data Protection Regulation in early 2016 and governments, including the U.S., adopting or at least considering similar (but unique) laws, sharing could become more difficult.

Instead, discrete “data islands” may form, separated by a morass of laws and regulations. Not only will these restrictions make it harder for us to see the big picture, but they could also undermine the way we conduct international business. Firms with offices in multiple countries will have to work hard to keep abreast of laws and regulations in all the locales they serve — and we’ll need to put on our thinking caps to figure out how best to balance the needs for privacy and security while serving our customers and clients.

More CISO ‘boarding’ calls

The chief information security officer is coming of age. In the rush to protect their data — now among an organization’s most valuable assets — a number of major corporations added cybersecurity experts to their boards in 2015.

In the coming year, I think we’ll see many more organizations sprinting to add CISOs to their boards as they realize the risks data breaches pose to their brands and bottom lines. Savvy CISOs will take note, increasing their knowledge and understanding of business so that, when asked to move from the backroom to the boardroom, they’ll be ready.

[Read more: Déjà vu for the CISO — and why, like CIOs, they need to start thinking like business people]

A brightening ‘cyber poverty line’

As major security breaches continue to make the news, many organizations are realizing the importance of strong, effective, comprehensive cybersecurity — and reeling at its price.

Good cybersecurity doesn’t come cheaply — nor, with demand for cybersecurity specialists outpacing supply, is the price likely to drop in the coming year. But can any organization afford to skimp? Data breaches are said to cost companies $3.8 million per incident, on average.

Investing in a strong digital security program could make the difference for many companies in 2016. In the competitive race, we could see loss of reputation, litigation, and recovery costs hamstringing the cybersecurity “have nots,” while the “haves” — strong, secure, and resilient — pull far ahead. In which group will your organization or agency be?

These trends and others foretell exciting advances in cybersecurity in the coming year, and a possible shift from reactive to proactive; from rigid to resilient. Which developments do you think will have the greatest impact in 2016? How will you stay ahead of the cybersecurity curve?

JR Reagan is the global chief information security officer of Deloitte. He also serves as professional faculty at Johns Hopkins, Cornell and Columbia universities. Follow him @IdeaXplorer. Read more from JR Reagan.