
In cybersecurity, it’s physics to the rescue

Commentary: As computing technology evolves, how will cybersecurity need to change to keep up?

Do you remember when a computer filled an entire room? Not only were they behemoths, they were also slow by today’s standards. Over time, however, the computer became increasingly compact even as its capabilities and speed increased. Today, a device that fits in the palm of a hand can perform sophisticated tasks in the blink of a virtual eye.

Computers are getting smaller and more powerful all the time, as predicted by Moore’s law. That 1965 observation, now an industry truism, holds that the number of transistors that fit on a processing chip doubles roughly every two years as transistors shrink, steadily compounding computing power with each generation.
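To make that doubling cadence concrete, here is a back-of-the-envelope sketch (the starting figures are illustrative additions, not from this commentary): the Intel 4004, released in 1971, packed about 2,300 transistors, and doubling every two years projects the billions of transistors found on chips by 2015.

```python
# Back-of-the-envelope Moore's law arithmetic: project transistor counts
# from the Intel 4004 (1971, ~2,300 transistors) by doubling every two years.
base_year, base_transistors = 1971, 2300
target_year = 2015

doublings = (target_year - base_year) // 2  # 22 doublings over 44 years
projection = base_transistors * 2**doublings

print(f"{doublings} doublings -> roughly {projection:,} transistors")
# Output: 22 doublings -> roughly 9,646,899,200 transistors
# Billions of transistors: the same order of magnitude as real 2015-era chips.
```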

Now, though, experimental transistors have been built at the scale of a single atom. Has computer technology hit its limit?


JR Reagan writes regularly for FedScoop on technology, innovation and cybersecurity issues.


We simply can’t get any smaller, not without going down to the subatomic realm. Doing so brings a new set of challenges: In the tiny universe, the laws of physics differ dramatically from what we experience on the human scale. Subatomic particles such as electrons, for instance, can effectively exist in two places at the same time. How do we design reliable, predictable information systems in such an unpredictable environment?

Science leads the way

But technology’s progress marches inexorably on, and scientists today are developing tomorrow’s alternatives to our current, silicon-based computing. Physicists, biologists and others are researching not only computing using subatomic particles — “quantum” computing — but also the use of magnetic waves, DNA and other materials to transmit the 0s and 1s that make up digital information.

Although all of these technologies are a long way from our laptops and phones — the few quantum computers in operation today fill entire rooms — some promise processing speeds and complexities exponentially greater than our computers produce today. One quantum machine, for example, has been reported to work through certain specialized problems more than 100 million times faster than a conventional single-core processor.

Computing is in its infancy, it seems safe to say — and so, by extension, is cybersecurity. The advent of any of these new systems could render current information-security technologies obsolete, and might even yield data that protects itself. Hacking a silicon-based computer network involves cracking the codes written in binary digits, or “bits,” to access the information stored as all those 1s and 0s. But what if there were no code to crack?

Advertisement

Our ‘rules’ don’t apply

Quantum computing works with information stored not in bits but in “qubits,” encoded in the quantum states of particles such as electrons or photons. According to the laws of quantum physics, a qubit doesn’t necessarily have a single assigned value. It can act as a 1, a 0, or a blend of both at once — a state physicists call superposition. If these slippery laws of subatomic nature challenge physicists, how would hackers work around them?
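In standard textbook notation (added here for illustration, not from the original commentary), a qubit’s state is a weighted superposition of the two classical values:

```latex
% A qubit as a superposition of the basis states |0> and |1>:
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
\]
% Measuring the qubit yields 0 with probability |alpha|^2 and 1 with
% probability |beta|^2 -- and the act of measuring collapses the state.
```

That collapse on measurement is exactly what makes an eavesdropper conspicuous, as the example below illustrates.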

For example, scientists recently developed a technique for “disentangling” photons, which are created in pairs, and isolating them from each other. Although separated, they retain their bonded quality — what affects one, affects the other. Heralded as a major breakthrough, this new technique for generating single photons could lead to quantum computing in which tightly bonded photon pairs transmit and receive information — a process likened to communicating via tin can “telephones.” Because any interception would disturb the bond — like cutting the string between the cans — both senders and recipients would know instantly when a breach occurred.
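The detection principle behind the tin-can analogy can be sketched in a few lines of code. This is a deliberately simplified, hypothetical model — not any real quantum protocol’s implementation — assuming only that an interceptor who measures a quantum signal disturbs it about half the time, so the endpoints can compare notes and spot the resulting errors.

```python
import random

def transmit(bits, eavesdropper_present):
    """Send bits over a channel; an eavesdropper disturbs the signal."""
    received = []
    for bit in bits:
        # Measuring in a mismatched basis randomizes the bit about half
        # the time -- the "cut string" between the tin cans.
        if eavesdropper_present and random.random() < 0.5:
            bit = random.randint(0, 1)
        received.append(bit)
    return received

def error_rate(sent, received):
    return sum(s != r for s, r in zip(sent, received)) / len(sent)

random.seed(1)
sent = [random.randint(0, 1) for _ in range(10_000)]
print("error rate, clean channel:    ", error_rate(sent, transmit(sent, False)))
print("error rate, with interception:", error_rate(sent, transmit(sent, True)))
# A clean channel shows ~0% errors; an intercepted one shows ~25%,
# so both sender and recipient learn of the breach immediately.
```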

DNA computing, which uses strands of DNA to process information, and magnetic-wave computing, which uses “solitons” — invisible, naturally occurring solitary waves — to transmit data, also hold promise for computing that is faster, more efficient and more secure.

On the human timeline, the Information Age warrants barely a blip: The World Wide Web was introduced only in 1991. Information security, too, is new, with large-scale thefts of personal and business data occurring only in recent years. Where it’s headed next seems fairly clear — away from device-centered approaches and toward ones focused on securing data itself, no matter where it comes from or where it is going.


Seizing the day for cyber

Until now, cybersecurity has consisted mainly of putting “locks” on existing devices to protect a known universe of things — a universe that now includes the Internet of Things. But with computing technology poised on the brink of momentous change, we in the profession have a unique opportunity to start again and get security right this time.

We can’t know what tomorrow holds, device-wise, but we do know there will be data — and that’s what we’ve always needed to protect. Keeping that aim in mind may help us transform cybersecurity from an afterthought “tacked on” to protect a known universe of things into an essential feature designed for the great technological unknown.

Whether the next big technology turns out to be quantum, DNA or magnetic-wave computing — or something else entirely — the time is now for the cybersecurity profession to join the conversation. We could be on the ground floor of something big. We would do well to educate ourselves, and to work with scientists, the government and the private sector, to ensure that the next generation of computing technology is at least as secure as it is fast and effective.

JR Reagan is the global chief information security officer of Deloitte. He also serves as professional faculty at Johns Hopkins, Cornell and Columbia universities. Follow him @IdeaXplorer. Read more from JR Reagan.
