NIST studies algorithms to protect data from advanced computing

The National Institute of Standards and Technology is considering recommendations on algorithms that could secure information against attacks by quantum computers.

But beyond that, the goal is to use the algorithms to create a set of standards for protecting electronic information from attack by the computers of today and tomorrow.

Work is continuing on the development of the 26 algorithms for potential standardization to protect sensitive electronic information across multiple industries, including healthcare, NIST researchers say.

The prospect of wider use of quantum computers is lending new urgency to the NIST work. For two decades, researchers have been working to develop a quantum computer with the potential to offer exceptional power, processing information significantly faster than any computer can today.

Such information technology would be able to run data simulations significantly faster than is possible now. That could bring benefits, such as accelerating the pace at which new medicines, medical devices and other innovations are designed, developed and marketed, says Dustin Moody, a mathematician at NIST.


NIST is considering the 26 algorithms to ramp up data protection. “We want to look at how these algorithms work, not only in big computers and smartphones, but also in devices that have limited processor power,” Moody notes. “Smart cards, tiny devices for use in the Internet of Things and individual microchips all need protection, too.”



“These 26 algorithms are the ones we are considering for potential standardization, and for the next 12 months, we are requesting that the cryptography community focus on analyzing their performance,” Moody explains. “We want to get better data on how they will perform in the real world.”

At present, all the algorithms look promising, Moody adds, but scientists, researchers, corporations and other entities will now spend a couple of years selecting one or two algorithms for development and potential widespread implementation. Then, there will be a transition period to phase out the old algorithms and put the new ones in place.
