Oops! Wearables can leak your PINs and passwords

The security nightmare posed by the Internet of Things isn’t just down to the lack of security expertise at the kinds of companies adding connectivity to gizmos and gadgets.

It’s the sensitivity of the connected sensors, strewn hither and thither, opening up potential attack vectors for determined hackers. Hence the need for really robust security thinking to lock down the risks.

To wit: wearables.

Collaborative research by a team from the electrical and computer engineering departments at the Stevens Institute of Technology and Binghamton University in New York State has demonstrated how a wearable device such as a smartwatch could end up compromising a user’s PIN, thanks to the motion-sensing data it generates.

The team combined wearable sensor data harvested from more than 5,000 key-entry traces made by 20 adults with an algorithm they created to infer key-entry sequences from hand movements. They applied the technique to different types of keypads (including ATM-style and Qwerty variants) and to three different wearables (two smartwatches and a nine-axis motion-tracking device).
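
To get a feel for why keypad geometry matters here, consider a rough sketch in Python (purely illustrative; the layout, key spacing and function names below are assumptions, not the researchers’ code) that models an ATM-style keypad as a grid of key centres, so an estimated hand movement between presses can be matched against candidate key transitions:

```python
# Hypothetical ATM-style keypad modeled as a coordinate grid, so that an
# estimated hand displacement between two presses can be compared against
# the physical distance separating candidate keys.

ATM_KEY_PITCH_MM = 15.0  # assumed spacing between key centres, in millimetres

# Key label -> (column, row) on a standard 3x4 ATM layout
ATM_LAYOUT = {
    "1": (0, 0), "2": (1, 0), "3": (2, 0),
    "4": (0, 1), "5": (1, 1), "6": (2, 1),
    "7": (0, 2), "8": (1, 2), "9": (2, 2),
    "*": (0, 3), "0": (1, 3), "#": (2, 3),
}


def key_position_mm(key):
    """Centre of a key in millimetres, relative to the "1" key."""
    col, row = ATM_LAYOUT[key]
    return col * ATM_KEY_PITCH_MM, row * ATM_KEY_PITCH_MM


def displacement_mm(from_key, to_key):
    """Vector the hand travels when moving from one key to another."""
    x0, y0 = key_position_mm(from_key)
    x1, y1 = key_position_mm(to_key)
    return x1 - x0, y1 - y0


# Moving from "5" to "#" means roughly 15 mm right and 30 mm down
print(displacement_mm("5", "#"))  # (15.0, 30.0)
```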

The result? They were able to crack PINs with 80 per cent accuracy on the first attempt, and more than 90 per cent accuracy after three tries… Ouch. Albeit, I guess you can say wearables are useful for something then.

Here’s a description of the work from their research paper:

In this work, we show that a wearable device can be exploited to discriminate mm-level distances and directions of the user’s fine-grained hand movements, which enable attackers to reproduce the trajectories of the user’s hand and further to recover the secret key entries. In particular, our system confirms the possibility of using embedded sensors in wearable devices, i.e., accelerometers, gyroscopes, and magnetometers, to derive the moving distance of the user’s hand between consecutive key entries regardless of the pose of the hand. Our Backward PIN-Sequence Inference algorithm exploits the inherent physical constraints between key entries to infer the complete user key entry sequence.
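
Stripped down, the distance-recovery step is a dead-reckoning exercise: integrate acceleration once to get velocity, then again to get displacement, over the window between two key presses. The sketch below is a deliberately naive illustration of that idea (my own simplification, not the researchers’ code); a real attack would first rotate the readings into a fixed reference frame using the gyroscope and magnetometer, and correct for integration drift.

```python
import numpy as np


def displacement_between_keystrokes(accel_mps2, dt):
    """Double-integrate gravity-compensated acceleration samples (an array of
    shape [n_samples, 3], in m/s^2) captured between two key presses, giving
    an approximate displacement vector in metres."""
    velocity = np.cumsum(accel_mps2, axis=0) * dt   # integrate acceleration -> velocity
    displacement = np.sum(velocity, axis=0) * dt    # integrate velocity -> displacement
    return displacement


# Toy usage: 0.2 s of constant 0.5 m/s^2 acceleration along x, sampled at 100 Hz
samples = np.tile([0.5, 0.0, 0.0], (20, 1))
print(displacement_between_keystrokes(samples, dt=0.01))  # ~[0.0105, 0, 0] m, about a centimetre
```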

The research was reported earlier by IEEE Spectrum. One of the researchers, professor Yan Wang, told IEEE it’s the volume of sensors in wearables that enables the technique to work, by providing “sufficient information” about hand movements. So clearly more can in fact mean less (secure).

To eliminate errors when calculating distance moved from acceleration data, he said, the team worked backwards from the final movement in an input sequence, which was most likely the press of the enter key on the keypad, allowing them to infer the rest of the key presses.
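
Conceptually, that backward pass might look something like this (a simplified, hypothetical sketch that assumes clean displacement estimates and reuses the same toy keypad grid as above):

```python
import math

ATM_KEY_PITCH_MM = 15.0
ATM_LAYOUT = {
    "1": (0, 0), "2": (1, 0), "3": (2, 0),
    "4": (0, 1), "5": (1, 1), "6": (2, 1),
    "7": (0, 2), "8": (1, 2), "9": (2, 2),
    "*": (0, 3), "0": (1, 3), "#": (2, 3),
}


def pos_mm(key):
    col, row = ATM_LAYOUT[key]
    return col * ATM_KEY_PITCH_MM, row * ATM_KEY_PITCH_MM


def infer_backwards(displacements_mm, final_key="#"):
    """Walk the estimated inter-keystroke displacements in reverse, starting
    from the known final key (enter, "#" here), and pick the key whose centre
    best explains where the hand came from at each step."""
    sequence = [final_key]
    current = final_key
    for dx, dy in reversed(displacements_mm):
        cx, cy = pos_mm(current)
        came_from = (cx - dx, cy - dy)  # undo the forward movement
        current = min(ATM_LAYOUT, key=lambda k: math.dist(pos_mm(k), came_from))
        sequence.append(current)
    return list(reversed(sequence))


# Toy example: the true entry was 2-5-8-0-#, i.e. three 15 mm moves straight
# down the middle column, then 15 mm right onto enter.
moves = [(0.0, 15.0), (0.0, 15.0), (0.0, 15.0), (15.0, 0.0)]
print(infer_backwards(moves))  # ['2', '5', '8', '0', '#']
```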

The attack method would not require a hacker to be nearby when a person inputs their PIN. Instead, the necessary data could be stolen by a wireless sniffer placed close to a keypad to capture the Bluetooth packets being sent from the wearable to a smartphone, or by malware installed on the wearable or smartphone that intercepts the data and sends it on to the attacker.

And while most PINs are just a handful of digits, the team believes the technique could actually be used to power a full keylogger.

“This can be extended to snoop keystrokes and interpret people’s passwords or what has been typed,” professor Yingying Chen, another of the researchers involved in the project, told TechCrunch. “We have another research project about this.”

“Both smart watches and fitness bands pose a risk,” she added of the overall vulnerability.

One way to avoid the risk of your smartwatch or fitness bangle leaking your PIN to a determined hacker is to input the digits with your other, non-wearable-wearing hand. Chen confirmed this would prevent the technique from working.

An alternative strategy for those who do wear a wearable on the hand they use to enter PINs and passwords is to add some ‘noise’ to the operation by randomly jerking their hand between key presses, said Wang. Which won’t look at all weird.

Fixing the vulnerability at source would require wearable manufacturers to better secure sensing data being generated by the devices, according to Wang.

He added they could also obscure the signal being leaked by the sensors by injecting noise into the data so it could not be so easily reverse engineered.
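
A minimal sketch of what that could look like, assuming raw accelerometer samples and a simple Gaussian noise model (this is not any vendor’s actual countermeasure): coarse features such as step counts survive a little noise, while the millimetre-level double integration the attack depends on degrades quickly.

```python
import numpy as np


def add_sensor_noise(samples, sigma_mps2=0.2, rng=None):
    """Return accelerometer samples (shape [n, 3], in m/s^2) with zero-mean
    Gaussian noise added before they are exposed to apps or sent off-device."""
    rng = rng or np.random.default_rng()
    return samples + rng.normal(0.0, sigma_mps2, size=samples.shape)


# Usage: perturb a batch of readings before handing them to an app
noisy = add_sensor_noise(np.zeros((100, 3)))
```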

On the signal obfuscation front, at its WWDC developer conference this summer, wearable maker Apple announced it will use a technique called differential privacy in the forthcoming version of its mobile OS, iOS 10, to help obscure individuals’ personal data while still allowing large-scale trends to be inferred by analyzing the data in bulk.
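
For context, the textbook form of that idea is the Laplace mechanism: release a statistic only after adding noise calibrated to how much any single person could change it. The sketch below is a generic illustration of differential privacy, not Apple’s actual implementation:

```python
import numpy as np


def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Epsilon-differentially-private release of a numeric statistic: add
    Laplace noise with scale sensitivity/epsilon, so any one person's
    contribution is hidden while bulk trends stay measurable."""
    rng = rng or np.random.default_rng()
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)


# e.g. reporting how many users typed a given phrase today; one user can
# change the true count by at most 1, so sensitivity = 1
print(laplace_mechanism(true_value=123_456, sensitivity=1.0, epsilon=0.5))
```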

Safe to say, in a security sense, having more noise with your signal can actually be a boon.