Ride the Lightning

Cybersecurity and Future of Law Practice Blog
by Sharon D. Nelson Esq., President of Sensei Enterprises, Inc.

Siri, Google Assistant and Bixby Hacked Using Ultrasonic Waves

March 5, 2020

Naked Security reported some unsettling news on March 2nd. Researchers have demonstrated how Siri, Google Assistant and Bixby can be secretly activated and commanded to make a call, take photos and read text messages – without ever touching the device.

Known as SurfingAttack, the technique was demonstrated by a US-Chinese university team, which proved that voice assistants can be remotely controlled using ultrasonic waves. As readers know, voice assistants will "wake" when they hear the owner's voice and a "trigger phrase" such as "Ok, Google."

When we say these commands, they are really just sound waves, and they can be replicated using ultrasonic waves, which we can't hear. Two caveats: the attacker must have a line of sight on the device and be fairly close to it.

SurfingAttack adds the ability to transmit the ultrasonic commands through a solid glass or wood table on which the smartphone is sitting, using a circular piezoelectric disc attached to the table's underside.
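The core trick behind attacks like this is standard signal processing: an audible voice command is amplitude-modulated onto an ultrasonic carrier, and the phone microphone's own nonlinearity demodulates it back into the audible band. Here is a minimal sketch of that modulation step in Python, purely for illustration; the 25 kHz carrier, the sample rate, and the toy sine-wave "voice" are my assumptions, not details from the published research.

```python
import numpy as np

FS = 192_000       # high sample rate, needed to represent an ultrasonic carrier
CARRIER_HZ = 25_000  # hypothetical carrier above the ~20 kHz limit of human hearing

def modulate_command(voice: np.ndarray) -> np.ndarray:
    """Amplitude-modulate an audible command onto an ultrasonic carrier.

    Humans can't hear the result, but a microphone's nonlinear response
    recovers the audible baseband, so the assistant still "hears" it.
    """
    t = np.arange(len(voice)) / FS
    carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
    # Classic AM: the voice signal rides on the carrier's amplitude envelope.
    return (1.0 + voice) * carrier

# Toy stand-in for a recorded command: one second of a 440 Hz tone.
t = np.arange(FS) / FS
voice = 0.5 * np.sin(2 * np.pi * 440 * t)
signal = modulate_command(voice)
```

All of the signal's energy now sits near 25 kHz (the carrier plus sidebands at ±440 Hz), which is why a speaker playing it sounds silent to anyone nearby.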

Mind you, while the distance in the demonstration was only 17 inches, hiding the disc under the surface is a major advance since it conceals the attack.

If you read the story, you can watch a video in which a remote laptop uses a text-to-speech (TTS) module to generate simulated voice commands, which are then transmitted to the disc over Wi-Fi or Bluetooth.

The researchers tested the method on 17 different smartphone models from Apple, Google, Samsung, Motorola, Xiaomi, and Huawei, successfully deploying SurfingAttack against 15 of them.

And what could they do once they activated the voice assistants? They could command them to unlock devices, take repeated selfies, make fraudulent calls and even get the phone to read out a user's text messages, including SMS verification codes.

Clever to the end, the researchers recorded responses using a concealed microphone after turning down the device's volume, so this communication would not be heard by a nearby user in an office setting. Like a law firm, for example.

Yes, I hear you saying that voice assistants should only respond to the owner's voice. Not so fast. The owner's voice can now be cloned using machine learning software such as Lyrebird, as was the case in this test. It's a defense of course – the need to capture and clone the victim's voice. But not much of a defense to a determined attacker with access to the owner's voice.

The two phones that did not fall to SurfingAttack were Huawei's Mate 9 and Samsung's Galaxy Note 10. Why? The materials from which they were constructed dampened the ultrasonic waves. Putting the smartphone on a tablecloth strengthened the defense.

While hacking voice assistants has not, to my knowledge, been seen in the wild, it's just a matter of time. This demonstration may spur smartphone manufacturers to come up with better countermeasures.

Sharon D. Nelson, Esq., President, Sensei Enterprises, Inc.
3975 University Drive, Suite 225|Fairfax, VA 22030
Email: Phone: 703-359-0700
Digital Forensics/Cybersecurity/Information Technology
https://senseient.com
https://twitter.com/sharonnelsonesq
https://www.linkedin.com/in/sharondnelson
https://amazon.com/author/sharonnelson