Sounds like an easy fix would be to ignore commands given at frequencies higher than what the human ear can hear, similar to how Google fixed the Burger King commercial “hack”.
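A minimal sketch of that idea as a low-pass filter over the captured audio, assuming the ultrasonic content actually survives into the digitized signal (the Princeton-described attack reportedly relies on microphone hardware nonlinearity demodulating the ultrasound down into the audible band, so a software filter alone may not be sufficient). The function names and cutoff here are illustrative, not from any real assistant's pipeline:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def strip_ultrasonic(samples, sample_rate, cutoff_hz=18000):
    """Low-pass the captured audio so content above the audible
    range never reaches the speech recognizer."""
    sos = butter(10, cutoff_hz, btype="low", fs=sample_rate, output="sos")
    return sosfiltfilt(sos, samples)

# Demo: a 1 kHz audible tone plus a 21 kHz "hidden" carrier, sampled at 48 kHz
fs = 48000
t = np.arange(fs) / fs
audible = np.sin(2 * np.pi * 1000 * t)
ultrasonic = np.sin(2 * np.pi * 21000 * t)
filtered = strip_ultrasonic(audible + ultrasonic, fs)
```

After filtering, the 21 kHz component is heavily attenuated while the 1 kHz tone passes through essentially untouched.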
The slightly reassuring news here is that, to succeed, the attacker must get physically close to the target device and place a speaker capable of transmitting ultrasonic sound near it, according to the Princeton group.
Would be cool to use this hack with something like Tasker to take advantage of the Simon Says feature on the Echo to finally get notifications from Alexa.
The solution to this is fairly simple: voice identification. Once I establish myself as the admin of the device, I authorize new voices.
Then a hacker would have to accurately mimic my voice to access it.
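The enroll/authorize flow being described could be sketched roughly like this. Everything here is hypothetical: `VoiceAuthorizer` is not a real API, and the embeddings would in practice come from a speaker-recognition model rather than being plain vectors:

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two voice embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

class VoiceAuthorizer:
    """Toy enrollment/verification flow. In a real system each
    embedding would come from a speaker-embedding model run on a
    voice sample; here they are just illustrative vectors."""

    def __init__(self, threshold=0.8):
        self.threshold = threshold
        self.enrolled = []  # embeddings of voices the admin has authorized

    def enroll(self, embedding):
        self.enrolled.append(np.asarray(embedding, dtype=float))

    def is_authorized(self, embedding):
        emb = np.asarray(embedding, dtype=float)
        return any(cosine_similarity(emb, e) >= self.threshold
                   for e in self.enrolled)
```

A command would only run if `is_authorized` matches the speaker against an enrolled voice, which is exactly why the attacker would then need a convincing imitation of the admin's voice.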