Google Assistant may be vulnerable to attacks via subsonic commands


Google Assistant

  • A new study claims that Google Assistant and other voice-command-based AI services like Alexa and Siri may be vulnerable to subsonic commands.
  • The study says that while these commands cannot be heard by humans, they can be detected by Google Assistant, Siri and Alexa.
  • In theory, cybercriminals could use these commands to order these services to purchase products, launch websites and more.

We have already seen that voice-based AI services like Google Assistant can be triggered accidentally by something as mundane as a TV commercial. Now a new study claims that Google Assistant, along with rivals like Apple's Siri and Amazon's Alexa, could be vulnerable to sound commands that can't even be heard by humans.

According to The New York Times, the research was conducted by teams at the University of California, Berkeley and Princeton University in the US, along with China's Zhejiang University. The researchers say they have found a way to cancel out the sounds that Google Assistant, Siri, and Alexa would normally hear and replace them with audio files that the human ear cannot detect, but that the machine learning software powering these digital assistants can still pick up and act on.
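The article doesn't spell out the mechanics, but one widely reported way to make a command inaudible, demonstrated in Zhejiang University's earlier "DolphinAttack" research, is to amplitude-modulate ordinary speech onto an ultrasonic carrier: humans can't hear the result, but nonlinearities in microphone hardware demodulate it back into the audible range the assistant listens to. The Python sketch below illustrates that idea only; the file names, 25 kHz carrier, and sample rate are assumptions for illustration, not details taken from the study.

```python
# A minimal sketch of the inaudible-command idea, in the DolphinAttack style:
# amplitude-modulate an audible voice command onto an ultrasonic carrier.
# The file names, carrier frequency, and sample rate are illustrative
# assumptions; they are not parameters from the study itself.
import numpy as np
from math import gcd
from scipy.io import wavfile
from scipy.signal import resample_poly

CARRIER_HZ = 25_000   # above the ~20 kHz upper limit of human hearing
OUT_RATE = 96_000     # output sample rate high enough to carry 25 kHz

# Load an ordinary, audible voice command (hypothetical file).
in_rate, command = wavfile.read("ok_google_command.wav")
command = command.astype(np.float64)
if command.ndim > 1:
    command = command.mean(axis=1)   # mix stereo down to mono
peak = np.max(np.abs(command))
if peak > 0:
    command /= peak                  # normalize to [-1, 1]

# Upsample so the ultrasonic carrier can be represented at all.
g = gcd(OUT_RATE, in_rate)
command = resample_poly(command, OUT_RATE // g, in_rate // g)

# Amplitude modulation: the envelope carries the speech, the carrier keeps
# everything above the audible band. Nonlinearities in microphone hardware
# can demodulate the envelope back into the band the assistant listens to.
t = np.arange(len(command)) / OUT_RATE
carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
modulated = (0.5 + 0.5 * command) * carrier

wavfile.write("inaudible_command.wav", OUT_RATE, (modulated * 32767).astype(np.int16))
```

Playing a file like this back requires a speaker that can actually reproduce ultrasonic frequencies, which is one reason such attacks are harder to pull off in the wild than in a lab.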

So what does that mean? In theory, the researchers claim, cybercriminals could use these subsonic commands to cause all sorts of havoc. They could embed audio in a YouTube video or on a website that causes Google Assistant to order products online without your consent, launch malicious sites, and more. If a speaker like Google Home is connected to smart home devices, these stealth commands could conceivably tell your security cameras to shut down, your lights to turn off, and your door to unlock.

The good news is that there is no evidence these kinds of subsonic commands are being used outside the university research facilities that discovered them in the first place. When asked for comment, Google said that Assistant already has features to defeat such commands, and Apple and Amazon likewise said they have taken steps to address these concerns. Hopefully, these companies will continue to develop security measures against this kind of threat.
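None of the companies have said what their countermeasures actually are, so purely as an illustrative sketch: one plausible defense against carrier-based inaudible commands is to band-limit microphone input to the human voice range before recognition, and to flag clips with suspicious ultrasonic energy. The cutoff values and function names below are assumptions, not anything Google, Apple, or Amazon has described.

```python
# A sketch of two simple, assumed countermeasures: strip everything above
# the voice band before recognition, and flag clips whose spectrum holds
# suspicious amounts of ultrasonic energy. Cutoff values are illustrative.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandlimit_to_voice(audio: np.ndarray, sample_rate: int,
                       cutoff_hz: float = 8_000.0) -> np.ndarray:
    """Zero-phase Butterworth low-pass: discards any ultrasonic carrier
    before the audio ever reaches the speech recognizer."""
    sos = butter(8, cutoff_hz, btype="lowpass", fs=sample_rate, output="sos")
    return sosfiltfilt(sos, audio)

def ultrasonic_energy_ratio(audio: np.ndarray, sample_rate: int,
                            limit_hz: float = 20_000.0) -> float:
    """Fraction of spectral energy above the limit of human hearing; a high
    ratio is a cheap heuristic that a clip may hide an inaudible command."""
    spectrum = np.abs(np.fft.rfft(audio)) ** 2
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sample_rate)
    total = spectrum.sum()
    return float(spectrum[freqs > limit_hz].sum() / total) if total > 0 else 0.0
```

Note that filtering only helps against ultrasonic carriers; hidden commands embedded inside otherwise ordinary, audible audio would need defenses in the recognition model itself.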



from Android Authority https://ift.tt/2G7U1Ch
via IFTTT
