Smart virtual assistants can make daily life more organized, syncing your schedules across your smartphones and tablets. Others are simply used for music, streaming, or news.
While Amazon’s Alexa and Google Home listen to your commands and comments, is the information recorded? Moreover, who has access to these sound files? What is done with sound files that do not include commands?
Tim Verheyden, a journalist at the Belgian broadcaster VRT, approached the Waasmunsters, a Belgian couple who owned a Google Home, with a segment of audio recorded in their home. The clip captured their son and grandson, and it included the couple's actual address. All findings were covered in a report by VRT.
Google hires contractors who comb through thousands of audio files collected from people's homes, and it was through one of these contractors that the journalist gained access to over 1,000 audio files. Such files can be drawn from smart speakers, home security systems, or phone assistants like Siri.
These files can contain sensitive information mentioned in private conversation or on a phone call. If you pay bills over the phone, your assistant has likely heard your card or account details. Clearly, such recordings need to stay in the right hands.
Is It Accidental?
Google Assistant smart speakers are set to 'wake up' when they hear certain command words, which means they are always listening for them. Because speech patterns differ from person to person, the devices often activate on a false trigger, a word that merely sounds similar.
They automatically stop once they determine that whatever you are saying is not a command. The recording is then saved for analysis to diagnose the bug that caused the device to wake up or misunderstand your command.
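The flow described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not Google's actual implementation: the wake word, the similarity scoring, and the 0.8 threshold are all assumptions chosen for the example.

```python
# Hypothetical sketch of a wake-word pipeline. The wake word, the toy
# similarity function, and the threshold are illustrative assumptions,
# not Google's real acoustic model.

WAKE_WORD = "hey google"
WAKE_THRESHOLD = 0.8  # similarity score needed to trigger


def similarity(heard: str, target: str) -> float:
    """Toy stand-in for an acoustic model: the fraction of matching
    characters between what was heard and the wake word."""
    matches = sum(a == b for a, b in zip(heard.lower(), target))
    return matches / max(len(heard), len(target))


def process(heard: str, follow_up: str, known_commands: set[str]) -> str:
    """Decide what happens to a snippet of audio."""
    if similarity(heard, WAKE_WORD) < WAKE_THRESHOLD:
        return "ignored"            # never leaves the device
    if follow_up in known_commands:
        return "command"            # executed normally
    return "saved_for_review"       # false trigger, kept for analysis


# A sound-alike phrase can clear the threshold and cause a false trigger:
print(process("hey googme", "just chatting", {"play music", "set a timer"}))
# → saved_for_review
```

The key point the sketch makes is that the decision to record happens before the device knows whether it heard a real command, so sound-alike phrases inevitably produce recordings of ordinary conversation.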
Having clips of speech that was not directed at the device is alarming, even if it is sometimes accidental. It is commonly understood that anything you ask a smart speaker is stored: search once for houses for sale in Burlington, and ads for those homes will appear on all of your connected accounts.
Other files recorded in homes in the region included private phone calls and conversations about a child growing up or injuries healing. Discussions of love lives were heard, and even declarations that someone was using the bathroom. These are obviously not things people are comfortable sharing with strangers.
Google’s Need For Audio Clips
Why does Google need these clips? Google uses the audio files to evaluate how well its software processes human speech, chiefly by checking the quality of the Google Assistant's responses. Voice samples are also added to a bank where accents and tonal variations in speech are catalogued so they can be understood electronically.
The sensitivity of the information is the primary concern among those who own the devices. Some strongly believe this practice violates the GDPR, the European Union's new data privacy law introduced last year. The law is specifically meant to protect sensitive information, from medical to personal matters, and requires those with access to the data to disclose how it is used and for what.
Google uses the app Crowdsource to send contractors assignments granting access to the audio sequences. Google says the process is private: clips are not associated with user accounts and each is shared with only one or two people. But is that still too many for private information? Unfortunately, the technology to fully analyze this audio automatically does not yet exist, and humans are still required to process it accurately.
The Google contractor who spoke out cited several recordings that had caught his attention. One involved a woman who seemed to be in distress, likely from physical violence. While the incident had long since passed, Google had given no instructions on what contractors should do in such a situation.
By giving the audio files to Verheyden, the contractor may have breached his contract with Google, and the act itself publicized private recordings even further. Google has responded that only 0.2 percent of Google Assistant conversations are kept for analysis.
Data Privacy Laws
The GDPR strictly forbids the disclosure of users' health information, and several of the leaked Google clips were health-related. Explicit consent from the user is required merely to collect health information, let alone to send it out to contractors or release it to the press.
GDPR also requires companies to be fully transparent about how information is gathered and what is done with it. The Google Home webpage, however, only states that clips are used to improve the language-recognition software, not exactly how that is done.
The company also states that only conversations following a wake word are recorded, but as we know, these devices can misrecognize the wake word. Michael Veale, a technology policy researcher at London's Alan Turing Institute, has said this disclosure may not be specific enough to satisfy GDPR requirements.
Google has not said it will stop these recordings, but it does promise to make clearer to users when and how their data is being used. This means changes may be made to the company's site to address these issues, but it will continue using sound clips to improve the software's capabilities.
Apple has faced issues with GDPR because users cannot access their own recordings, though the company says all information is stored securely. Amazon, on the other hand, now gives users the option to have the day's recordings deleted, as long as you remember to command Alexa to do so. None of these companies fully describes the analysis process, though each claims the data is used for the same quality-assurance purposes.
Know Your Data’s Privacy
Alexa, Siri, and Google Assistant are all under scrutiny for their data habits. While no evidence of malicious use has surfaced, the suspicion mostly reflects the public's inability to know what is being recorded. All of this information is stored somewhere, and if it is not completely secured, that breaks several data privacy laws and puts the public at risk.
If you are interested in even more technology-related articles and information from us here at Bit Rebels, then we have a lot to choose from.