Google Assistant users’ private recordings have leaked onto the Internet. Google blames another company.
Private audio recordings of over 1,000 Google service users have surfaced on the Internet.
The recordings were made by the Google Assistant virtual assistant, which appears to capture users' voices even when the wake phrase "OK, Google" has not been spoken.
The leak has revealed new details about how virtual assistants work and where these recordings end up.
The Belgian publication VRT NWS obtained more than 1,000 recordings made by Google Assistant, and listening to them revealed some very interesting details.
First of all, not all of the roughly 1,000 recordings were made at the users' request: 153 were "accidental" recordings, capturing things users said in private without knowing they were being "spied on" by Google's services, since the "OK, Google" command had never been given.
The recordings include conversations between partners in the bedroom and conversations between parents and children, as well as arguments and important business calls.
The people recorded thus disclosed private or even secret information, believing no one was listening. More serious still, all of these recordings are sent to Google-contracted centers, where workers listen to them and transcribe them into text in order to better train the virtual assistant.
Even though the data is transmitted to these centers anonymously, users could still be identified by the workers who listen to their recordings day after day, both through personal details spoken in supposedly private moments and through the content of certain messages dictated to Google Assistant.
Google confirmed the authenticity of the recordings and blamed an employee of the Dutch contractor who leaked them onto the Internet:
"Our security teams have been alerted to this problem; they are investigating and will take action. We are also conducting an internal review of our safety measures in this area to prevent such behavior in the future."
As for unauthorized recordings made without the "OK, Google" command, the company says these cases are known internally as "false accepts" and occur when the device believes it has heard the command because of similar words or sounds in the environment.
In theory, such situations should not occur, as there are safeguards against them. In practice, they clearly do, as the recordings that have reached the Internet prove.
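To see how a "false accept" can happen, consider a toy model of wake-word detection. Google has not published how its detector actually works; the sketch below is purely illustrative, using simple string similarity in place of real acoustic matching, and the function names and threshold are hypothetical. The point it demonstrates is real: any detector that activates above a similarity threshold will occasionally be triggered by speech that merely resembles the wake phrase.

```python
# Toy illustration of a "false accept", NOT Google's actual algorithm.
# A real detector scores audio features; here we score text similarity
# with the standard-library SequenceMatcher as a stand-in.
from difflib import SequenceMatcher

WAKE_PHRASE = "ok google"
THRESHOLD = 0.7  # hypothetical confidence cutoff


def wake_score(heard: str) -> float:
    """Similarity between what was heard and the wake phrase (0.0 to 1.0)."""
    return SequenceMatcher(None, heard.lower(), WAKE_PHRASE).ratio()


def should_activate(heard: str) -> bool:
    """Activate the assistant when the score crosses the threshold."""
    return wake_score(heard) >= THRESHOLD


print(should_activate("ok google"))           # → True: genuine command
print(should_activate("ok goggle"))           # → True: similar sounds cross
                                              #   the threshold (false accept)
print(should_activate("turn on the lights"))  # → False: clearly different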
If you are wondering which devices can record you, the answer is any device with Google Assistant: smartphones, tablets, smart speakers, compatible light bulbs, or anything else with a microphone and Google Assistant support.
A similar situation has been reported for Amazon Alexa, and it is likely that the same applies to all devices equipped with virtual assistants.
