Google faced a backlash earlier this year when it was revealed that human workers were listening to audio recordings made by Google Assistant users. The practice, intended to improve the service, reportedly captured users’ private conversations as well.
Now, the search colossus has published a blog post detailing privacy improvements for its virtual assistant. The company reiterated that it doesn’t retain your audio recordings by default, saying it asks users to opt in to the Voice and Audio Activity (VAA) initiative when first setting up Assistant. Fortunately, it’s also making things easier for existing users in this regard.
“If you’re an existing Assistant user, you’ll have the option to review your VAA setting and confirm your preference before any human review process resumes,” Google noted. “We won’t include your audio in the human review process unless you’ve re-confirmed your VAA setting as on.”
What else is the company doing?
Google says Assistant already automatically deletes recordings that result from unintentional activations. But it added that it’s bringing “additional measures” to better identify accidental activations and exclude them from human review. Google also says it’s working on letting users adjust Assistant’s sensitivity to its wake words.
“We’re also updating our policy to vastly reduce the amount of audio data we store,” the company wrote on its blog. “For those of you who have opted in to VAA, we will soon automatically delete the vast majority of audio data associated with your account that’s older than a few months. This new policy will be coming to VAA later this year.”
Finally, Google says it’s adding better security protections to the human review process, including more privacy filters.
What do you think of human workers listening to Assistant recordings? Give us your thoughts below! You can also review your Assistant data and controls from your Google Account’s activity settings.