Google is returning to having humans analyze and rate anonymized audio snippets from its users. However, it’s also taken the major step of automatically opting every single user out of the setting that allows Google to store their audio. That’s why you might be getting an email today: Google would like you to opt back in to the program, and it’s trying to provide clearer information detailing what it’s all about.
Those are very big moves that affect a huge number of people — though Google says the precise number of users getting the email is confidential. It should land in the inbox of anybody who has interacted with a product that uses Google’s voice AI, including apps like Google Maps and services like Google Assistant.
Here’s a PDF of the email that is being sent to virtually everybody who’s spoken into a microphone with a Google logo next to it, which reads in part:
To keep you in control of your audio recording setting, we’ve turned it off for you until you are able to review the updated information. Visit your Google Account to review and enable the audio recordings setting if you choose.
It will link to this URL (which I’m listing out because you should never just click a URL to an account setting without double-checking it): https://myactivity.google.com/consent/assistant/vaa
It is difficult to remember now, but last summer, one of the biggest stories in tech was how every major company was using humans to review the quality of their AI transcriptions. When some of those audio recordings began to leak, the revelations rocked Google, Amazon, Apple, Microsoft, and Facebook.
That meant tech’s 2019 summer of scandal was characterized by technical explanations of how machine learning works, apologies, outrage, walkbacks, and, ultimately, every company making it easier for users to know what data was being stored and how to delete it. I’ll put a bunch of the stories in a sidebar just to give you a sense of how intense it was.
All of those companies got significantly better at disclosing how audio data was used and made it easier to delete that data or opt out of providing it entirely. Most of them also went back to using human reviewers to improve their services, either with clearer disclosures or by asking users to consent again.
But Google didn’t bring back human reviewers after it paused the practice globally last September. When it announced that pause, it promised: “We won’t include your audio in the human review process unless you’ve re-confirmed your [Voice & Audio Activity] VAA setting as on.” Today’s email, then, is that promise made real — albeit much later than everybody else.