A section in the policy on how the company uses personal data now reads (emphasis ours):
Our processing of personal data for these purposes includes both automated and manual (human) methods of processing. Our automated methods often are related to and supported by our manual methods. For example, our automated methods include artificial intelligence (AI), which we think of as a set of technologies that enable computers to perceive, learn, reason, and assist in decision-making to solve problems in ways that are similar to what people do. To build, train, and improve the accuracy of our automated methods of processing (including AI), we manually review some of the predictions and inferences produced by the automated methods against the underlying data from which the predictions and inferences were made. For example, we manually review short snippets of a small sampling of voice data we have taken steps to de-identify to improve our speech services, such as recognition and translation.
Multiple tech giants’ use of human workers to review users’ audio across a number of products involving AI has grabbed headlines in recent weeks after journalists exposed a practice that had not been clearly conveyed to users in terms and conditions — despite European privacy law requiring clarity about how people’s data is used.
Such workers are typically employed to improve the performance of AI systems by verifying translations and speech in different accents. But, again, this human review component within AI systems has generally been buried rather than transparently disclosed.
Earlier this month a German privacy watchdog told Google it intended to use EU privacy law to order it to halt human reviews of audio captured by its Google Assistant AI in Europe — after journalists obtained leaked audio snippets and were able to re-identify some of the people in the recordings.
On learning of the regulator’s planned intervention, Google suspended the reviews.
Apple also announced it was suspending human reviews of Siri snippets globally, again after a newspaper reported that its contractors could access audio and routinely heard sensitive stuff.
Facebook also said it was pausing human reviews of a speech-to-text AI feature offered in its Messenger app — again after concerns had been raised by journalists.
So far Apple, Google and Facebook have suspended or partially suspended human reviews in response to media disclosures and/or regulatory attention. Meanwhile, the lead privacy regulator for all three, Ireland’s DPC, has started asking questions.
Microsoft told Motherboard it is not suspending human reviews at this stage.
Users of Microsoft’s voice assistant can delete recordings — but such deletions require action from the user and would need to be repeated on a rolling basis for as long as the product remains in use. So it’s not the same as a full and blanket opt-out.
We’ve asked Microsoft whether it intends to offer Skype or Cortana users an opt out of their recordings being reviewed by humans.
The company told Motherboard it will “continue to examine further steps we might be able to take”.
Update: Microsoft has now sent us this statement:
Microsoft collects voice data to provide and improve voice-enabled services like search, voice commands, dictation or translation services, and we get customer permission before collecting and using voice data. We sometimes engage vendors to assist in improving our voice services. We take steps to de-identify the content provided to vendors, require non-disclosure agreements with all vendors and their employees to protect our customer’s privacy, and require that handling of this data be held to the highest privacy standards set out in European law. At the same time, we’re always looking to improve transparency and help customers make more informed choices. We realized, based on questions raised recently, that we could do a better job specifying that humans sometimes review this content. We’ve updated our privacy statement and product FAQs to add greater clarity and will continue to examine further opportunities to improve.