Last Updated on 29/07/2019 by TDH Publishing (A)
Voice assistants are growing in popularity, but the technology has seen a parallel rise in concerns about accuracy and privacy. Apple’s Siri is the latest to enter this grey area of tech.
This week, a report said that contractors who review Siri recordings for accuracy and to help make improvements might be hearing personal conversations.
One of the contractors said that Siri did sometimes record audio after mistaken activations. The wake-up phrase is “Hey Siri”, but according to the anonymous contractor, the assistant could also be triggered by similar-sounding words or even the sound of a zipper. The contractor also said that whenever an Apple Watch is raised and speech is detected, Siri activates automatically.
“There have been endless instances of recordings featuring private conversations between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” said the source. “These recordings are accompanied by user data showing location, contact details, and app data.”
Apple has said that it takes steps to prevent users from being connected with the recordings sent to contractors: the audio is not linked to an Apple ID, and less than 1% of daily Siri activations are reviewed. It also imposes confidentiality requirements on those contract workers.
Apple, Google, and Amazon all have similar policies for the contract workers they hire to review audio snippets. But all three voice-assistant makers have also been the subject of comparable privacy breaches, whether through whistle-blowers going to the press or through errors that gave users access to the wrong audio files.
What will Apple and its peers do to better protect user privacy as they develop their voice systems? Should users be notified when their recordings are reviewed? What can be done to reduce or eliminate accidental activations? How should the companies handle the incidental information their contractors overhear? And who is accountable when a dangerous or illegal activity is recorded and discovered, all by accident?
Voice assistants seem to be yet another instance of technology being developed and adopted faster than its consequences have been thought through.