Apple and Amazon have announced that they are curtailing the use of human contractors to review conversations on their digital voice assistants, Siri and Alexa. These contractors listen to users' recordings in order to grade the assistants' accuracy and to improve their speech-recognition capabilities.
An Apple whistleblower reported that reviewers often overhear doctors' appointments, drug deals and even couples having sex. Furthermore, the recordings are accompanied by user data that reveal sensitive information such as locations, contact details and app data.
Apple has thus announced that it will suspend human analysis of these voice recordings globally and review how it grades Siri's accuracy. Amazon's updated privacy policy now allows users to opt out of having humans review their recordings. Last year, Google quietly changed its default settings, and the Google Assistant no longer automatically records what it hears once it is activated.
Read the full article on The Straits Times: Hey, Siri reviewer, no more listening to sex talk
Analysis:
How do we balance consumers' right to privacy against the drive to improve the accuracy of technology that has pervaded our daily lives? Additionally, by using digital voice assistants, do users implicitly consent to their recordings being used to improve their user experience?
Improving the accuracy of digital voice assistants would enhance users' experience of speech-recognition technology. To this end, Apple and Amazon have employees review voice recordings so that the assistants can better match spoken queries to the intended results.
Yet consumers are disturbed chiefly because they did not realise that the privacy policies they agreed to meant human reviewers would listen to what they say to their digital voice assistants. On one level, how many of us actually read the privacy policies and agreements when we sign up for online services? Even those who do would not necessarily infer that data used to improve the recognition feature means human employees going through what we search vocally.
This technological quagmire could have been averted if users had been given clear warning about how their voice recordings would be used. That said, one can imagine that most users would opt out if they could, leaving insufficient data to improve the voice-recognition technology. Alternatively, users' privacy could be protected if the recordings were sufficiently anonymised and not linked to their user data.
Questions for further personal evaluation:
- Should users of online services start paying more attention to the terms of use that come with the services?
- In the pursuit of technological progress, how far can technology providers go in relying on users’ data?
Useful vocabulary:
- ‘curtailing’: to make less by or as if by cutting off or away some part
- ‘quagmire’: a difficult, precarious or entrapping position; predicament