Apple resumes human reviews of Siri audio with iPhone update

08:27 AM October 30, 2019

A screen displays a notice when installing the iOS 13.2 update on an iPhone on Tuesday in New York. (AP)

Apple is resuming the use of humans to review Siri commands and dictation with the latest iPhone software update.

In August, Apple suspended the practice and apologized for the way it used people, rather than just machines, to review the audio.

While common in the tech industry, the practice undermined Apple’s attempts to position itself as a trusted steward of privacy. CEO Tim Cook repeatedly has declared the company’s belief that “privacy is a fundamental human right,” a phrase that cropped up again in Apple’s apology.

Now, Apple gives consumers notice when they install the update, iOS 13.2. Users can choose “Not Now” to decline audio storage and review, and those who opt in can turn it off later in Settings. Apple also specifies that Siri data is not associated with a user’s Apple ID.

Tech companies say the practice helps them improve their artificial intelligence services.

But the use of humans to listen to audio recordings is particularly troubling to privacy experts because it increases the chances that a rogue employee or contractor could leak details of what is being said, including parts of sensitive conversations.

Apple previously disclosed plans to resume human reviews this fall, but hadn’t specified when. Apple also said then that it would stop using contractors for the reviews.

Other tech companies have also been resuming the practice after giving users more notice. Google restarted the practice in September, after taking similar steps to make sure people know what they are agreeing to. Also in September, Amazon said users of its Alexa digital assistant could request that recordings of their voice commands be deleted automatically.

Initially published on October 30, 2019. Updated on June 13, 2023.
TOPICS: Apple, Privacy, Siri
