
Apple halts practice of contractors listening in to users on Siri

Apple has suspended its practice of having human contractors listen to users’ Siri recordings to “grade” them, following a Guardian report revealing the practice.

The company said it would not restart the programme until it had conducted a thorough review of the practice. It has also committed to adding the ability for users to opt out of the quality assurance scheme altogether in a future software update.

Apple said: “We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

The suspension was prompted by a report in the Guardian last week that revealed the company’s contractors “regularly” hear confidential and private information while carrying out the grading process, including in-progress drug deals, medical details and people having sex.

The bulk of that confidential information was recorded through accidental triggers of the Siri digital assistant, a whistleblower told the Guardian. The Apple Watch was particularly susceptible to such accidental triggers, they said. “The regularity of accidental triggers on the watch is incredibly high … The watch can record some snippets that will be 30 seconds – not that long, but you can gather a good idea of what’s going on.”

Sometimes, the Apple contractor said, “you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”

Although Apple told users that Siri data may be used “to help Siri … understand you better and recognise what you say”, the company did not explicitly disclose that this entailed human contractors listening to a random selection of Siri recordings, including those triggered accidentally.

“Too often we see that so-called ‘smart assistants’ are in fact eavesdropping,” said Silkie Carlo, the director of the UK campaign group Big Brother Watch. “We also see that they often collect and use people’s personal information in ways that people do not know about and cannot control.”

She added: “Apple’s record on privacy is really slipping. The current iOS does not allow users to opt out of face recognition on photos, and this revelation about Siri means our iPhones were listening to us without our knowledge.”

The company is not alone in taking flak for its undisclosed quality assurance programmes. Both Amazon and Google also use contractors to check the quality of their voice assistants, according to reports in Bloomberg and on the Belgian TV channel VRT, and contractors from both companies have expressed discomfort at the nature of overheard recordings.
