
Apple advises staff to be ready to field questions from consumers about its upcoming features for limiting the spread of child pornography

(Image: Craig Federighi interviewed by the Wall Street Journal's Joanna Stern on Apple's child safety safeguards)


Earlier today, Patently Apple posted a report titled "Apple's new Rebellious Culture is now Against their Company's decision to join the fight against Child Abuse Material (CSAM)." In an update to that report, we added the Wall Street Journal's video interview between Joanna Stern and Apple's senior vice president of Software Engineering, Craig Federighi. It's an excellent interview, and we've included it in this report as well.



This afternoon, we're learning that "Apple Inc. has warned retail and online sales staff to be ready to field questions from consumers about the company’s upcoming features for limiting the spread of child pornography," according to Mark Gurman of Bloomberg.


The report added that "In a memo to employees this week, the company asked staff to review a frequently asked questions document about the new safeguards, which are meant to detect sexually explicit images of children. The tech giant also said it will address privacy concerns by having an independent auditor review the system." This week’s memo read:


"You may be getting questions about privacy from customers who saw our recent announcement about Expanded Protections for Children. There is a full FAQ here to address these questions and provide more clarity and transparency in the process. We’ve also added the main questions about CSAM detection below. Please take time to review the below and the full FAQ so you can help our customers.


The iCloud feature has been the most controversial among privacy advocates, some consumers, and even Apple employees. It assigns what is known as a hash key to each of the user’s images and compares the keys with ones assigned to images within a database of existing explicit material. If a user is found to have such images in their library, Apple will be alerted, conduct a human review to verify the contents, and then report the user to law enforcement."
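The hash-matching idea described above can be sketched in a few lines of code. Note the heavy simplification: Apple's actual system uses a perceptual hash called NeuralHash combined with cryptographic private-set-intersection and a match threshold, none of which is shown here. The plain SHA-256 comparison, the `KNOWN_HASHES` set, and the `scan_library` function below are all illustrative assumptions, not Apple's implementation.

```python
import hashlib

# Hypothetical database of hashes of known flagged images (illustrative bytes only).
# A real system would hold perceptual hashes supplied by child-safety organizations.
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def scan_library(images: list[bytes]) -> list[int]:
    """Return the indices of images whose hash matches the known-image database."""
    matches = []
    for i, data in enumerate(images):
        # Hash each image and check it against the database of known hashes.
        if hashlib.sha256(data).hexdigest() in KNOWN_HASHES:
            matches.append(i)
    return matches

library = [b"holiday-photo", b"known-flagged-image-bytes", b"cat-picture"]
print(scan_library(library))  # [1]
```

A cryptographic hash like SHA-256 only matches byte-identical files; a perceptual hash is used in practice precisely so that resized or re-encoded copies of the same image still match.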


For more on this, read the Mark Gurman report on Bloomberg that's found on Yahoo! Finance.


10.0F - Apple News

