
Apple is set to Roll out its Controversial 'Communication Safety in Messages' iPhone Feature in the UK

(Cover image: Apple's 'Communication Safety in Messages' feature set to roll out in the UK)

 

On August 5, 2021, Patently Apple posted a report titled "Apple oddly forewarns child pornographers that their photos on iDevices will be scanned for illegal content starting this fall." On August 9 we followed up with a report titled "Apple Categorically Denies that their Upcoming iOS Feature aimed at Exposing Sexual Predators is a Backdoor that reduces Privacy." Apple was under a lot of pressure to abandon this project.

 

Flash forward to today: The Guardian is reporting that a safety feature that uses AI technology to scan messages sent to and from children will soon hit British iPhones, Apple has announced.

 

The Guardian report states that "The feature, referred to as 'communication safety in Messages,' allows parents to turn on warnings for their children’s iPhones. When enabled, all photos sent or received by the child using the Messages app will be scanned for nudity.

 

If nudity is found in photos received by a child with the setting turned on, the photo will be blurred, and the child will be warned that it may contain sensitive content and nudged towards resources from child safety groups. If nudity is found in photos sent by a child, similar protections kick in, and the child is encouraged not to send the images, and given an option to 'Message a Grown-Up.'"

 

The report further noted that "All the scanning is carried out 'on-device,' meaning that the images are analyzed by the iPhone itself, and Apple never sees either the photos being analyzed or the results of the analysis."

 

Apple said in a statement that "Messages analyses image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages. The feature is designed so that no indication of the detection of nudity ever leaves the device. Apple does not get access to the messages, and no notifications are sent to the parent or anyone else."
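For readers curious how the flow described above might look in practice, here is a minimal Swift sketch of the decision logic: check a received photo locally, and if the parental setting is on and nudity is detected, blur the image and warn the child. Apple's actual on-device model and APIs are not public, so the `NudityClassifier` protocol and all other names below are illustrative assumptions, not Apple's implementation.

```swift
import Foundation
import CoreGraphics

// Hypothetical stand-in for Apple's private on-device classifier.
// This protocol and its method are assumptions for illustration only.
protocol NudityClassifier {
    // Returns true if the image likely contains nudity; runs entirely on device.
    func containsNudity(_ image: CGImage) -> Bool
}

enum IncomingPhotoDecision {
    case showNormally
    case blurAndWarn(message: String) // child sees a blurred thumbnail plus a warning
}

struct CommunicationSafetyPolicy {
    let classifier: NudityClassifier
    let featureEnabledByParent: Bool

    // Mirrors the behavior described in the article: when the parental setting is on,
    // received photos are checked locally; flagged photos are blurred and the child is
    // warned and pointed toward help. Nothing about the result leaves the device.
    func evaluateReceivedPhoto(_ image: CGImage) -> IncomingPhotoDecision {
        guard featureEnabledByParent else { return .showNormally }
        guard classifier.containsNudity(image) else { return .showNormally }
        return .blurAndWarn(
            message: "This photo may contain sensitive content. You can choose not to view it, "
                   + "or message a grown-up for help."
        )
    }
}
```

Note that in this sketch the classification result stays inside the returned decision value; consistent with Apple's statement, nothing is sent to Apple, the parent, or anyone else.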

 

Apple has also dropped several controversial options from the update before release. In its initial announcement of its plans, the company suggested that parents would be automatically alerted if young children, under 13, sent or received such images; in the final release, those alerts are nowhere to be found.

 

The company is also introducing a set of features intended to intervene when content related to child exploitation is searched for in Spotlight, Siri or Safari.

 

For more, read the full Guardian report.

 

10.0F - Apple News
