Apple delays anti-child abuse features: will collect more input before release
Apple is pausing its CSAM (Child Sexual Abuse Material) detection features after the backlash they received. In a statement to 9to5Mac, the company says it needs more time to improve those features. Apple will gather additional input from the various groups and advocates that raised concerns before moving forward.
This statement doesn’t come as a shock, as the company was heavily scrutinized over its iCloud and iMessage scanning measures. Apple didn’t provide a new release date for the CSAM features, which were originally supposed to arrive before the end of the year as part of iOS 15. One of the planned features involved iCloud Family accounts: the company wanted to shield children under the age of 13 from sending or receiving sexually explicit images in iMessage.
The feature would have blurred a potentially explicit picture, labeled it as sensitive with an explanation of why, and then let the child decide whether to view it. If the child chose to see it anyway, their parents would receive a notification.
Separately, the Cupertino-based company announced that iCloud Photos would also scan for CSAM on-device. If Apple detected such material, it would file a report with the National Center for Missing & Exploited Children (NCMEC) that included the user’s account information.