Apple delays CSAM feature roll-out to make improvements and listen to feedback

It seems that we will have to wait a bit longer to see Apple’s child safety features in action. The company has announced that it will delay the rollout of these new features after a wave of negative feedback.

Last month, Apple announced new child safety features coming to iOS 15, iPadOS 15, and macOS Monterey. These changes would allow Cupertino to scan users’ photos in their iCloud Photo Library and iMessage for content that may indicate potential child abuse. Apple explained that these scans would happen only on the device, but that didn’t stop Apple users and security researchers from being concerned.

Since then, Apple has been in the eye of the storm, receiving plenty of backlash from security researchers. It has also been confirmed that the company has been scanning iCloud Mail for CSAM and other data since 2019. Apple stated:

“Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. We validate each match with individual review. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.”

Now, Apple seems to believe that the best path forward is to put this new feature on hold, as the company has received feedback from customers, non-profit and advocacy groups, researchers, and others. The idea is to give Apple more time to improve how the feature will work and avoid further issues.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Now, we will just have to wait and see when these features will roll out, since Apple still plans to include them in future updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

Source: MacRumors




A former bilingual teacher who left the classroom to join the Pocketnow team as a news editor and content creator for the Spanish audience. An artist by nature who enjoys video games, guitars, action figures, cooking, painting, drawing, and good music.




