UK government praises Apple’s CSAM tech and raises concerns over end-to-end encryption


A few weeks ago, Apple announced a new technology that would scan iCloud Photos for child abuse material in order to fight its spread. The announcement was met with controversy, as some cryptographers expressed concerns that such a technology could breach users’ privacy, and Apple postponed its plans in response to the backlash. Now, AppleInsider reports that the UK government has stressed the need for child protection online and praised the technology.

UK Home Secretary thinks Apple’s CSAM scanning tech is a novel way to solve the issue of child abuse

Priti Patel, the UK’s Home Secretary, recently emphasized that an organized effort to combat child sexual abuse online is needed and should be encouraged and celebrated. Patel is determined to curb the dissemination of child sexual abuse material (CSAM) and thereby improve public safety online, and has urged international partners and allies to support the UK’s approach. In a piece published by The Telegraph, Patel wrote that holding technology companies accountable and asking social media platforms to put public safety before profits are of primary importance in dealing with the issue.

The Home Secretary went on to praise Apple for taking a first step with the announcement of its CSAM scanning technology, the same technology that last month prompted many cryptographers, and even the German parliament, to express concerns about potential abuse of the system and the undermining of users’ privacy. Patel noted that, according to Apple, the new technology has a false positive rate of one in a trillion, arguing that this means the privacy of legitimate users is protected while criminals are caught.

However, Patel’s strategy also runs counter to parts of Apple’s user privacy policy, as the Home Secretary has also targeted end-to-end encryption, which is currently used by Apple’s iMessage and many other chat services. Her strategy includes giving law enforcement access to communications.

She argues that such technology can obscure vital information needed in police investigations and complicates the efforts authorities must make to capture criminals. Patel also criticized Facebook’s plans to bring end-to-end encryption to Messenger, saying that end-to-end encryption should not open the door to child sexual abuse. She stressed that these actions are not about governments wanting to spy on innocent citizens, but about keeping children, the most vulnerable members of society, safe and preventing heinous crimes.

Apple, on the other hand, has pursued its privacy strategy (while complying with applicable laws and legal requests for user data) and has been developing software and services that make it impossible for third parties to gain access to users’ information.

The CSAM initiative, according to some cryptographers, ran counter to Apple’s general privacy policy.

Apple’s CSAM scanning tech is now postponed

Announced a few weeks ago, the new technology was going to use hashing systems to scan iCloud Photos for CSAM. Despite Apple stating that the technology would not scan any locally stored photos on iPhones, and that the company would not give governments the right to use the tech for anything other than scanning for child sexual abuse material, a controversy was sparked. Cryptographers believed the new technology went against Apple’s privacy principles and could be abused by authoritarian governments as a means of spying on citizens, and many specialists signed a letter detailing their concerns about the introduction of the CSAM scanning system. But it was not only cryptographers who had concerns: Germany also recently sent a letter to Tim Cook expressing its reservations. The deployment of the CSAM scanning tech has now been postponed due to the backlash it received.
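For illustration only, the hash-matching idea behind such a system can be sketched as follows. This is a minimal, hypothetical example: the hash function, the hash database, and the match threshold are all placeholder assumptions, and Apple's actual system uses a perceptual hash (NeuralHash) together with cryptographic techniques that are not reproduced here.

```python
import hashlib

# Placeholder database of hashes of known abuse material.
# In a real deployment this would come from child-safety organizations;
# the value below is a made-up stand-in for illustration.
KNOWN_HASHES = {hashlib.sha256(b"known-bad-example").hexdigest()}


def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual image hash. SHA-256 is used here only
    # to keep the sketch self-contained; unlike a perceptual hash, it
    # changes completely if even one byte of the image changes.
    return hashlib.sha256(image_bytes).hexdigest()


def account_flagged(images: list[bytes], threshold: int = 2) -> bool:
    # Count uploads whose hash matches the database, and flag the
    # account only past a threshold of matches. Thresholding is the
    # mechanism Apple described to keep the false positive rate low.
    matches = sum(1 for img in images if image_hash(img) in KNOWN_HASHES)
    return matches >= threshold
```

The key point the sketch conveys is that only matches against a fixed database are counted, and a single match does not flag an account; both properties underpin Apple's "one in a trillion" false positive claim.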

