Apple Inc. AAPL has for the first time explained why it abandoned plans to scan iPhones for child sexual abuse material (CSAM), citing the importance of privacy and the "unintended consequences" of the detection system it had planned to roll out in 2021.
What Happened: Apple has given a detailed explanation of its decision to abandon iPhone CSAM detection, after the child safety group Heat Initiative said it would campaign against the move.
"Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit. It would also inject the potential for a slippery slope of unintended consequences," Erik Neuenschwander, Apple’s director of user privacy and child safety told Wired.
It is rare for Apple to issue a detailed response to third-party questions, which only underscores how important the issue is to the company.
When Apple announced its iPhone CSAM detection plans, cybersecurity and privacy experts criticized them, saying they would harm user privacy. The Electronic Frontier Foundation, a digital rights group, said the system would open a "backdoor" into users' private lives.
After facing immense backlash, Apple quietly pulled references to CSAM detection from its website and called the plan off entirely in 2022.
‘Not Practically Possible’ Without Impacting User Privacy
According to Neuenschwander, Apple's initial approach to CSAM detection would have compromised user privacy and introduced new attack vectors for malicious parties, which is why the company decided to abandon the plan.
Instead, Apple has adopted a different solution: on-device detection built into the apps themselves. Apps such as Messages, FaceTime, and AirDrop, for example, now include on-device nudity detection.
Apple has also launched an application programming interface (API) so that third-party apps can implement the same on-device detection without adversely impacting user privacy.
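For developers, the sketch below shows roughly how a third-party app might use Apple's SensitiveContentAnalysis framework to check an image on-device before displaying it. The class and method names (SCSensitivityAnalyzer, analyzeImage) come from Apple's published framework, but the helper function, file URL, and blur decision are illustrative assumptions rather than anything described in the article.

```swift
import Foundation
import SensitiveContentAnalysis

// Illustrative sketch: decide whether to blur an image before showing it.
// SCSensitivityAnalyzer and analyzeImage(at:) are part of Apple's
// SensitiveContentAnalysis framework (iOS 17+); the file URL and the
// blurring decision are hypothetical app logic.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Analysis only runs when the user has enabled Sensitive Content
    // Warnings (or Communication Safety); otherwise the policy is .disabled.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // The check runs entirely on-device; the image never leaves the phone.
        let result = try await analyzer.analyzeImage(at: url)
        return result.isSensitive
    } catch {
        // On failure, this sketch simply shows the image unblurred;
        // a real app would choose its own fallback behavior.
        return false
    }
}
```

The point that matters for this story is the design choice: the analysis happens locally, so neither Apple nor the app developer receives the image or the result.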
Apple says its approach now is to direct victims to law enforcement and other local resources, instead of acting as an intermediary.
Photo by WML Image on Shutterstock
© 2024 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.