🚨 Apple sued over abandoning CSAM detection for iCloud

Apple is facing a lawsuit for not implementing a system to scan iCloud photos for child sexual abuse material (CSAM). The lawsuit claims that Apple's failure to act forces victims to relive their trauma, despite the company having previously announced plans to strengthen child protection measures.

In 2021, Apple proposed using digital signatures from the National Center for Missing and Exploited Children to detect known CSAM in iCloud, but abandoned the initiative amid concerns from privacy advocates about potential government surveillance.

The lawsuit is brought by a 27-year-old woman who, under a pseudonym, recounts being molested as an infant and says she continues to receive law enforcement notifications about charges related to images of her. Attorney James Marsh indicated that as many as 2,680 victims could be eligible for compensation. Apple stated that it is actively working on solutions to combat these crimes while maintaining user security and privacy.

This lawsuit follows another case from August, in which a 9-year-old girl and her guardian accused Apple of failing to address CSAM on iCloud.

💬 Source