TechCrunch

Apple is facing a lawsuit for not implementing its proposed system to scan iCloud photos for child sexual abuse material (CSAM). The plaintiff, a 27-year-old woman suing under a pseudonym, alleges that Apple's inaction forces victims to relive their trauma; she continues to receive law enforcement notices about images of her abuse being shared online. Apple announced the CSAM detection system in 2021 but abandoned it after privacy advocates raised concerns about potential government surveillance. According to attorney James Marsh, the case could involve a group of 2,680 victims eligible for compensation. Apple declined to comment on the specifics of the case but said it is committed to innovating solutions that combat CSAM without compromising user privacy.


