Apple has been sued by the state of West Virginia over what the state says is a failure to prevent child sexual abuse material (CSAM).
"We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids," an ...
The lawsuit accuses Apple of prioritizing privacy branding and its own business interests over child safety.
Filed by West Virginia's Attorney General, the suit alleges that Apple failed to prevent the storage and distribution of CSAM on iPhones and in iCloud.
After a massive outcry from privacy advocates and security researchers, Apple dropped its plan to scan iCloud photos against databases of known CSAM. Instead, it now relies on its on-device Communication Safety features, which detect nudity in images sent and received in Messages without scanning users' iCloud photo libraries.
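For context, the abandoned proposal rested on hash matching: each photo is reduced to a fingerprint and compared against a list of fingerprints of known CSAM. The sketch below shows the general shape of that technique only; it substitutes a SHA-256 cryptographic hash and a plain in-memory set, whereas Apple's actual design used a perceptual hash (NeuralHash) and a privacy-preserving matching protocol, and every name and type here is hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative sketch of hash-list matching. SHA-256 and the plain Set are
// stand-ins; Apple's shelved design used a perceptual hash (NeuralHash) and
// private set intersection so that non-matches are never revealed.
func fingerprint(of imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

func matchesKnownList(_ imageData: Data, knownFingerprints: Set<String>) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}

// Usage: only an exact byte-for-byte copy matches a cryptographic hash.
// A re-encoded or resized copy would not, which is why real systems use
// perceptual rather than cryptographic hashes.
let known: Set<String> = [fingerprint(of: Data("example-bytes".utf8))]
print(matchesKnownList(Data("example-bytes".utf8), knownFingerprints: known)) // true
print(matchesKnownList(Data("other-bytes".utf8), knownFingerprints: known))   // false
```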