Big Tech companies still taking "voluntary action" to detect child sexual abuse material – despite no EU legal basis for proactive scanning ...
In August 2021, Apple announced a plan to scan photos that users stored in iCloud for child sexual abuse material (CSAM). The tool was meant to be privacy-preserving and allow the company to flag ...
Veritone, Inc. (NASDAQ: VERI), a leader in building enterprise AI and data solutions, today announced the integration of ...
Generative AI is amplifying online child abuse risks, prompting calls for stronger laws, reporting systems, and ...
BRUSSELS — Technology companies in Europe face a cliff edge on Saturday, as a law allowing them to scan online messages for ...
“Any numbers that we see, it’s the tip of the iceberg,” said Melissa Stroebel, vice president of research and strategic ...
Experts warn lapse could sharply reduce reports of abuse, echoing a 58% drop during a similar legal gap in 2021 ...
Lora Kolodny writes at CNBC: West Virginia’s attorney general has filed a consumer protection lawsuit against Apple, alleging that it has failed to prevent child sexual abuse materials from being ...
Major year-over-year increase in CSAM detection and prevention highlights expanded safety innovation in the wake of explicit GenAI content

WASHINGTON, Dec. 18, 2025 /PRNewswire/ -- DNSFilter, a global ...
Over 500 cryptography scientists and researchers have signed a joint letter against the EU's controversial child sexual abuse material (CSAM) scanning proposal. Experts warn that the Danish version of the text ...