Amazon reports hundreds of thousands of child sexual abuse images in AI training data
Summary
Amazon says it found hundreds of thousands of suspected child sexual abuse images in AI training data during 2025 and reported them to the National Center for Missing and Exploited Children, while providing limited information about where the material originated.
Content
Amazon reported that it found hundreds of thousands of pieces of suspected child sexual abuse material in data collected to train its AI models during 2025. The company says it removed the material before using the data and reported the findings to the National Center for Missing and Exploited Children (NCMEC). Child safety officials have expressed concern because Amazon provided limited information about where the material originated. The discovery came amid an intensified industry push to gather large datasets for AI training.
Key facts:
- Amazon detected the suspected material throughout 2025 and reported hundreds of thousands of items to NCMEC.
- The company says the content was removed and that, as of January, it is not aware of any instances where its models generated such material.
- NCMEC's CyberTipline director said Amazon provided little to no detail about where the material came from, which the clearinghouse said leaves many of the reports unactionable for law enforcement.
- Amazon accounted for the majority of more than one million AI-related reports to NCMEC in 2025, a sharp rise from about 67,000 AI-related reports in 2024 and 4,700 in 2023.
Summary:
Child safety officials and experts have said the volume of reports and the lack of origin details limit the clearinghouse's ability to trace sources and pass actionable information to law enforcement. Amazon says it used an over-inclusive scanning threshold, removed the content, and is not aware of any instance of its models generating such material. The origin of the material remains undetermined at this time.
