
Apple scans iCloud photos for child pornography

Jane Horvath, Apple's senior director of global privacy, confirmed that the company uses certain technologies to detect child pornography in photos stored on iCloud.


In her talk on privacy at CES 2020, Horvath also mentioned a little-known practice that allows Apple to find certain types of images on iCloud. In practice, the company scans photos uploaded to iCloud to make sure they don't contain anything illegal.

More specifically, Horvath mentioned child abuse, stating that Apple uses certain technologies to help screen for images containing child sexual abuse.

Apple updated its privacy policy last year, but it's not clear when it began scanning images for such content. On the company's website, a dedicated section addresses the protection of minors:

We are committed to protecting children throughout our ecosystem and wherever our products are used. We continue to support innovation in this area. We have developed strong protections at all levels of our software platform and throughout our supply chain. As part of this effort, Apple is using image matching technology to help find and report the exploitation of minors. Like email spam filters, our systems use electronic signatures to detect suspected child abuse. Each match is validated with an individual review. Accounts with content related to the exploitation of children violate our terms of use, and any account we find with this content will be deactivated.

Apple is not the first company to scan images this way. Many companies use software called PhotoDNA, a solution specifically designed to help prevent child exploitation.
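Neither Apple nor Microsoft has published the exact matching algorithm, but systems like PhotoDNA are generally described as perceptual hashing: an image is reduced to a compact signature that stays nearly identical under minor edits (resizing, recompression), and signatures are compared against a database of known illegal material. The following is a minimal illustrative sketch of that general idea using a simple "average hash", not Apple's or Microsoft's actual implementation; the function names and the 4x4 sample data are hypothetical.

```python
# Illustrative sketch of perceptual-hash image matching (NOT the real
# PhotoDNA algorithm). An "average hash" turns a grayscale thumbnail
# into a bit string; near-duplicate images produce hashes that differ
# in only a few bits (a small Hamming distance).

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255), e.g. a small thumbnail."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the mean.
    return tuple(1 if p > mean else 0 for p in flat)

def hamming_distance(h1, h2):
    """Number of bit positions where the two hashes differ."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

def matches_known_hash(h, known_hashes, threshold=5):
    """Flag an image whose hash is within `threshold` bits of any entry
    in a database of signatures of known illegal material."""
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)

# Hypothetical example: a tiny 4x4 "image" and a slightly altered copy.
original = [[10, 200, 10, 200]] * 4
altered = [[12, 198, 11, 201]] * 4   # minor edits barely change the hash

h_orig = average_hash(original)
h_alt = average_hash(altered)
print(hamming_distance(h_orig, h_alt))      # small despite the edits
print(matches_known_hash(h_alt, {h_orig}))  # True: treated as a match
```

The key property, as the quoted policy's spam-filter analogy suggests, is that matching works against signatures of already-known images rather than by interpreting image content.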