Apple Inc (AAPL.O) on Thursday said it will implement a system that checks photos on iPhones in the United States before they are uploaded to its iCloud storage service, to ensure the uploads do not match known images of child sexual abuse.
Read and sign the open letter protesting Apple's roll-out of new content-scanning technology, which threatens to undermine individual privacy on a global scale and to reverse the progress achieved with end-to-end encryption for all.
A backlash over Apple's move to scan U.S. customers' phones and computers for child sexual abuse images has grown to include employees speaking out internally, a notable turn for a company famed for its secretive culture, and has provoked intensified protests from leading technology policy groups.
Within a few weeks of announcing the technology, researchers said they were able to create “hash collisions” using NeuralHash, effectively tricking the system into thinking two entirely different images were the same.
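NeuralHash is a proprietary, neural-network-based perceptual hash, so its internals are not reproduced here. But the collision problem the researchers exposed is inherent to perceptual hashing in general: because the hash deliberately discards most image detail so that near-duplicates match, very different inputs can land on the same digest. The toy sketch below (an "average hash", a standard simple perceptual hash unrelated to Apple's design, with hypothetical example pixel values) shows two clearly different images colliding:

```python
# Toy perceptual hash: one bit per pixel, set when the pixel is
# brighter than the image's mean. Real perceptual hashes (and
# NeuralHash especially) are far more sophisticated; this only
# illustrates why distinct images can share a hash.

def average_hash(pixels):
    """Return a bit string: '1' where a pixel exceeds the mean brightness."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

# Two different flattened 4x4 grayscale images (hypothetical values)...
img_a = [10, 200, 30, 220, 15, 210, 25, 230, 5, 240, 35, 250, 20, 205, 40, 215]
img_b = [90, 160, 80, 170, 95, 155, 70, 180, 60, 150, 85, 165, 75, 175, 65, 145]

# ...that collide, because only the above/below-mean pattern survives
# hashing, not the actual pixel values.
print(average_hash(img_a) == average_hash(img_b))  # → True
```

An adversary who can search for such collisions could, in principle, craft an innocuous image whose hash matches a flagged one, which is why the researchers' demonstration drew so much attention.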
The optional feature would secure most data stored in iCloud, a service used to back up iPhones or to save specific device data such as Messages. The data would remain protected even if Apple were hacked, and it would not be accessible to law enforcement, even with a warrant.