Apple is set to roll out a Child Abuse Detection System and people are worried


Apple is now set to roll out its new Child Sexual Abuse Detection System to combat Child Sexual Abuse Material (CSAM) on its platform. The new feature will scan photos before they are uploaded to iCloud, looking for potential signs of CSAM. Apple will also use on-device machine learning to warn children and parents about sensitive content received in Messages. But the system has already stirred controversy before it has even become available.

Many people worry that Apple could easily expand the feature beyond CSAM detection and turn it into a broader surveillance tool for governments.

Apple says that its new system will “help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).” As for concerns about the feature becoming a surveillance tool for governments, the company claims it would refuse any such demands.

Apple has confirmed that it will first start scanning photos uploaded to iCloud from Apple devices for signs of child sexual abuse material. The system uses a neural matching function called NeuralHash to check whether photos match the unique digital fingerprints of known CSAM. If the system detects enough matching images, the account is flagged for review by human operators. If a human operator confirms the presence of child abuse material, Apple alerts the National Center for Missing and Exploited Children (NCMEC). The company also confirmed that the feature does not work for users who have iCloud Photos disabled.
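To illustrate the general match-then-threshold idea, here is a minimal sketch in Python. It is not Apple's implementation: NeuralHash and the private matching protocol are not public, so this uses a simple "average hash" as a stand-in perceptual hash, and the hash database, function names, and threshold below are assumptions made purely for illustration.

```python
# Illustrative sketch only. Apple's NeuralHash and its matching protocol are
# proprietary; this stand-in uses a basic "average hash" (aHash), a made-up
# hash database, and a made-up threshold to show the match-then-threshold idea.

import io
from typing import Iterable, List, Set

from PIL import Image  # pip install Pillow


def average_hash(image_bytes: bytes, hash_size: int = 8) -> int:
    """Compute a simple perceptual hash: shrink, grayscale, compare each pixel to the mean.

    Visually similar images (re-encoded, lightly resized) tend to produce the same
    bits, unlike a cryptographic hash, which changes completely on any edit.
    """
    img = Image.open(io.BytesIO(image_bytes)).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for i, value in enumerate(pixels):
        if value > mean:
            bits |= 1 << i
    return bits


# Hypothetical database of hashes of known CSAM (in reality supplied by NCMEC
# and stored in a blinded form) and a hypothetical reporting threshold.
KNOWN_HASHES: Set[int] = set()
MATCH_THRESHOLD = 30


def count_matches(photos: Iterable[bytes]) -> int:
    """Count how many photos hash-match the known database."""
    return sum(1 for photo in photos if average_hash(photo) in KNOWN_HASHES)


def should_flag_for_human_review(photos: List[bytes]) -> bool:
    """Flag an account only once enough matches accumulate, mirroring the
    'enough images' step before human review described above."""
    return count_matches(photos) >= MATCH_THRESHOLD
```

The property this sketch borrows from Apple's description is that nothing happens on a single match: an account is only surfaced for human review after the number of matches crosses a threshold.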

It looks like a good step to combat child sexual abuse material online, but many people aren't happy. Some have expressed concerns that the system could flag innocent photos of their own children, such as bath photos. Apple says personal photos like these are not at risk, since the system only matches against known CSAM images, and there is an appeal process in place if you feel your account was flagged and disabled in error.

Apple’s system also won’t learn anything about images that do not match the known CSAM database. Apple is also enhancing Siri and Search to help people find resources for reporting CSAM.

For now, the system only checks photos, but the company also has plans to check videos before they are uploaded to iCloud.

It is worth noting that most cloud services, including Microsoft, Google, and Dropbox, already have their own systems in place to detect CSAM images. Now that Apple is joining them, people are worried because it means the company can access their personal photos. We have already seen Apple bending to local pressure: the company sells iPhones without FaceTime in Saudi Arabia.

Amid these concerns, an open letter is urging Apple to reverse course, and over 6,000 people have already signed it. The letter also quotes several security experts. Edward Snowden tweeted about the feature as well, calling it a mass surveillance system.

Sarah Jamie Lewis, Executive Director of the Open Privacy Research Society, is also against this new feature.
