Apple's New Statement About Scanning iPhone Photos to Combat Child Abuse!

After announcing its new CSAM system last week, which can detect child abuse photos uploaded to iCloud, Apple stated that it will not share the users it detects with the government in any way. Apple, which has made great efforts to keep its users safe, introduced the CSAM feature to help prevent child abuse.


This feature will scan photos uploaded to the iCloud system starting in the final months of this year; when it detects a child abuse photo, it will blur the image and send a notification to the user.

Although Apple has emphasized that this system will only act on photos that indicate child abuse, people are worried that Apple may be intruding further into the private lives of its users. In addition, while Apple was expected to report anyone it identifies as a child abuser to the government, a new statement from Apple makes clear that no such arrangement will be made.


Apple Says User Data Will Not Be Shared With the Government, Whatever the Reason!

In dozens of countries, including the United States, possessing photos or videos of child abuse is punishable by law. After this step taken by Apple to prevent child abuse, governments were expected to ask Apple to share information about users found to be involved in it. However, Apple states that it will not disclose its users for any reason under the CSAM system, which for now will operate only within the borders of the USA.

Any photo of child abuse uploaded to iCloud is detected by Apple's CSAM system; the image is first blurred and a notification is sent to the user. Apple opens an investigation into users who receive multiple notifications and reports only those it deems necessary to the National Center for Missing and Exploited Children.
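To make the flow in the previous paragraph concrete, here is a minimal Python sketch of a threshold-style process of that kind. Every name in it (handle_upload, REPORT_THRESHOLD, the placeholder hash set, the helper functions) is a hypothetical assumption for illustration only, not Apple's actual implementation, which works differently in detail.

# Hypothetical sketch of the described flow; all names and values are assumed.
KNOWN_CSAM_HASHES = {"a3f9...", "b812..."}   # placeholder database of known-image hashes
REPORT_THRESHOLD = 3                          # assumed number of matches before review

user_flag_counts: dict[str, int] = {}

def handle_upload(user_id: str, photo_hash: str) -> None:
    """Apply the article's described steps to an uploaded photo's hash."""
    if photo_hash not in KNOWN_CSAM_HASHES:
        return                                # non-matching photos are left untouched
    blur_photo(user_id, photo_hash)           # 1. blur the matched image
    notify_user(user_id)                      # 2. send the user a notification
    user_flag_counts[user_id] = user_flag_counts.get(user_id, 0) + 1
    if user_flag_counts[user_id] >= REPORT_THRESHOLD:
        report_to_ncmec(user_id)              # 3. report only after repeated matches

def blur_photo(user_id: str, photo_hash: str) -> None:
    print(f"blurring photo {photo_hash} for {user_id}")

def notify_user(user_id: str) -> None:
    print(f"notifying {user_id}")

def report_to_ncmec(user_id: str) -> None:
    print(f"reporting {user_id} to the National Center for Missing and Exploited Children")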


The fact that the process rests entirely in Apple's hands naturally worries many people, and governments in particular, because governments want to detect people prone to child abuse within their borders and take the necessary action. Saying it cannot help governments in this context, Apple declared that it will not expand the CSAM system in any way in response to government demands. Apple stated that the system was developed solely to detect photos of child abuse and that user data will not be shared with the government under any circumstances, regardless of the reason.
