Apple Responds to Child Abuse Feature Backlash: "We Were Misunderstood!"
Apple, which has recently been criticized for the feature it introduced to reduce child abuse, has made a clear statement on the subject. The debate over the newly announced feature continues.
The feature, which uses artificial intelligence to scan the iCloud photos of devices running the iOS, iPadOS, and macOS operating systems and detect whether they match known child abuse material, has recently drawn criticism.
While many people, fearing that repressive governments around the world could exploit the feature, reacted strongly against the company, Apple's software chief Craig Federighi said in a statement today that the feature was completely misunderstood.
Apple's Child Abuse Feature Misunderstood!
In its previous announcement, the company said that to prevent child abuse, photos uploaded to iCloud from devices in its ecosystem would be scanned with the help of artificial intelligence, and that objectionable content would automatically be reported to the National Center for Missing and Exploited Children (NCMEC).
But according to Apple software chief Craig Federighi, the AI will not look at the iCloud photos themselves. Instead, it will compare each photo's code against a database of known child abuse material. This means the system never has access to the photos' content and works only with these codes.
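The matching logic Federighi describes can be illustrated with a minimal sketch. Note the assumptions: Apple's actual system derives a perceptual code called NeuralHash and matches it on-device using private set intersection; the SHA-256 hashing, the `KNOWN_HASHES` set, and the function names below are purely illustrative stand-ins for that idea, not Apple's implementation.

```python
import hashlib

# Illustrative stand-in for a database of codes derived from known
# child abuse material. Apple's real database holds NeuralHash values
# supplied by NCMEC, not SHA-256 digests.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def photo_code(photo_bytes: bytes) -> str:
    """Derive a fixed-length code from the photo's raw bytes.
    (Stand-in for a perceptual hash such as NeuralHash.)"""
    return hashlib.sha256(photo_bytes).hexdigest()

def matches_database(photo_bytes: bytes) -> bool:
    """Compare only the derived code against the database; the
    photo content itself is never inspected or transmitted."""
    return photo_code(photo_bytes) in KNOWN_HASHES
```

The key point of the sketch is that the comparison operates on short fixed-length codes, so the matching step reveals nothing about what an unmatched photo depicts.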
Craig Federighi stated that the company felt very positive and confident about the new feature it introduced to prevent child abuse, but that the feature's working logic had been misunderstood. Federighi also explained that users could be at ease, that there would be no problems in terms of security and privacy, and that the alleged censorship scenarios would not occur.