Apple Says At Least 30 iCloud Photos Matching With Child Abuse Material Will Flag Accounts

Published on: August 13, 2021
from Gadgets 360 https://ift.tt/2VUFt71

Apple has further detailed that its child safety mechanism will require at least 30 photos matching known Child Sexual Abuse Material (CSAM) identified by organisations in at least two countries before an account is flagged for human review.
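
The decision rule the article describes can be modelled in a few lines. The sketch below is purely illustrative: the type CSAMMatch, the function shouldFlagForReview, and the per-photo country sets are hypothetical names invented here, and Apple's actual pipeline reportedly relies on on-device hash matching with cryptographic threshold schemes rather than plain counting. Only the two numbers come from the article: 30 matching photos, and source databases from at least two countries.

```swift
// Hypothetical record of one photo whose hash matched an entry in a
// CSAM hash database. Illustrative only; not Apple's implementation.
struct CSAMMatch {
    let photoID: String
    // Countries of the child-safety organisations whose databases
    // contain the matched hash.
    let sourceCountries: Set<String>
}

// Rule described in the article: an account is surfaced for human
// review only once at least `threshold` (30) photos match material
// identified by organisations in at least `minCountries` (2) countries.
func shouldFlagForReview(matches: [CSAMMatch],
                         threshold: Int = 30,
                         minCountries: Int = 2) -> Bool {
    // Count only matches backed by databases from enough distinct countries.
    let qualifying = matches.filter { $0.sourceCountries.count >= minCountries }
    return qualifying.count >= threshold
}

// Example: 29 qualifying matches stay below the threshold; 30 cross it.
let sample = (0..<30).map {
    CSAMMatch(photoID: "photo-\($0)", sourceCountries: ["US", "UK"])
}
print(shouldFlagForReview(matches: Array(sample.prefix(29)))) // false
print(shouldFlagForReview(matches: sample))                   // true
```

Requiring agreement between databases from at least two countries is reportedly intended as a safeguard, so that no single organisation or jurisdiction can unilaterally cause an account to be flagged.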