How Apple scans your phone (and how to evade it) - NeuralHash CSAM Detection Algorithm Explained
apple, icloud, privacy

Apple recently announced that it will scan all images uploaded to iCloud for CSAM (Child Sexual Abuse Material), and that this scan will happen locally on users' phones. We take a look at the technical report and explore how the system works in detail, how it is designed to preserve user privacy, and what weak points it still has.

OUTLINE:
0:00 Introduction
3:05 System Requirements
9:15 System Overview
14:00 NeuralHash
20:45 Private Set Intersection
31:15 Threshold Secret Sharing
35:25 Synthetic Match Vouchers
38:20 Problem 1: Who controls the database?
42:40 Problem 2: Adversarial Attacks
49:40 Comments & Conclusion

Paper:
ML News Episode about CSAM:

Abstract:
CSAM Detection enables Apple to accurately identify and report iCloud users who store known Child Sexual Abuse Material (CSAM) in their iCloud Photos accounts. Apple servers flag accounts exceeding a threshold number of images that match a known database of CSAM image hashes.
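To give a flavor of the NeuralHash segment (14:00): the sketch below is a minimal, illustrative Python example of hyperplane locality-sensitive hashing, the binarization step that turns an image descriptor into a short hash so that near-duplicate images tend to collide. It is not Apple's actual network; the random embedding stands in for the descriptor network's output, and EMBED_DIM and the hyperplane matrix are made up for the example (the 96-bit output length matches what has been reported for NeuralHash).

```python
# Toy illustration of a NeuralHash-style perceptual hash (NOT Apple's
# actual model): an image embedding is reduced to a binary hash via
# random-hyperplane LSH, so visually similar images tend to collide.
import numpy as np

rng = np.random.default_rng(0)
HASH_BITS = 96    # NeuralHash reportedly emits a 96-bit hash
EMBED_DIM = 128   # hypothetical embedding size, chosen for the example

# Fixed random hyperplanes stand in for the hashing step that follows
# the descriptor network in Apple's pipeline.
hyperplanes = rng.standard_normal((HASH_BITS, EMBED_DIM))

def neural_hash(embedding: np.ndarray) -> bytes:
    """Binarize an embedding: one bit per hyperplane side."""
    bits = (hyperplanes @ embedding) > 0
    return np.packbits(bits).tobytes()

def hamming(a: bytes, b: bytes) -> int:
    """Number of differing bits between two hashes."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

# A slightly perturbed embedding (think: the same image re-encoded)
# lands on the same side of almost every hyperplane.
base = rng.standard_normal(EMBED_DIM)
noisy = base + 0.001 * rng.standard_normal(EMBED_DIM)
print(hamming(neural_hash(base), neural_hash(noisy)))  # small, often 0
```

This collision-under-perturbation property is exactly what makes the system robust to resizing and re-encoding, and also what opens the door to the adversarial attacks discussed at 42:40.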
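The threshold secret sharing segment (31:15) boils down to one idea: each matching photo's safety voucher carries one share of a per-account key, and the server can only reconstruct that key (and decrypt the vouchers) once it holds at least a threshold number of shares. Below is a minimal Shamir secret sharing sketch of that mechanism; the field, the toy threshold of 3, and the share counts are all illustrative, not Apple's production parameters.

```python
# Minimal Shamir threshold secret sharing sketch (illustrative only):
# any THRESHOLD shares reconstruct the secret, fewer reveal nothing.
import random

PRIME = 2**127 - 1   # prime field for the arithmetic
THRESHOLD = 3        # toy value; Apple's real threshold is larger

def make_shares(secret: int, n: int, t: int = THRESHOLD):
    """Random degree-(t-1) polynomial with f(0) = secret; share i is (i, f(i))."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME
    return [(i, f(i)) for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers f(0), the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = random.randrange(PRIME)          # per-account decryption key
shares = make_shares(key, n=10)        # one share per matching voucher
assert reconstruct(shares[:THRESHOLD]) == key       # enough shares: key recovered
assert reconstruct(shares[:THRESHOLD - 1]) != key   # too few: garbage (w.h.p.)
```

Below the threshold the server's view is statistically independent of the key, which is why the synthetic match vouchers (35:25) can pad the match count without leaking anything about real matches.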