CSAM detection on iPhone: Apple considers local child porn scanner safe

Apple considers its controversial local child porn scanner, which will be part of the iPhone and iPad from iOS 15 and iPadOS 15, to be fundamentally secure. Over the weekend, the company published a Security Threat Model Review, which discusses possible attack scenarios on the system.

Apple defends its approach

In the 14-page document, the company argues, among other things, that the system, which is the subject of massive controversy among security experts, is designed in such a way that a user need not trust "Apple nor any other single entity" for it to work as described. Not even "any number of potentially colluding entities" from the same sovereign jurisdiction ("under the control of the same government") could subvert it, the group claims.

This is achieved through "various interlocking mechanisms". These include Apple's software update mechanism, which ships one and the same build worldwide to all devices and is therefore "intrinsically verifiable". However, iOS and iPadOS are largely proprietary, and the company currently has no plans for an external code review of the CSAM ("Child Sexual Abuse Material") detection. At most, it might be examined on the specially prepared, rooted iPhones that the company hands out to selected security researchers as part of its "Apple Security Research Device Program".

Users may check hash lists – at Apple

Furthermore, Apple emphasizes that it never uses lists of hash values of abuse material from only one organization, but only "overlapping hashes from two or more child protection organizations". This is also intended to prevent governments from foisting non-CSAM content on the system. In addition, auditing by external auditors is supposed to be possible. Finally, Apple plans to publish a knowledge base article on its website containing the root hash of the current encrypted CSAM hash database, which will be part of every future operating system update. This will let users check the root hash – to make sure the database really comes from Apple.
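The two safeguards described above – requiring each hash to come from at least two organizations, and publishing a root hash of the resulting database – can be sketched roughly as follows. This is a simplified illustration with made-up hash values; the actual hashing scheme and database format used by Apple are not public and will differ:

```python
import hashlib

# Hypothetical hash lists from two independent child-protection
# organizations (illustrative placeholder values).
list_a = {"aa11", "bb22", "cc33"}
list_b = {"bb22", "cc33", "dd44"}

# Only hashes supplied by at least two organizations enter the
# on-device database -- a single organization (or a government
# pressuring one) cannot inject entries on its own.
database = sorted(list_a & list_b)
print(database)  # ['bb22', 'cc33']

# A root hash over the entire sorted database lets anyone compare
# the database shipped on the device against the value Apple
# publishes in its knowledge base article.
root = hashlib.sha256("".join(database).encode()).hexdigest()
print(root)
```

If even a single entry were added or removed, the recomputed root hash would no longer match the published value, which is what makes the check meaningful for users.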

The problem with all these security measures, however, remains that Apple depends on market access granted by individual governments. This is also emphasized in an analysis by the digital rights organization EFF: the company has opened a door for surveillance and censorship, and once it exists, it will be used ("If You Build It, They Will Come"). Meanwhile, well-known security researcher Katie Moussouris argued on Twitter that Apple could not simply refuse to comply with legal requirements in a country, because then its access to that entire market would be blocked: "They are not going to operate illegally in a country and act as a corporation like a sovereign nation." So far, Apple firmly maintains that it will defy government orders to use the local CSAM scanner for censorship and surveillance.
