Apple CSAM Detection

Discussions center on Apple's on-device CSAM scanning using perceptual hashes from databases like NCMEC, raising concerns about privacy, false positives, hash collisions, and potential abuse by governments or adversaries.
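The matching step under discussion is perceptual hashing: an image is reduced to a short fingerprint that stays stable under resizing or re-encoding, and that fingerprint is compared against a blocklist of known hashes. Below is a minimal sketch of the general idea using a classic average hash, not Apple's NeuralHash (which derives its hash from a neural-network embedding); all names and values are illustrative.

```python
# Toy perceptual-hash matching, for illustration only; this is a simple
# "average hash", not Apple's NeuralHash.

def average_hash(gray_8x8):
    """gray_8x8: an 8x8 grid (list of lists) of 0-255 grayscale values,
    assumed to be an already-downscaled thumbnail of the image."""
    pixels = [p for row in gray_8x8 for p in row]
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits  # 64-bit fingerprint

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches_blocklist(image_hash, blocklist, max_distance=0):
    """True if the fingerprint is an exact (or near) match to a known hash."""
    return any(hamming(image_hash, known) <= max_distance for known in blocklist)

# Example: a near-duplicate thumbnail with added noise still matches.
original = [[10 * (r + c) for c in range(8)] for r in range(8)]
noisy = [[min(255, v + 3) for v in row] for row in original]
blocklist = {average_hash(original)}
print(matches_blocklist(average_hash(noisy), blocklist, max_distance=4))  # True
```

Because similar images land within a small Hamming distance of each other, the same property that makes the scheme robust to re-encoding is what the collision and false-positive concerns in the comments below are about.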

📉 Falling 0.4x • Security
Comments: 3,059
Years Active: 18
Top Authors: 5
Topic ID: #1079

Activity Over Time

2009: 1 • 2010: 3 • 2011: 3 • 2012: 12 • 2013: 34 • 2014: 26 • 2015: 42 • 2016: 33 • 2017: 74
2018: 39 • 2019: 66 • 2020: 38 • 2021: 1,774 • 2022: 264 • 2023: 273 • 2024: 176 • 2025: 178 • 2026: 23

Keywords

US, OSS, apple.com, hackerfactor.com, NeuralHash, AI, HN, csam, hashes, images, hash, apple, image, database, photo, attack, match

Sample Comments

mannerheim • Aug 18, 2021 • View on HN

> Apple's method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child-safety organizations. Apple further transforms this database into an unreadable set of hashes, which is securely stored on users' devices. (https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf)
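The quoted summary's "unreadable set of hashes" is easiest to picture as a one-way transform of the database before it ships to devices. The sketch below is only a loose stand-in for that idea, digesting each perceptual hash with SHA-256 so the device can test membership without being able to read the original list; Apple's actual design goes further, using blinded hashes and a private set intersection protocol so the device cannot even learn locally whether anything matched. All constants here are placeholders.

```python
# Loose sketch of shipping only one-way digests of the perceptual hashes.
# NOT Apple's protocol: the real system blinds the hashes and uses private
# set intersection, so match status is only learned server-side.
import hashlib

def opaque(perceptual_hash: int) -> bytes:
    """One-way digest of a 64-bit perceptual hash (illustrative)."""
    return hashlib.sha256(perceptual_hash.to_bytes(8, "big")).digest()

# Built server-side; the device receives only the digests, never the raw list.
KNOWN_HASHES = [0x1234ABCD5678EF00, 0x0F0F0F0F0F0F0F0F]  # placeholder values
on_device_db = {opaque(h) for h in KNOWN_HASHES}

def device_side_check(photo_hash: int) -> bool:
    """Membership test against the opaque database."""
    return opaque(photo_hash) in on_device_db

print(device_side_check(0x1234ABCD5678EF00))  # True
print(device_side_check(0xDEADBEEFDEADBEEF))  # False
```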

olliej • Sep 1, 2023 • View on HN

No, the terrible misfeature that this group wants is "government provides a bunch of opaque hashes that are 'CSAM', all images are compared with those hashes, and if the hashes match then the user details are given to police". Note that by design the hashes cannot be audited (though in the legitimate case I don't imagine doing so would be pleasant), so there's nothing stopping a malicious party inserting hashes of anything they want - and then the news report will be "person x brought in for que…

oldgradstudent • Aug 25, 2021 • View on HN

If he wins, will the hash of this picture be added to Apple's CSAM detection system?

mrzimmerman • Aug 14, 2021 • View on HN

CSAM detection uses a hash database. The images are converted to a hash and then compared to the hashes of known pornography of children, not directly viewed. The weirdly less discussed aspect of this is that anyone who is storing their images of any kind on someone else's computer and network thinks that nothing could have been viewed before. If Apple or Google or Amazon want to scan the data you store with them, they could be doing it, so if that was a concern for a person from the get-go then they would…

cratermoon • Aug 7, 2021 • View on HN

If I'm reading this right, Apple is saying they are going to flag CSAM they find on their servers. This article talks about finding a match for photos by comparing a hash of a photo you're testing with a hash you have, from a photo you have. Does this mean Apple had/has CSAM available to generate the hashes?

intricatedetail • Aug 13, 2021 • View on HN

Apple has no way to know what image a hash has been derived from. So China can give a hash of Winnie the Pooh and claim it is CSAM. Apple won't know.

shuckles • Aug 18, 2021 • View on HN

It might be useful to read the threat model document. Associated data from client neural hash matches are compared with the known CSAM database again on the server using a private perceptual hash before being forwarded to human reviewers, so all such an attack would do is expose non-private image derivatives to Apple. It would likely not put an account at risk for referral to NCMEC. In this sense, privacy is indeed preserved versus other server scanning solutions where an adversarial perceptual…
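A simplified model of the pipeline this comment describes, ignoring the cryptography (safety vouchers, private set intersection, threshold secret sharing) entirely: a client-side match only surfaces an image derivative, the server re-checks that derivative with its own independent perceptual hash, and nothing reaches a human reviewer until enough matches accumulate. Function names and data shapes are illustrative; Apple's published threat model cited an initial threshold of 30 matches.

```python
# Simplified, non-cryptographic model of the two-stage check plus threshold.
# server_perceptual_hash and known_csam_digests are stand-ins, not real APIs.

MATCH_THRESHOLD = 30  # Apple's stated initial threshold; used illustratively

def referrals_for_review(image_derivatives, server_perceptual_hash,
                         known_csam_digests, threshold=MATCH_THRESHOLD):
    """image_derivatives: visual derivatives associated with client-side matches.
    Returns what a human reviewer would see, or nothing if the account stays
    below the threshold."""
    confirmed = [d for d in image_derivatives
                 if server_perceptual_hash(d) in known_csam_digests]
    # A collision that fails the independent server-side hash is filtered out,
    # and a small number of surviving matches never clears the threshold.
    return confirmed if len(confirmed) >= threshold else []

# Example with toy stand-ins for the hash function and database:
digests = {"abc", "def"}
fake_hash = lambda derivative: derivative[:3]  # pretend perceptual hash
print(referrals_for_review(["abc123"] * 5, fake_hash, digests))        # []
print(len(referrals_for_review(["abc123"] * 30, fake_hash, digests)))  # 30
```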

yusefnapora • Aug 19, 2021 • View on HN

Imagine this scenario.
- You receive some naughty (legal!) images of a naked young adult while flirting online and save them to your camera roll.
- These images have been made to collide [1] with "well known" CSAM images obtained from the dark underbelly of the internet, on the assumption that their hashes will be contained in the encrypted database.
- Apple's manual review kicks in because you have enough such images to trigger the threshold.
- The human reviewer sees a b…

EugeneOZ • Aug 13, 2021 • View on HN

I'm not even sure if it's a joke or you are serious. It is a check against existing hashes in a big database of confirmed CSAM. What are the chances that photos of your partner are in that database? If your partner is older than 12 - it's 0%. Who is taking more risk of being sued for the leakage of the photos, you or Apple? The last part isn't worth discussing because children in that DB are younger than 12.

smellsinore • Aug 14, 2021 • View on HN

I don't think they should have included an image that triggers false positives in CSAM. Edit: reference https://www.hackerfactor.com/blog/index.php?/archives/929-On...