Conversation
バツ子(痛いの痛いの飛んでけ;; (shmibs@tomo.airen-no-jikken.icu)'s status on Monday, 09-Aug-2021 22:09:26 JST バツ子(痛いの痛いの飛んでけ;; @Moon @kev @sotolf @thewk (also seems pretty irrelevant, since once infrastructure exists it becomes used for other purposes by whatever government)
-
Infected Moomin (moon@shitposter.club)'s status on Monday, 09-Aug-2021 22:09:27 JST Infected Moomin @kev @sotolf @thewk Daring Fireball's distinction about image fingerprinting is pretty naive. In the end, the image analysis is trying to overcome cropping, framing, scaling, flipping, transformation, and color adjustment and still return a match; that is going to entail some imprecision.
-
Infected Moomin (moon@shitposter.club)'s status on Monday, 09-Aug-2021 22:09:28 JST Infected Moomin @kev @sotolf @thewk It's not really a hash, it's image similarity. Why would they need a neural network just to hash a file? Also, Apple does not really have checks, because they have access to neither the image from the phone nor the abuse image to compare.
-
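For readers unfamiliar with the distinction being argued here, the sketch below contrasts a cryptographic hash with a toy perceptual fingerprint (an 8x8 "average hash"). It is illustrative only: the image paths are placeholders, and Apple's NeuralHash is a neural-network-derived fingerprint, not this toy scheme. The point is that a cryptographic digest changes completely when a single byte changes, while a perceptual fingerprint is built to stay close under crops, scaling, and re-encoding, which is also what makes accidental near-matches possible.

    # Illustrative only: a toy 8x8 "average hash" versus a cryptographic hash.
    # Requires Pillow (pip install Pillow); the image paths are placeholders.
    import hashlib
    from PIL import Image

    def average_hash(path, size=8):
        """Toy perceptual fingerprint: shrink to 8x8 grayscale, threshold at the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        return [1 if p > mean else 0 for p in pixels]   # 64-bit fingerprint

    def hamming(a, b):
        """Number of differing bits between two fingerprints."""
        return sum(x != y for x, y in zip(a, b))

    def sha256_file(path):
        """Cryptographic digest: any change to the bytes gives an unrelated value."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    original, edited = "photo.jpg", "photo_cropped.jpg"   # placeholder files
    # A cropped or re-encoded copy typically stays within a few bits of the
    # original's average hash, but its SHA-256 digest is completely different.
    print(hamming(average_hash(original), average_hash(edited)))
    print(sha256_file(original) == sha256_file(edited))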
Kev Quirk :kubuntu: (kev@fosstodon.org)'s status on Monday, 09-Aug-2021 22:09:29 JST Kev Quirk :kubuntu: @sotolf hash collisions?? Even using MD5 they’re EXTREMELY rare (Apple aren’t using MD5). The chances of a collision are ridiculously minuscule. Plus, Apple have checks in place to combat that if it happens.
There’s a threshold that an account has to hit before it’s flagged. So if there’s a single collision (which is extremely unlikely) it’s nigh on impossible that there will be multiple collisions on the same account.
-
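To make the threshold argument above concrete, the sketch below treats false matches as independent events with an illustrative per-image probability; the numbers are made up, not Apple's published figures. Requiring many matches before an account is flagged drives the account-level false-positive probability far below the chance of a single stray match.

    # Illustrative sketch: probability that an account with n photos produces
    # at least t false matches, assuming each photo independently false-matches
    # with probability p. n, t, and p are assumed numbers, not Apple's figures.
    from math import comb

    def prob_at_least(n: int, t: int, p: float) -> float:
        """Binomial tail P(X >= t), computed via the complement so only t terms are summed."""
        return 1.0 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t))

    n, p = 100_000, 1e-6              # library size and per-image rate (assumed)
    print(prob_at_least(n, 1, p))     # a single false match: roughly 1 in 10
    print(prob_at_least(n, 30, p))    # 30 false matches: effectively zero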
Sotolf :arch: :vim: :terminal: (sotolf@fosstodon.org)'s status on Monday, 09-Aug-2021 22:09:36 JST Sotolf :arch: :vim: :terminal: > I think Apple are setting a dangerous precedent here, but there are advantages too. When my kids are 12, if they send pictures of their junk to their peers, I would DEFINITELY want to know about it.
This technology won't stop that though, as it uses hashes of known child abuse material; it will not catch new material, only what is already known.
The biggest problem here is really false positives, as there are hash collisions against the algorithms they use.
-
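A small sketch of what "hashes for known material" means in practice, assuming 64-bit perceptual fingerprints and a Hamming-distance tolerance (both invented for illustration). Matching is a nearness test against a fixed database, so genuinely new images cannot be detected, while the same tolerance that absorbs crops and re-encodes is what occasionally lets an unrelated image land within the threshold.

    # Sketch of matching against a database of known fingerprints, assuming
    # 64-bit perceptual hashes and a small Hamming-distance tolerance.
    # The hash values and threshold below are invented for illustration.
    KNOWN_HASHES = {0x3F6A94C2D10B77E5, 0xA0E15B3C22D9804F}   # fake "known material" database
    MAX_DISTANCE = 4   # bits of difference still counted as a match

    def hamming(a: int, b: int) -> int:
        """Number of differing bits between two 64-bit fingerprints."""
        return bin(a ^ b).count("1")

    def matches_known(fingerprint: int) -> bool:
        # Only images close to something already in the database can match;
        # a genuinely new image cannot be flagged, no matter what it depicts.
        # The same tolerance that absorbs crops and re-encodes is what lets
        # an unrelated image occasionally fall within MAX_DISTANCE.
        return any(hamming(fingerprint, known) <= MAX_DISTANCE for known in KNOWN_HASHES)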
Kev Quirk :kubuntu: (kev@fosstodon.org)'s status on Monday, 09-Aug-2021 22:09:37 JST Kev Quirk :kubuntu: @thewk @sotolf absolutely. I too am concerned, but as you said, if you take a step back and look at the facts, it’s nowhere near as bad as many people are making out.
I think Apple are setting a dangerous precedent here, but there are advantages too. When my kids are 12, if they send pictures of their junk to their peers, I would DEFINITELY want to know about it.
Also, Google have been doing something very similar for years.
-
TheWK :ubuntu: :nextcloud: (thewk@fosstodon.org)'s status on Monday, 09-Aug-2021 22:09:43 JST TheWK :ubuntu: :nextcloud: @kev @sotolf After looking into what they are doing exactly, I also took a deep breath and a step back. But it still concerns me; as someone else pointed out here, iPhones, just like any other smartphone, have a ton of security issues, and exploits could be used to put that kind of material onto someone's phone.
On the other hand, one could just as easily target one's mail or social media accounts...
-
Kev Quirk :kubuntu: (kev@fosstodon.org)'s status on Monday, 09-Aug-2021 22:09:45 JST Kev Quirk :kubuntu: @sotolf as I’ve said before, the background of the person writing it means nothing in this case. The technical information is accurate. Facts are facts.
-
Sotolf :arch: :vim: :terminal: (sotolf@fosstodon.org)'s status on Monday, 09-Aug-2021 22:09:46 JST Sotolf :arch: :vim: :terminal: @kev It would carry a lot more weight if it wasn't written by a known Apple apologist.
-
Kev Quirk :kubuntu: (kev@fosstodon.org)'s status on Monday, 09-Aug-2021 22:09:47 JST Kev Quirk :kubuntu: This is an exceptionally good write-up about what's actually happening with the whole #Apple saga.
Far too much hyperbole and not enough facts.
https://daringfireball.net/2021/08/apple_child_safety_initiatives_slippery_slope
-