• Rai@lemmy.dbzer0.com
          1 year ago

          Totally fair hahaha.

          I gotta hand it to Apple for being one of the very few mega corpos that even try to advocate for privacy. Their idea of “scan your photos” was fucked, but at least they backpedaled. I’d like to hope it was a checksum scan and not, like, an AI scan that had human reviewers—that would be incredibly creepy to me.

          Well, I don’t store any photos on iCloud anyway cuz I don’t need the fucked up shit I do with my partner on the interwebs, but still. Not a good look, glad they went back on it.

      • deadcade@lemmy.deadca.de
        1 year ago

        Please use up-to-date sources. (Disclaimer: Apple has revived and cancelled this “feature” enough times that I’m not 100% sure whether it’s currently in iOS, but I’m certain enough not to trust any Apple devices with any photos.)

        The hashing algorithm they used (NeuralHash) had manually craftable hash collisions. Apple did state they would switch to a different hashing algorithm, but it likely contains similar flaws. This would let anyone get your iPhone at least partially flagged and have your photos sent to Apple for “human verification”. Knowing how the algorithm works also lets people circumvent whatever detection methods Apple uses.
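        To see why perceptual-hash collisions are craftable at all, here’s a toy sketch using average hash (aHash), a much simpler perceptual hash than NeuralHash; the pixel values are made up for illustration, but the failure mode is the same: many visually different images share one hash, so collisions can be built on purpose.

```python
# Toy "perceptual hash" (average hash / aHash) over an 8x8 grayscale image,
# flattened to a list of 64 pixel values. This is NOT NeuralHash, just a
# minimal stand-in that shows why collisions can be crafted by hand.

def ahash(pixels):
    """Set bit i when pixel i is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

# Two clearly different images: one high-contrast, one nearly flat gray.
high_contrast = [10] * 32 + [200] * 32   # mean 105
almost_flat   = [90] * 32 + [110] * 32   # mean 100

# Same above/below-mean pattern in both, so the hashes collide even though
# the images look nothing alike.
print(ahash(high_contrast) == ahash(almost_flat))  # True
print(high_contrast == almost_flat)                # False
```

        Any transform that preserves which pixels sit above the mean preserves the hash, which is exactly the kind of structure attackers exploited against NeuralHash.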

        No iPhone is going to ship with a list of hashes of all illegal material, which means the hash of every image you view has to be sent to Apple. Even if you trust them not to run any other tracking/telemetry on your iPhone, this alone gives them the ability to track who viewed any given image, simply by having a copy of that image themselves. That is a very powerful surveillance tool, and it can be used to censor nearly anything.
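        The tracking capability described above can be sketched in a few lines. This is a hypothetical illustration, not Apple’s actual protocol: the device names and image bytes are invented, and SHA-256 stands in for a perceptual hash, since any deterministic hash shows the point.

```python
import hashlib

def image_hash(image_bytes):
    # Stand-in for a perceptual hash (a real deployment would use something
    # like NeuralHash); deterministic hashing is all the server needs.
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes hypothetically reported by each device during on-device scanning.
reports = {
    "device_A": {image_hash(b"cat.jpg bytes"), image_hash(b"flyer.png bytes")},
    "device_B": {image_hash(b"cat.jpg bytes")},
}

# The server never needs the devices' photos: to learn who has seen a
# specific image, it hashes its own copy and checks each device's report.
target = image_hash(b"flyer.png bytes")
viewers = sorted(dev for dev, hashes in reports.items() if target in hashes)
print(viewers)  # ['device_A']
```

        Swap the flyer for any image a government wants to find, and the same lookup works, which is why this is a censorship tool and not just a CSAM scanner.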