A throwback to remind ourselves that Apple is terrible for privacy

  • lemmyvore@feddit.nl · 1 year ago

    Apple applies E2E encryption for almost all iCloud data with Advanced Data Protection

    They only started doing that in December 2022; it has not rolled out to everyone and everything yet, and, as you said, it won’t cover everything even then: mail, contacts, and calendar will not be included. (And they considered backdooring it for a while before they relented.)

    Even the E2E aspect is somewhat misleading. The encryption ultimately relies on the account password, which can be brute-forced because most people don’t use particularly complex passwords for their iCloud account. Hardware security keys are something Apple has only very recently made it possible to use.
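    To make the brute-force point concrete, here is a minimal sketch (in Python) of a dictionary attack against a password-derived key. This is not Apple’s actual key-derivation scheme; the function names, salt, and iteration count are illustrative assumptions only. It just shows why E2EE that is ultimately rooted in a guessable account password offers limited protection against an attacker who can verify guesses offline.

    ```python
    # Hypothetical sketch, NOT Apple's scheme: a key derived from a weak
    # account password can be recovered by an offline dictionary attack.
    import hashlib


    def derive_key(password: str, salt: bytes) -> bytes:
        # Password-based key derivation (PBKDF2-HMAC-SHA256, illustrative
        # iteration count).
        return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)


    def brute_force(candidates: list[str], salt: bytes, target_key: bytes) -> str | None:
        # An attacker who can verify a derived key simply walks a
        # dictionary of likely passwords.
        for guess in candidates:
            if derive_key(guess, salt) == target_key:
                return guess
        return None


    if __name__ == "__main__":
        salt = b"example-salt"
        real_key = derive_key("hunter2", salt)  # weak user password
        print(brute_force(["password", "123456", "hunter2"], salt, real_key))
    ```

    A strong random passphrase or a hardware security key defeats this kind of guessing, which is why the recent hardware-key support matters.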

    https://www.theverge.com/2022/12/7/23498580/apple-end-to-end-encryption-icloud-backups-advanced-data-protection

    https://www.schneier.com/blog/archives/2022/12/apple-is-finally-encrypting-icloud-backups.html

    Bottom line, it would be more correct to say that Apple has recently made privacy improvements. But for the longest time they were nowhere near the privacy champion they styled themselves as.

    • octalfudge@lemmy.world · 1 year ago

      Apple’s stated reason for not covering mail, contacts, and calendar is: “Because of the need to interoperate with the global email, contacts, and calendar systems, iCloud Mail, Contacts, and Calendar aren’t end-to-end encrypted.” I think that critical bit of context is worth mentioning: https://support.apple.com/en-sg/guide/security/sec973254c5f/web. Apple does have to balance usability and security, though the result might not be as secure or private as you or I would like.

      I think it’s a little misleading to say they considered backdooring it. They intended to scan images for CSAM before they were uploaded to iCloud Photo Library. Much of the speculation was that they wanted to end-to-end encrypt photos but were worried about the reaction from the FBI and other bodies, given that the FBI had pressured them on this before, and so settled on this compromise: if they did enable E2EE, they would no longer be able to access photos after upload, hence the plan to scan them on-device before uploading.

      They attempted to do this with a fairly complex (and honestly still relatively privacy-preserving) scheme based on comparing perceptual hashes, but perhaps they realised (from the feedback that accompanied the backlash) that it could easily be abused by authoritarian governments, so they abandoned the idea.
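      For context, perceptual-hash matching works roughly like the toy sketch below: similar images produce hashes that differ in only a few bits, so matching is a Hamming-distance threshold rather than exact equality. This is a simplified average-hash stand-in, not Apple’s NeuralHash (which uses a neural network) and not their private set intersection protocol; all names and thresholds here are illustrative assumptions.

      ```python
      # Toy perceptual-hash matching: near-duplicate images map to hashes
      # that differ in only a few bits. Illustrative only, not NeuralHash.

      def hamming_distance(a: int, b: int) -> int:
          # Number of differing bits between two hashes.
          return bin(a ^ b).count("1")


      def average_hash(pixels: list[int]) -> int:
          # Toy 64-bit average hash over an 8x8 grayscale thumbnail:
          # each bit records whether that pixel is above the mean brightness.
          mean = sum(pixels) / len(pixels)
          bits = 0
          for p in pixels:
              bits = (bits << 1) | (1 if p > mean else 0)
          return bits


      def matches(hash_a: int, hash_b: int, threshold: int = 6) -> bool:
          # "Near-duplicate" if the hashes differ in at most `threshold` bits.
          return hamming_distance(hash_a, hash_b) <= threshold


      if __name__ == "__main__":
          img = [10, 200, 30, 220] * 16       # pretend 8x8 thumbnail
          tweaked = [11, 199, 31, 221] * 16   # same image, slightly re-encoded
          print(matches(average_hash(img), average_hash(tweaked)))  # True
      ```

      The abuse concern follows directly from this design: whoever controls the hash database controls what gets flagged, and the client cannot tell CSAM hashes from, say, hashes of political imagery.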

      I would assume that a company like Apple is under significant behind-the-scenes pressure to add backdoors, and that they cater to an audience that is unforgiving of any slight reduction in performance or ease of use and wants security features that are almost fully transparent to them. Given these constraints, I’m not sure they can improve much faster than they have demonstrated. Smaller, open-source projects probably don’t have these constraints.

    • Thann@lemmy.ml · 1 year ago

      Thank you!
      Also, FISA courts exist, and we have no reason to believe that Apple doesn’t comply with their subpoenas by backdooring the supposed E2EE.