A throwback to remind ourselves that Apple is terrible for privacy

  • thann@lemmy.worldOP · 1 year ago

    What is misleading exactly? The part where every app you open gets sent to Apple, and to third parties, along with your IP?
    Because I’m pretty sure that’s all 100% true, and I think it’s been true for over 5 years…

    You’re just suggesting that because they do one thing well they do everything well, which is a fallacy.
    Also, any proprietary program that does “E2EE” is misleading you by omitting the part where they could totally steal anyone’s keys at any time with the push of a button, if they haven’t already. It is completely laughable to suggest any proprietary E2EE program is secure!

    So who is spreading the misinfo again?
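
    For context on the E2EE claim being argued here: in an end-to-end encrypted design, keys are generated and held on each device and only ciphertext ever leaves it, so the real dispute is whether a closed-source client can be trusted to actually behave that way. A minimal sketch of the idea, using the PyNaCl library rather than any particular vendor’s code:

    ```python
    # Minimal end-to-end encryption sketch (illustrative only, not any vendor's code).
    from nacl.public import PrivateKey, Box

    # Each party generates a keypair locally; private keys never leave the device.
    alice_private = PrivateKey.generate()
    bob_private = PrivateKey.generate()

    # Only the *public* keys are exchanged, e.g. via the vendor's servers.
    alice_box = Box(alice_private, bob_private.public_key)
    bob_box = Box(bob_private, alice_private.public_key)

    # The server relays only this ciphertext, never the plaintext or private keys.
    ciphertext = alice_box.encrypt(b"hello, this stays between us")
    print(bob_box.decrypt(ciphertext))  # b'hello, this stays between us'
    ```

    Whether a shipped binary actually does only this is exactly what the two sides below disagree about.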

    • xedrak@kbin.social · 1 year ago

      I’m not going to touch your other points, but you clearly have no idea how encryption works if you claim that any proprietary program using end-to-end encryption is insecure.

      • thann@lemmy.worldOP · 1 year ago

        If you trust everything a salesperson says, I have a bridge to sell you.

        There is no reason to believe any proprietary program does what it says, and even if you decompile it and convince yourself it’s not sending your keys home, they could update it at any moment.

        IDK where you get all of this trust from.

        • steakmeout@lemmy.world · 1 year ago

          So, in your view, because anything could change, everything will? How do you cross a road, or drive, or eat food, or do anything at all?

          You must be super paranoid and fearful.

          • thann@lemmy.worldOP · 1 year ago

            No, it’s just an additional attack vector; having the code to inspect makes validating updates much easier and more secure.

            I’m evaluating the security of the software I’m using. What are you doing, casually excusing a massive security flaw? You must not look either way before crossing the street.
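
            The kind of update validation being talked about here can be as simple as comparing the digest of a downloaded release against a value published, or rebuilt from the source, out of band. A rough sketch, where the file name and expected digest are placeholders:

            ```python
            # Sketch: verify a downloaded update against a published SHA-256 digest.
            # "update-1.2.3.tar.gz" and EXPECTED_SHA256 are placeholders, not real values.
            import hashlib

            EXPECTED_SHA256 = "0" * 64  # substitute the digest the project actually publishes

            def sha256_of(path: str) -> str:
                digest = hashlib.sha256()
                with open(path, "rb") as f:
                    for chunk in iter(lambda: f.read(1 << 20), b""):
                        digest.update(chunk)
                return digest.hexdigest()

            actual = sha256_of("update-1.2.3.tar.gz")
            if actual != EXPECTED_SHA256:
                raise SystemExit(f"update rejected: digest mismatch ({actual})")
            print("update digest matches the published value")
            ```

            With open source and reproducible builds the expected digest can be derived independently from the code; with a closed binary you can only compare against whatever the vendor publishes.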

            • xedrak@kbin.social · 1 year ago

              Oh really? You read the entire codebase of a project before downloading it, and every time you update it, you go over every single change like you’re the Greek god of code review? Because if you’re not, then by your own standards you’re opening yourself up to “additional attack vectors.”

            • steakmeout@lemmy.world · 1 year ago

              You’re talking at cross purposes. By your reasoning, Lemmy or any client you use could be an attack vector. Are you diving deep on the servers, their clusters, the network, their content relays, the source code to all of the software from servers to client? See, I doubt you do any of that.

              I think all you do is play angels and demons and decide that what you don’t know isn’t important, and that what you think you know is.

              You’re the attack vector.

              • thann@lemmy.worldOP · 1 year ago

                Yeah, I’ve considered the security model of Lemmy. Haven’t you?

                EDIT: Is your argument that nobody should care about security and just be happy with whatever Apple sells us?

        • xedrak@kbin.social · 1 year ago

          What you’re describing is possible in certain circumstances, but it would expose the companies to an insane amount of liability. Also, open source software can contain vulnerabilities that could be exploited to do the exact same thing. Open source software is not inherently more secure. Remember that time researchers tried to introduce vulnerabilities into the Linux kernel as a research project?

    • octalfudge@lemmy.world · 1 year ago

      I’m sorry, but did you read the article I linked to, or the TL;DR I lifted from the article?

      They do not send the app you open to Apple, and there is no evidence they send it to third parties, as the app information is not sent at all!

      Nevertheless, they do send information about the developer certificate for notarization and Gatekeeper checks.

      https://support.apple.com/en-us/HT202491#view:~:text=Privacy protections

      Quote:

      We have never combined data from these checks with information about Apple users or their devices. We do not use data from these checks to learn what individual users are launching or running on their devices.

      To further protect privacy, we have stopped logging IP addresses associated with Developer ID certificate checks, and we will ensure that any collected IP addresses are removed from logs.

      In addition, over the next year we will introduce several changes to our security checks:

      • A new encrypted protocol for Developer ID certificate revocation checks
      • Strong protections against server failure
      • A new preference for users to opt out of these security protections
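
      For what it’s worth, a Developer ID certificate revocation check is an OCSP-style query, and an OCSP request identifies a certificate (issuer hashes plus the serial number), not the application binary that was launched. A rough sketch of what such a request contains, using Python’s cryptography library with hypothetical certificate files (this is not Apple’s client code):

      ```python
      # Sketch: what a certificate revocation (OCSP) request contains.
      # The PEM file paths are hypothetical placeholders.
      from cryptography import x509
      from cryptography.hazmat.primitives import hashes, serialization
      from cryptography.x509 import ocsp

      with open("developer_id_cert.pem", "rb") as f:
          cert = x509.load_pem_x509_certificate(f.read())
      with open("issuer_cert.pem", "rb") as f:
          issuer = x509.load_pem_x509_certificate(f.read())

      # The request carries hashes of the issuer and the certificate's serial number,
      # so it identifies the signing certificate rather than the app itself.
      request = ocsp.OCSPRequestBuilder().add_certificate(cert, issuer, hashes.SHA1()).build()
      der = request.public_bytes(serialization.Encoding.DER)
      print(f"{len(der)} bytes; certificate serial: {request.serial_number}")
      ```

      The encrypted protocol Apple describes above is about protecting that query in transit, not about changing what it identifies.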

      • thann@lemmy.worldOP · 1 year ago

        The fact that it existed for years is the problem. The fact that execs signed off on this at all means Apple is terrible for privacy.

        I read the article, and the only pedantic detail that was wrong in the initial report was that Gatekeeper didn’t send the “application hash”, it sent the application’s certificate ID, which is a worthless distinction and changes nothing. You’re acting like that somehow exonerates Apple, and then just blindly believing what their PR person says. You’d have to be a complete idiot, or working for them, to believe that crap.

        • Shikadi@wirebase.org · 1 year ago

          So they did one thing wrong and it means they’re terrible for privacy? Welp, guess I can’t have a phone because the alternative (Google) has a business model that depends on being terrible for privacy, and my work apps disallow custom ROMs.

          • thann@lemmy.worldOP · 1 year ago

            Oh, I guess none of us can have security because this guy’s work won’t let us.

            No, they did a bunch of things wrong. They all do, so instead of burying my head in the sand, I’m going to call it out and work to build a better future.

            • Shikadi@wirebase.org · 1 year ago

              Not everyone even knows how to use custom ROMs; tech workers may have a huge presence online, but we’re a tiny minority IRL.

              Anyway, good, go build it. Saying one small mistake makes a company terrible for privacy isn’t doing a whole lot for your credibility though, so I recommend you spend more time building than talking about it.

              • thann@lemmy.worldOP · 1 year ago

                “one small mistake”

                OK, this is not “one small mistake”, this is a systemic failure:

                They designed a security feature without considering security.

                They kept this feature without encryption for years.

                It is either a bafflingly huge mistake, or they intentionally made spyware.

                I’ll remind you of Hanlon’s razor and let you make your own decision:

                “Don’t attribute to malice that which is sufficiently explained by stupidity.”

      • thann@lemmy.worldOP · 1 year ago

        You’re being misleading by saying why!
        Unless you were in the room, your speculation is as good as mine, and I’m not saying why, I’m just stating facts!