https://tinyurl.com/rossmatrix
Let's get Right to Repair passed! https://gofund.me/1cba2545
👉 https://sneak.berlin/20201112/your-computer-isnt-yours/
A throwback to remind ourselves that apple is terrible for privacy
Unfortunately, this is highly misleading.
Thank you for sharing this, and I appreciate good, high quality information about privacy but please don’t spread misleading information about one of the few companies that provides easily accessible private tools for the not-so-tech-savvy, as well as the busy.
Apple applies E2E encryption for almost all iCloud data with Advanced Data Protection, applies something similar to Tor for web browsing, kills tracking pixels in your mail, uses differential privacy to avoid identifying you, and so much more.
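Differential privacy can be illustrated with the classic randomized-response technique. This is only a toy sketch, not Apple's actual implementation (Apple uses a more elaborate local-DP scheme): each individual report is deniable, yet the aggregate rate is still recoverable.

```python
import random

def randomized_response(truth: bool, p_truth: float = 0.75) -> bool:
    """Report the true bit with probability p_truth, otherwise a coin flip.
    Any single answer is plausibly deniable."""
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports: list, p_truth: float = 0.75) -> float:
    """Invert the noise: observed_rate = p*true_rate + (1-p)*0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

random.seed(0)
# 10,000 hypothetical users, 30% of whom have the sensitive attribute.
population = [i < 3000 for i in range(10000)]
reports = [randomized_response(x) for x in population]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

No single report reveals anything definite about its sender, but the collector still learns the population-level statistic.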
Please see: https://blog.jacopo.io/en/post/apple-ocsp/
TL;DR
No, macOS does not send Apple a hash of your apps each time you run them.
You should be aware that macOS might transmit some opaque information about the developer certificate of the apps you run. This information is sent out in clear text on your network.
You probably shouldn’t block ocsp.apple.com with Little Snitch or in your hosts file.
The video is basically some dude reading a blog post (boy, I hate those; they provide no value). The blog post he reads is this: https://sneak.berlin/20201112/your-computer-isnt-yours/
The author comments on the blog post you linked, and it partially makes sense: if you fetch the developer’s certificate, Apple knows when you started an application from that developer (and which public IP address you have).
I can’t estimate how many developers have published only one application (in which case the certificate alone identifies the app); I’m not an Apple user. But you don’t need to send a hash calculated client-side to get this info.
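To see why the hash-vs-certificate distinction can be thin in practice, here's a toy sketch with made-up developer/app data: whenever a Developer ID certificate maps to a single app, observing the certificate is as good as observing the app itself.

```python
# Hypothetical mapping of developer certificates to shipped apps.
# An observer who sees only the certificate in a cleartext request can
# still infer the exact app whenever the developer ships a single app.
apps_by_developer = {
    "cert:mozilla":    ["Firefox", "Thunderbird"],
    "cert:torproject": ["Tor Browser"],  # one app -> cert uniquely identifies it
    "cert:videolan":   ["VLC"],
}

def infer_app(cert_id: str) -> str:
    candidates = apps_by_developer.get(cert_id, [])
    if len(candidates) == 1:
        # Unambiguous: launch time + public IP + app name all leak together.
        return candidates[0]
    # Ambiguous, but still narrows the possibilities considerably.
    return "one of " + ", ".join(candidates)

print(infer_app("cert:torproject"))  # -> Tor Browser
print(infer_app("cert:mozilla"))
```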
You’re absolutely right that it’s still an issue to transmit information about the developer certificate. Apple published a response to this, which admittedly is not ideal:
https://support.apple.com/en-us/HT202491#view:~:text=Privacy protections
We have never combined data from these checks with information about Apple users or their devices. We do not use data from these checks to learn what individual users are launching or running on their devices.
These security checks have never included the user’s Apple ID or the identity of their device. To further protect privacy, we have stopped logging IP addresses associated with Developer ID certificate checks, and we will ensure that any collected IP addresses are removed from logs.
In addition, over the next year we will introduce several changes to our security checks:
A new encrypted protocol for Developer ID certificate revocation checks
Strong protections against server failure
A new preference for users to opt out of these security protections
I mean, that sounds like a pretty good response to me.
They only started doing that in December, it has not rolled out to everyone and everything yet, and like you said it won’t cover everything even then — mail, contacts and calendar will not be included. (And they considered backdooring it for a while before they relented.)
Even the E2E aspect is misleading. The encryption ultimately relies on a password, which can be brute-forced because most people don’t use overly complex passwords for their iCloud account. Hardware keys are something Apple has only very recently made possible to use.
https://www.theverge.com/2022/12/7/23498580/apple-end-to-end-encryption-icloud-backups-advanced-data-protection
https://www.schneier.com/blog/archives/2022/12/apple-is-finally-encrypting-icloud-backups.html
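As a rough sketch of why password-derived keys are only as strong as the password, here is a generic PBKDF2 example cracking a weak 4-character password offline. The iteration count is kept deliberately low for the demo; this is not Apple's actual key-derivation scheme, which also involves server-side rate limiting and hardware-bound escrow.

```python
import hashlib
import itertools
import string

def derive_key(password: str, salt: bytes) -> bytes:
    # Generic PBKDF2 sketch. Real deployments use far higher iteration
    # counts (hundreds of thousands or more) plus online rate limiting.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 1_000)

salt = b"fixed-demo-salt"
target = derive_key("aa11", salt)  # a weak 4-character password

# Offline guessing: a 4-char lowercase+digit password falls quickly.
alphabet = string.ascii_lowercase + string.digits
cracked = None
for guess in ("".join(t) for t in itertools.product(alphabet, repeat=4)):
    if derive_key(guess, salt) == target:
        cracked = guess
        break
print("cracked:", cracked)
```

With only 36⁴ ≈ 1.7 million candidates, even the full search space is trivial for an attacker who has the ciphertext and salt; only a long, high-entropy password (or a hardware key) makes this infeasible.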
Bottom line, it would be more correct to say that Apple has recently made privacy improvements. But for the longest time they were nowhere near the privacy champion they styled themselves as.
Apple’s stated reason for not covering mail, contacts and calendar is “Because of the need to interoperate with the global email, contacts, and calendar systems, iCloud Mail, Contacts, and Calendar aren’t end-to-end encrypted”. I think it’s worth mentioning that critical bit of context. https://support.apple.com/en-sg/guide/security/sec973254c5f/web. Apple does have to balance usability and security, though this might not be as secure / private as you or I would like.
I think it’s a little misleading to say they considered backdooring it. They intended to scan images for CSAM before uploading them to iCloud Photo Library. A lot of the speculation was that they wanted to E2EE photos but were worried about the reaction from the FBI and other bodies, given the FBI had pressured them on this before, and so settled on this compromise. If they had managed to do this, they wouldn’t be able to access the photos after they had been uploaded; hence, they had to scan them prior to upload.
They attempted to do this with a very complex (and honestly still relatively privacy-preserving) way of comparing perceptual hashes, but perhaps they realised (from the feedback accompanying the backlash) this could easily be abused by authoritarian governments, so they abandoned this idea.
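For context, the matching idea behind a perceptual hash can be sketched with the classic "average hash". Apple's NeuralHash used learned features rather than raw pixels, so this is only a toy illustration of the principle: compare compact hashes instead of images, and tolerate small perceptual changes.

```python
def average_hash(pixels):
    """Toy average hash over a grayscale grid: one bit per pixel,
    set when the pixel is brighter than the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

img = [[10, 200, 10, 200],
       [200, 10, 200, 10],
       [10, 200, 10, 200],
       [200, 10, 200, 10]]
# Slightly brightened copy: perceptually the same picture.
near = [[p + 5 for p in row] for row in img]

print(hamming(average_hash(img), average_hash(near)))  # -> 0
```

Because the brightened copy keeps the same above/below-mean pattern, the hashes match exactly; a genuinely different image would land far away in Hamming distance.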
I would assume that a company like Apple is under significant pressure to add back doors, and they cater to an audience that is unforgiving of any slight reduction in performance or ease-of-use, and wants security features that are almost fully transparent to them. Given these constraints, I’m not sure they can improve much faster than what they’ve demonstrated. Smaller, open-source projects probably don’t have these constraints.
Thank you!
Also, FISA courts exist, and we have no reason to believe that Apple doesn’t comply with their subpoenas by backdooring the supposed E2EE.
Can you tell me more about this? I haven’t heard of it and am interested to know what it is.
Hi! It’s called iCloud Private Relay and it’s detailed here: https://threadreaderapp.com/thread/1402274867366477831.html
Well put. I’m a privacy-conscious Linux user, but I’m constantly frustrated by the lack of technical understanding in the privacy community, especially towards Apple solutions. They’re not perfect but they are very good. They have made major investments in improving the privacy protections for their users in a way no other major company has that I know of.
Are you a very technical person who wants to improve their privacy and have fun figuring out technology? Great, me too, run Linux and GrapheneOS.
But, if you’re not highly technical or you don’t enjoy it, you run a real risk of misconfiguring something and being far less private than you think. The truth is you might be better off with Apple’s “Advanced Data Protection” (E2E encryption), Private Relay (VPN + Tor hybrid on Cloudflare CDNs to avoid VPN blocks), “Hide My Email” email masking, disabled telemetry, fine-grained app permissions, etc.
At the end of the day, all this stuff is good for is pushing back a bit against corporate advertising profiles. Focusing on it too much isn’t healthy for anyone.
maybe you could explain how apple was being super privacy conscious when they decided that gatekeeper shouldn’t have TLS or any other form of encryption?
what is misleading exactly? the part where every app you open gets sent to Apple and third parties, along with your IP?
because I’m pretty sure that’s all 100% true, and I think it’s been true for over 5 years…
you’re just suggesting that because they do one thing well they do everything well, which is a fallacy.
Also, any proprietary program that does “E2EE” is misleading you by omitting the part where they could totally steal anyone’s keys at any time with the push of a button, if they haven’t already. It is completely laughable to suggest any proprietary E2EE program is secure!
so who is spreading the misinfo again?
I’m not going to touch your other points, but you clearly have no idea how encryption works if you claim that any proprietary program using end-to-end encryption is insecure.
if you trust everything a sales person says, I have a bridge to sell you.
there is no reason to believe any proprietary program does what it says, and even if you decompile it and convince yourself it’s not sending your keys home, they could update it at any moment.
IDK where you get all of this trust from
Take your meds lol
you might need to lay off the stupid pills bruh
So, in your view, because anything could change, everything will? How do you cross a road or drive or eat food or do anything at all?
You must be super paranoid and fearful.
no, it’s just an additional attack vector. having the code to inspect makes validating updates much easier and more secure.
I’m evaluating the security of the software I’m using? what are you doing, casually excusing a massive security flaw? you must not look either way before crossing the street
Oh really? You read the entire codebase of a project before downloading it, and every time you update it, you go over every single change like you’re the Greek God of code review? Because if you’re not, by your own standards, you’re opening yourself up to “additional attack vectors”
You’re talking at cross-purposes. By your reasoning Lemmy or any client you use could be an attack vector - are you diving deep on the servers, their clusters, the network, their content relays, the source code to all of the software from servers to client? See, I doubt you do any of that.
I think all you do is play angels and demons and decide that what you don’t know isn’t important, what you think you know is.
You’re the attack vector.
yeah, I’ve considered the security model of Lemmy, haven’t you?
EDIT: Is your argument that nobody should care about security and just be happy with whatever apple sells us?
What you’re describing is possible in certain circumstances, but it would expose the companies to an insane amount of liability. Also, open source software can introduce vulnerabilities that could be exploited to do the same exact thing. Open source software is not inherently more secure. Remember that time malware was introduced to the Linux kernel directly as a research project?
I’m sorry, but did you read the article I linked to or the TL;DR I lifted from the article?
They do not send the app you open to Apple, and there is no evidence they send it to third parties, as the app information is not sent at all!
Nevertheless, they do send information about the developer certificate for notarization and gatekeeper checks.
https://support.apple.com/en-us/HT202491#view:~:text=Privacy protections
Quote:
We have never combined data from these checks with information about Apple users or their devices. We do not use data from these checks to learn what individual users are launching or running on their devices.
To further protect privacy, we have stopped logging IP addresses associated with Developer ID certificate checks, and we will ensure that any collected IP addresses are removed from logs.
In addition, over the next year we will introduce several changes to our security checks:
A new encrypted protocol for Developer ID certificate revocation checks
Strong protections against server failure
A new preference for users to opt out of these security protections
The fact that this existed for years is the problem. the fact that execs signed off on this at all means Apple is terrible for privacy
I read the article, and the only pedantic detail that was wrong in the initial report was that Gatekeeper didn’t send the “application hash”, it sent the “application’s certificate ID”, which is a worthless distinction and changes nothing. you’re acting like that somehow exonerates Apple, and then just blindly believing what their PR person says. you’d have to be a complete idiot or working for them to believe that crap.
So they did one thing wrong and it means they’re terrible for privacy? Welp, guess I can’t have a phone because the alternative (Google) has a business model that depends on being terrible for privacy, and my work apps disallow custom ROMs.
oh, I guess none of us can have security because this guy’s work won’t let us.
no, they did a bunch of things wrong. they all do, so instead of burying my head in the sand, I’m going to call it out and work to build a better future.
Not everyone even knows how to use custom ROMs, tech workers may have a huge presence online but we’re a tiny minority irl.
Anyway, good, go build it. Saying one small mistake makes a company terrible for privacy isn’t doing a whole lot for your credibility, though, so I recommend you spend more time building than talking about it.
ok, this is not “one small mistake”, this is a systemic failure
They designed a security feature without considering security
They kept this feature without encryption for years
It is either a bafflingly huge mistake, or they intentionally made spyware.
I’ll remind you of Hanlon’s razor and let you make your own decision:
Misleading as to WHY macOS is phoning home. It’s done to validate that the developer of the app you’re attempting to run is a trusted developer. Disabling or bypassing this check would open users up to potentially malicious software. https://www.howtogeek.com/701176/does-apple-track-every-mac-app-you-run-ocsp-explained/
you’re being misleading by saying why!
unless you were in the room, your speculation is as good as mine, and I’m not saying why, I’m just stating facts!
Did you actually just say this out loud?
do you realize that I’m not the one making the speculation?
Bro I quoted your words.
I guess I don’t get the point of your comment then