Nokia 3310
So they're turning an iPhone into a Nokia 3310. Maybe just buy a Nokia and save a grand or more?
Apple's latest security feature won't be used by most of its customers, but those who need Lockdown Mode could find it to be a literal life saver. The functionality, coming with iOS/iPadOS 16 and macOS Ventura, shrinks an iDevice's attack surface by disabling many of its features. It's designed to protect the small number of …
It was a GIF that contained data in a decades-old fax format (which was then used to implement a primitive CPU), so checking file types wouldn't have prevented that NSO Group hack.
Better for this type of user to completely disable that route for everything, including all image types. Just because there is no zero-day exploit against JPEG today doesn't mean there won't be one tomorrow.
Yes, there have been various exploitable vulnerabilities in various image decoders over the years.
But this is probably an unavoidable compromise. People are so accustomed to viewing images in messages that if Apple blocked images, most of the Lockdown Mode users would turn lockdown off every time they received a message with an image, so 1) it wouldn't help, and 2) they'd be exposed to other exploits.
Krstić is a smart guy and an experienced security researcher, so I expect Apple applied a pretty sophisticated threat model here that included likely behavior by users.
Incidentally, the NSO Group iMessage exploit used a PDF mislabeled as a GIF, which Apple's ImageIO library content-sniffed and passed to the CoreGraphics PDF processor; the vulnerability there let them construct and run their own interpreter. So it wasn't technically an "image file" at all, but a PDF masquerading as one, handed by the overly ambitious ImageIO to the vulnerable PDF renderer. (Apple has since fixed this.)
But, as I said, there have been many other image-parser vulnerabilities. Like, say, these.
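A minimal sketch of the mismatch involved (nothing here is Apple's actual code; the function names and the magic-byte table are illustrative): a safer handler sniffs the leading magic bytes and refuses to render when the claimed extension and the actual content disagree, rather than helpfully dispatching to whichever parser matches the content.

```python
# Illustrative only: detect a file's real type from its magic bytes
# instead of trusting the filename, and reject any mismatch.

MAGIC = {
    b"GIF87a": "gif",
    b"GIF89a": "gif",
    b"\x89PNG\r\n\x1a\n": "png",
    b"\xff\xd8\xff": "jpeg",
    b"%PDF-": "pdf",
}


def sniff(data: bytes):
    """Return the format implied by the leading magic bytes, or None."""
    for magic, fmt in MAGIC.items():
        if data.startswith(magic):
            return fmt
    return None


def safe_to_preview(filename: str, data: bytes) -> bool:
    """Preview only when the claimed extension and the sniffed content
    agree, AND the agreed type is on a small image allowlist."""
    claimed = filename.rsplit(".", 1)[-1].lower()
    claimed = {"jpg": "jpeg"}.get(claimed, claimed)  # normalise .jpg
    actual = sniff(data)
    allowed = {"gif", "png", "jpeg"}
    return actual is not None and actual == claimed and actual in allowed


# A PDF masquerading as a GIF is rejected instead of being routed
# to a PDF renderer; a genuine GIF still previews.
assert safe_to_preview("evil.gif", b"%PDF-1.7 ...") is False
assert safe_to_preview("cat.gif", b"GIF89a...") is True
```

The key design choice is deny-by-default: an unrecognised or mismatched file is simply not rendered, instead of being forwarded to whichever decoder claims to understand it.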
It sounds as if you're suggesting that the people who care enough about their security and privacy to enable this lockdown mode are also likely to install 'any old browser engine that the user cares to install from some random place on the internet'.
I don't think that this is going to be the problem that you seem to think it is.
I think you vastly overestimate the technical competence of the people targeted by such attacks. Obviously something like the NSO hack isn't their fault, since it was completely silent, but that's not always the case. Sometimes people are led to browse to a certain page, or a page they're known to visit is compromised so they can be attacked when they go to it (e.g. some sort of employee portal at their job).
These people who are targets simply wouldn't see installing Chrome and using it instead of Safari as providing a new vector for attack. They don't understand that Chrome supports some questionable new "standards" that provide hardware access even if JavaScript is disabled (though it is doubtful iOS would allow such access even if Apple eventually allows a full Chrome browser).
There are other vectors of attack "lockdown mode" would not help with - for instance, if the person is using something other than iMessage or FaceTime for messaging or calling, thus not getting the new protections. There have been zero-days in Signal, Telegram and Skype, and there's no reason to believe there aren't undiscovered ones just as bad as the NSO hack being used against targets right now. Just as there is no way to know that, even though Apple closed off NSO's hack, they didn't have a second iMessage-based attack ready to deploy that similarly relied on image previews.
Obviously Apple can't ban people using lockdown mode from installing or using certain apps, but they probably need to at least warn people about the potential risks they are taking through their use. Not sure how to do that without people screaming "anticompetitive!" even if the warnings are only seen by the tiny tiny percentage of people worried they are a target of a nation state and enabling lockdown mode.
People are bad at OPSEC, full stop. It entails a lot of cognitive load, vigilance, correction of habits, and just plain irritation and inconvenience. And people are bad at correctly judging non-immediate risks. So even those in plausible danger of the sorts of attacks this is meant to help prevent are unlikely to be terribly good at reducing their exposure.
After all, many of the NSO Group victims had reason to guess they'd be targets of that sort of thing, and could have chosen to use feature-phones rather than smartphones. Many had the resources to use locked-down phone models. They didn't because maintaining those security practices is exhausting.
Apple providing a "reduce my attack surface" button is really not a bad move, because it addresses that human issue.
I can answer that one for you: they wouldn't install it. Since this is a user-decidable switch, Apple could even add that to the features: turn on the lockdown mode and non-WebKit engines get blocked. This wouldn't be a problem because a user who wanted a different engine could disable it. The issue about engines is with choice. If you don't want any engine other than WebKit, then don't install one and you'll be just fine. You'll probably be in the same group as many others, including me, as I don't have a need for a different one given the tiny amount of browsing I do on the device. Others choosing to do so won't force us to.
It’s not a question of what an individual may - or may not - do, and neither you nor I could possibly know that. It’s a technical/legal question of how Apple could provide the option while complying with forthcoming EU legislation.
If someone happens to have a third-party browser engine installed, something that many have clamoured for, Apple cannot then provide a lockdown radio button without substantial qualification in the Settings app. Something like: “If you have installed any of a long list of third-party apps, this feature will not work as described.”
It would be nice if there were a way to select some of these options without taking them all, for those of us who aren't going to be targeted and don't need full "lockdown mode" but might want to disable some things "just in case" if we don't need them. Hopefully that comes in a future update.
This bit from the article is interesting:
Not allowing wired connections to computers or peripherals when the device is locked
Locked, or locked down? If it is just when locked then I'm pretty sure that is irrelevant as border security in most locations has the legally enforceable right to request you unlock the device.
…and yet a certain subset of commentards will happily believe their half-arsed brain farts will pierce Apple’s Lockdown Mode.
Lockdown Mode being a feature (lest we forget) designed specifically for Apple’s very own operating system; running on, well, what can only be described as proprietary Apple hardware; and rustled up by actual Apple-pays-their-salary engineers.
Stroll on
Well, what they do isn’t really difficult to implement. Instead of protecting, say, your mail app from bugs in the processing of a gazillion different attachments, attachments other than images are simply disabled. And I could imagine that only the most popular image types will be supported, so your phone is easily protected against bugs in the processing of a gazillion obscure image formats as well.
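The deny-by-default approach described above can be sketched roughly like this (purely hypothetical names and data shapes; the source doesn't describe Apple's actual implementation):

```python
# Hypothetical sketch: everything is dropped unless its declared MIME
# type is on a short allowlist of popular image formats.

ALLOWED_ATTACHMENT_TYPES = {
    "image/jpeg",
    "image/png",
    "image/gif",
}


def filter_attachments(attachments):
    """Deny by default: keep only attachments whose declared MIME type
    is on the allowlist; everything else is dropped unrendered."""
    return [a for a in attachments if a.get("mime") in ALLOWED_ATTACHMENT_TYPES]


msg = [
    {"name": "cat.png", "mime": "image/png"},
    {"name": "invoice.pdf", "mime": "application/pdf"},  # non-image: dropped
    {"name": "weird.tiff", "mime": "image/tiff"},        # obscure format: dropped
]
assert [a["name"] for a in filter_attachments(msg)] == ["cat.png"]
```

The point of the sketch is that the safe list is short and closed: an obscure image codec never runs because its input is discarded before any parser sees it.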
Apple is being smarter about this than you would be, then: it recognizes that its own paid engineers share an Apple mindset and as such may not recognize a potential threat. That the bounty is so high shows that Apple is confident, but still acknowledges the possibility.
When you compare the bounty, which will come with strings attached, to the marketing budget, you'll realise that that's where it's coming from. More convincing would be if they were prepared to indemnify users if it gets broken. Still, it's good to see them taking on the idea of bounties.
I'll reserve judgement on it until it's released but, thus far, many of Apple's security features have turned out to have trap doors so that the OS or Apple's own apps can do things.
Indemnify users from what, exactly? Being imprisoned on bogus criminal charges, or under reprehensible laws? Being murdered by government thugs? That'd be a tough one, even for Apple.
Bug bounties have a mixed record, but they can be useful, and it's appropriate for Apple to offer one for this feature.
Apple probably didn't think their OS had so many security flaws. They were wrong, and the latest 'fix' is lockdown mode. The thing about being wrong is that you can be wrong more than once. You know, like how patches for OSes are released monthly, because one fix doesn't fix everything, for ever.
"They were wrong, and the latest 'fix' is lockdown mode."
No, and no.
'Wrong' implies a viewpoint which has been discredited. Apple has operated a bug bounty programme for years, and this isn't the first time they've offered millions of dollars for discovery; so they have demonstrably long held the viewpoint that others might be able to spot things they haven't.
Lockdown mode isn't a 'fix'. It's a feature intended for a small group of people who are prepared to sacrifice a substantial degree of functionality in return for a higher level of security. For the average Joe, it won't 'fix' anything; in fact they probably won't even know or care that it exists.
I doubt this will have much impact for whatever high-profile targets the users of NSO or other similar kit go after, no matter how secure it is.
The enemy of security is always cool features, links, etc. If users feel restricted, they'll turn this off.
Also, those targets are not really the best at knowing what risk feature X or Y could pose, and therefore not the best at mitigating them and bearing the restrictions.
Also, images are not entirely hack-proof either, as many have remarked here.
Given this is attempting to prevent state surveillance, it's worth stating that deliveries can be intercepted, and addresses flagged for purchases of electronic equipment, so that devices can be examined or opened before delivery.
If you're going to this much trouble, you also need to go to the trouble of obtaining a device through someone else who isn't being targeted. And the real problem with Apple's idea here is in setting this mode.
Apple is likely to have a list of devices with this feature enabled. Could all those devices be subpoenaed, or be identified by their attempts to access a service that is locked down? Activating this mode is equivalent to putting your head above the parapet, to Apple at least, and potentially to others with an official-looking warrant, inviting further investigation.
Surely better to sit below the radar with an unassuming run-of-the-mill device within the masses, switch off every privacy compromising feature you can, so that it looks like every other regular Apple device. The idea is not to stand out from the crowd.
This seems to be doing the complete opposite of what the user is trying to achieve. This doesn't fit the zero-trust model.
"Surely better to sit below the radar with an unassuming run-of-the-mill device within the masses, switch off every privacy compromising feature you can, so that it looks like every other regular Apple device. The idea is not to stand out from the crowd."
Security through obscurity has been thoroughly discredited as a concept. More often than not, the bad actors already know who you are, which device you're using, which accounts or services you're active on, and who you're talking to. All they need is that last bit of compromising evidence.
"Given this is attempting to prevent State surveillance. Worth stating that deliveries can be intercepted and addresses flagged for purchases of electronic equipment due to be delivered and examined/opened before delivery."
NSO malware is frequently used by states surveilling people in other states. Saudi Arabia couldn't have intercepted a phone being delivered in the US to compromise it, at least not as cheaply as doing it locally. They may also lack a convenient exploit kit to install on a phone that remains resident, given that the initial setup process only happens normally when there is no user data.
"If you're going to this much trouble, you also need to go to the trouble of obtaining a device through someone else, who isn't being targeted"
Or get lucky. The last laptop I bought for someone was by walking into a shop, paying for it, and carrying it out. You can buy phones like that too. Try intercepting that delivery. Unless they've got a spy in every computer store or opportunistic malware on all of them (and I'm sure they'd like to), you can't guarantee it. They can do a number of things, but they aren't certain and they're expensive and difficult.
"Apple is likely to have a list of devices with this feature enabled,"
Why? They don't need that in a database. As you correctly point out, doing that could cause problems. There's no reason for them to want that list or to put in code to collect it, which could not help them but would certainly anger users.
"Surely better to sit below the radar with an unassuming run-of-the-mill device within the masses, switch off every privacy compromising feature you can, so that it looks like every other regular Apple device. The idea is not to stand out from the crowd."
Again, this is on-device config. It's not a spotlight attracting attention to you. Likely, if you're turning this on, they already know who you are and can find your device without needing this, and the feature just protects you from their attempts to penetrate your defenses.
"This doesn't fit the zero-trust model."
Actually, it entirely does. The zero trust model isn't about trying to hide. It's about having protections on everything. A zero trust configuration is very different from a default config that has several trust-based attack surfaces. I should point out, however, that zero trust configurations don't announce themselves routinely. You only find out whether it is one when you intercept its traffic or attempt an attack.
Link previews are evil. Plainly only there to ensure you see the ad, they are a no-click attack vector.
JS compilation up-front is another one.
Blocking unknown Facetime calls? Surely that should be the default, with a "an unknown caller is calling you. Accept?" alert.
No USB connection when locked. Well, duh! When locked, it should just charge - unless you accepted the connection whilst unlocked.
Remote updates should always be subject to a cancellable alert.
Just proves how little you own your device by default.
Most of these are pretty basic to any phone or PC I would use, why is Apple so slow?
It does not seem to have occurred to anyone that the most essential features are real hardware buttons to disable the network, camera, microphone etc., and an even bigger button to disable Google/Apple/Microsoft/Meta spying on us.
The threat of spyware from companies like NSO is no light matter. It is how, for instance, the Saudi Arabian government was able to allegedly track down and assassinate Washington Post columnist Jamal Khashoggi.
They certainly did spy on Khashoggi, but that's not how he died. As is well documented, he entered the Saudi consulate in Istanbul to deal with paperwork for his divorce from his Saudi wife - he told them he was coming, so no spying was necessary for that - and some Saudi henchmen took the opportunity to dismember him with a bone saw and take him back "home" in multiple pieces.