Hell must be cooling off
The most confusing part of this article is this: "Apple told The Reg..."
It appears the WireLurker malware threatening Macs, iPads and iPhones has, for now, been partially neutralized. Apple told The Reg it has revoked a previously legit cryptographic certificate the malware was using to sign itself: this certificate tricked iOS devices into trusting and installing WireLurker's malicious apps. Now …
"This allowed the malware to spread itself even to non-jailbroken iPhones and iPads." Well, kindasortamaybe; if WireLurker attempts to install malicious code on a non-jailbroken iOS device through iOS's enterprise provisioning app-installation system, you'll first be warned.
As Palo Alto Networks explains when discussing non-jailbroken iOS devices, "... on the first attempt to run a WireLurker application on iOS, users are presented with a dialog requesting confirmation to open a third-party application. If the user chooses to continue, a third-party enterprise provisioning profile will be installed and WireLurker will have successfully compromised that non-jailbroken device."
In other words, if you're not among the subset of humanity whom that sage observer of human frailty Bugs Bunny identifies as a "maroon," you're okay if your iDevice is not jailbroken — you're likely smart enough to not provide that confirmation.
And if you aren't smart enough — after all, maroon is a favored color among vast stretches of the populace — well... (Oh, and do note that maroon is a hue widely to be found staining users of iOS, Android, Windows, OS X, Linux, etcetera, etcetera, etcetera.)
Palo Alto Networks' white paper (PDF) explains it all rather well.
iOS has a feature that allows companies to install their own software on iOS devices that the company controls. To do that, the company buys an "Enterprise license". The employees of the company who want to use their own company's software install a certificate on their iOS device that allows this software to run. It's common sense that you only install that certificate if you are an employee of that company. It's also quite easy to remove that certificate from your iOS device.
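That trust model boils down to something very simple. Here is a minimal Python sketch of the idea — a conceptual model only, not Apple's actual implementation; the class and method names are invented for illustration:

```python
# Conceptual model: a device only runs an enterprise-signed app if the
# user has explicitly trusted that company's provisioning profile, and
# removing the profile immediately withdraws that trust.

class Device:
    def __init__(self):
        self.trusted_teams = set()  # enterprise profiles the user accepted

    def trust_profile(self, team_id):
        """User taps 'Trust' for this enterprise developer."""
        self.trusted_teams.add(team_id)

    def remove_profile(self, team_id):
        """Deleting the profile stops its apps from launching."""
        self.trusted_teams.discard(team_id)

    def can_run(self, app_team_id):
        return app_team_id in self.trusted_teams


device = Device()
assert not device.can_run("ACME123")   # nothing is trusted by default
device.trust_profile("ACME123")
assert device.can_run("ACME123")       # only after the user opts in
device.remove_profile("ACME123")
assert not device.can_run("ACME123")   # and removal is trivially easy
```

The point of the sketch: the door only opens if the user opens it, and it closes just as easily.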
The dialog box is under the control of iOS. There is no way an evil hacker can change what it says. Like the dialog for opening/saving files on Mac OS X, it is completely under the control of the operating system and cannot be changed (for software coming from the App Store).
So what happened is that these people went to a website that is known to host pirated software; that means anything they get is deserved. They then were asked to install a certificate on their phone that gives the phone permission to use software from some company they don't know, and they did that as well. That's quite a combination of stupid actions.
"So what happened is that these people went to a website that is known to host pirated software; that means anything they get is deserved."
Blame the victim much? While I have no sympathy for these folks, it has once again been shown that simple greed can defeat security. No surprises there, but if you think that it is as simple as, "They got what they deserved. Move along. Nothing to see here," then perhaps you should reconsider. The black hats are upping their game and will eventually find a way to get past security controls without user intervention. Between now and then, I would expect incremental work toward that end. Why wait? Fix the underlying issues now and we will have a much better measure of security.
"Device security hanging on a dialog box. What could possibly go wrong with that?"
...exactly. My wife would simply click 'ok', on being presented with a dialog which doesn't make sense to her.
This is not at all a slight on my wife's intelligence. She - like the majority of consumers - simply lacks the technical background needed to make an informed decision about how to respond to this kind of dialog on a computing device, and defaults to trusting the manufacturer of the device to have made the 'ok' option a sensible one.
"Device security hanging on a dialog box. What could possibly go wrong with that?"
That's just to install the app.
Apps are still sandboxed on non-jailbroken iOS devices. Meaning that after being installed, this app still won't have access to anything interesting.
The app can request access to the user's contact list but this requires the user to agree to another dialog box. Likewise with location. That's the extent of the "harmful" data that an app can access.
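To make that concrete, here is a small Python sketch of the per-resource consent model being described — each protected resource is gated by its own user dialog, and everything outside the sandbox is simply unreachable. The names (`SandboxedApp`, `request_access`) are illustrative, not iOS API calls:

```python
# Sketch of per-resource consent: an installed, sandboxed app gets
# nothing sensitive until the user answers a separate dialog for each
# protected resource; resources outside the sandbox can't be asked for.

class SandboxedApp:
    PROTECTED = {"contacts", "location"}  # the only gated data in this model

    def __init__(self, ask_user):
        self.ask_user = ask_user   # callback simulating the OS dialog
        self.granted = set()

    def request_access(self, resource):
        if resource not in self.PROTECTED:
            # e.g. other apps' data: not gated, just unavailable
            raise PermissionError(f"{resource} is outside the sandbox entirely")
        if self.ask_user(resource):         # one dialog per resource
            self.granted.add(resource)
        return resource in self.granted


# A user who only allows location:
app = SandboxedApp(ask_user=lambda r: r == "location")
assert app.request_access("location") is True
assert app.request_access("contacts") is False
```

So even a maliciously installed app still has to win a second round of dialogs before it sees anything worth stealing.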
So you are saying Apple users are so smart, they won't click allow?
So, to clarify this for me, as I want to understand these amazing people that buy Apple products... they read the permissions of any app from the App Store they install and still have accepted that their personal contacts, images and phone calls and god knows what else be shipped off to some unknown location in some foreign land?
So pray tell, who is the smartest, the one that agrees to giving away all their own (and others') personal information or the ones that blindly click any box they see so they can play the latest freemium game?
Did they steal a valid MDM certificate from some corporation? That's kind of what it sounds like, but it isn't clear. Or did they use social engineering or other tactics to get someone to sign a falsified certificate for them? More information would be nice. Was the whole exploit hinging on this certificate so that revocation takes care of it, or is that just a finger in the dike until it can be permanently addressed with a patch to OS X and/or iOS?
Since this is an exploit depending on malicious software on a Mac, I can't help but wonder whether malicious software on a Windows machine would work equally well, or is there something special about using a Mac (i.e. the iPhone "trusts" it more)? It is pretty simple to deliver malware to a PC, and most iPhone owners will have a PC rather than a Mac, so it is interesting that the vector used a Mac. It seems there must be something different about the way an iPhone talks to or trusts a Mac that makes the malware possible there, otherwise they would have delivered the payload via Windows or made it dual platform.
That is what I came here to say. Surely the issue here isn't the malware itself, that's just a bit of software. The issue is how they've managed to sign it with a legitimate Apple certificate and whether this hole has been plugged. If it hasn't, what's to stop them repeating the process with the new certificate?
The certificate is not an Apple certificate. It's an "Enterprise license" certificate, which allows companies to write their own software, sign it with the certificate, ask their employees to install the certificate and then these employees can run the enterprise software. This doesn't work unless the owner of the phone willingly installs that Enterprise certificate.
So, and I may be getting this wrong here, there is a social attack vector where you install what seems to be a completely legit (and possibly useful) app, potentially using a freemium model to make it more immediately attractive, but which installs the Enterprise Licence, at which point you have opened up the door for any additional apps that are signed with that certificate, even if it is a year later (as long as it is before the revocation date)?
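That's essentially the window. A signed app keeps working until the certificate either expires or is revoked, which is why revoking WireLurker's certificate killed existing installs without closing the social-engineering vector itself. A rough Python sketch of that validity check, with entirely hypothetical dates:

```python
# Illustrative timeline check: is an enterprise-signed app still runnable
# at a given date? It must have been signed before the check date, the
# certificate must not have expired, and it must not have been revoked.

from datetime import date

def signature_valid(signed_on, check_on, cert_expires, revoked_on=None):
    if signed_on > check_on or check_on > cert_expires:
        return False
    if revoked_on is not None and check_on >= revoked_on:
        return False
    return True


# Hypothetical: signed mid-2013 with a cert valid to 2016...
assert signature_valid(date(2013, 6, 1), date(2014, 6, 1),
                       date(2016, 1, 1))          # ...runs fine a year later
# ...but once the issuer revokes the cert, the same app stops validating:
assert not signature_valid(date(2013, 6, 1), date(2014, 11, 10),
                           date(2016, 1, 1),
                           revoked_on=date(2014, 11, 6))
```

Which is also why "just cancel a certificate" is a containment measure, not a fix: a new certificate restarts the clock.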
Not that I like saying it, but when comments like that appear:
Yes, iOS is more secure, at least than for example Android.
This is obvious to any tech-literate person. Only a tech-illiterate, or a heavily biased mind, would say otherwise.
Android allows the installation of any app without any signing, has a tricky capability system that most users can't understand, and in most installations users can't do much about apps' capabilities. Apps can get at more data and other systems on Android, which is sometimes useful but opens up more ways of compromising data and software (which is exactly why iOS doesn't allow those extra avenues).
Most Android phones don't get important security updates to the OS, so users run with old unpatched versions.
Just compare the number of malware programs, the percentage of infected devices, and the ways that the devices are compromised, and it will be obvious, for "any tech-literate".
iOS *is* more secure. If it weren't, there would be more viruses and trojans for it. Before you point out that hackers aren't interested in iOS because of lack of users, this isn't true. At its peak, iOS had over half the smartphone sales. Last I checked, the iPhone still accounted for a large percentage of smartphone sales, and certainly more than any individual Android model.
iOS offers a huge market for hackers.
The walled garden approach Apple have taken has certainly helped, but so have various other things that Apple have implemented, such as sandboxing each app and minimising the networks' role in distributing updates, thus ensuring that the latest patches for iOS can get to users rapidly, without being delayed indefinitely by the phone networks.
The security on iOS is not perfect. No OS has perfect security. In fact our old Software Engineering Management lecturer had a particular interest in security and always maintained that a perfect security system (i.e. one with no flaws whatsoever) is practically impossible to achieve, and that the first person to achieve it would become very wealthy very quickly.
Yes, Apple tech is so secure that until a couple of months ago, it allowed infinite login attempts to iCloud...
It's things like that which let Apple down, tbh. Their encryption tech is outstanding... but having built up the castle they leave the front door open in the name of 'user experience', leaving them with an embarrassingly leaky iCloud and a growing malware sector. Any other company would be reacting to this with patches and lockdowns, but Apple just cancel a certificate, leaving the attack vector completely open to an identical bit of malware signed by a different CA (something that'll take the writers all of thirty seconds to acquire).
There's a culture of complacency at Cupertino when it comes to security, so it's a 'feature' to be boasted about but not a necessity to be enforced. I suspect the wilderness years of the 80s and 90s, where no-one would ever bother trying to hack a Mac, have left Apple way behind in the psychological aspects of the security game; while Windows and Linux were teaching admins to lock everything down even if it made users' lives hell, Apple was too busy painting things white and patenting rounded corners. I expect we'll see a) more Apple security blunders on the Fappening model in future, and b) a big uptick in iOS malware as the walls of the garden begin tumbling down.
"There's a culture of complacency at Cupertino when it comes to security, so it's a 'feature' to be boasted about but not a necessity to be enforced. I suspect the wilderness years of the 80s and 90s, where no-one would ever bother trying to hack a Mac, have left Apple way behind in the psychological aspects of the security game; while Windows and Linux were teaching admins to lock everything down even if it made users' lives hell"
Where have you been for the last 7 years?
Apple completely revolutionized computer security with iOS. It was the first mainstream OS to sandbox all 3rd party software. Apple was the first company with the BALLS to do this even though it makes software interoperability a living hell for developers and users. And now it's a feature every other company has copied (either partially or wholesale) in all mobile operating systems.
Instead of complaining about Apple and their security, you should be thanking them every day for their enormous contribution to your device's security, regardless of which mobile devices you use.