"...used by Syed Farook as attended a work event in San Bernardino, California, in 20215..."
I guess you REALLY do need a copy editor.
Australian security firm Azimuth has been identified as the experts who managed to crack a mass shooter's iPhone that was at the center of an encryption standoff between the FBI and Apple. Until this week it had largely been assumed that Israeli outfit Cellebrite was hired to forcibly unlock an encrypted iPhone 5C used by Syed …
Hi -- yeah, we need as much help with editing as possible. So much so, it would be great to have some kind of subscriber-level feature where people can create 'pull requests' for improvements to pieces and we'll accept them if they're any good and you'll get some sort of credit for it.
If you want to know how this kind of blunder happens, it's when a sentence turns into a bit of a mouthful or is missing some info and then when someone (like me) tries to tidy it up, they push it straight to prod and get distracted by something else that needs sorting out, and they forget to look back to make sure the change is correct.
Mea culpa, I should have previewed the change then made it live. Just got too many other things right now to juggle.
So, it's fixed. Don't forget to email corrections@theregister.com if you spot anything wrong, please. Like bugs make their way into software, sometimes errors creep in during the edit.
C.
Sure, miskates happen, and the real bane of proofreeding is that unfortunately you can carefully read over something several times, and sometimes your eyes still see what they want to see, rather than what actually happens to be on the screen…
But quite a few people have suggested on many occasions that you could/should have a simple form to report corrections, rather than just an email address (a form could even do nice useful behind the scenes stuff like include the URI of the article automatically, saving on the amount of finger-poking we would have to do), so I would politely suggest that it probably would be quite a good thing to add to the never-ending to-do list?
(Of course, some commentards just like pointing out the mistakes in the comments for the lolz… ;-) )
Apple is just as aggressive
What sort of "balance" does Apple have?
Sure, Apple is trying to close a loophole in its product. However, Apple doesn't want to update the security of an EoS product now, does it?
The crime happened in 2015. It was bound to happen that someone smart enough would eventually find a way around the security features of a product from back then.
Apple should not get "upset". It is bound to happen.
OTOH.... The Fruity Company would get a huge amount of stick here and in many other places if they didn't make every effort to find these holes and fix them.
I wonder if there are any 'Will Not Fix' replies in their iOS security Bugzilla. Seeing that would be a gold mine to the likes of Corellium.
They aren't still updating iOS 9, but once they learned what the flaw was, had they found it was still present in iOS 12 or 14 (the versions currently receiving regular security updates) they damn sure would have wanted to patch it.
They make improvements in security at both the hardware and software level every year, but knowing what types of exploits are being used by the kind of companies that deal in the shady Feds/crooks market for million-dollar exploits helps them decide where to focus their resources when improving those defenses. You can either patch holes one by one, or close up entire classes of exploits all at once with the kind of enhancements that come with the yearly hardware/software cycles. The latter is better, but if you expend effort closing up an entire class of potential exploits that isn't actually being exploited, it is mostly wasted effort.
I always ask when I can: is there a basis for the general public to know that an iOS version is eligible for updates, besides (1) it is the latest version and there isn't a new iPhone due in the near future, or (2) it got an update not very long ago?
I haven't seen it published, and I'm on my second iPhone, a 7, from second-hand retail, because I assumed - wrongly, it seems - that only the latest major version, and the phones or devices that run it, were maintained. But there are a lot of unsupported phones in the second-hand channel.
Thank you!
"I always ask when I can: is there a basis for the general public to know that an iOS version is eligible for updates, besides (1) it is the latest version and there isn't a new iPhone due in the near future, or (2) it got an update not very long ago?
I haven't seen it published,"
I can only assume you haven't looked very hard because even the briefest of searches ("supported iOS versions") gives you pages from wikipedia and from Apple themselves as to which versions are currently supported on which phones.
iOS 12, 13, 14 are currently supported.
Actually iOS 13 is no longer supported. Since everything that ran iOS 13 can also run iOS 14, there is no reason to continue supporting iOS 13. So iOS 12 is still supported, since it is the newest version that the iPhone 5S and 6 can run. Even older phones that could only run iOS 9 had their last patch in summer 2019.
To the poster who asked if there is a basis to know if an iOS version is eligible for updates: not really, at least not from Apple (well, it is probably on their website somewhere, but I haven't looked). You will get notified by the device if updates are available, but you don't get a notification when Apple decides it will no longer support a device. That would probably be a good idea (or maybe just have something come up saying so when you go to the 'Software Update' section in Settings), though I'm sure haters would claim Apple is only doing it to encourage them to buy a new phone.
Wikipedia is a good resource if you want to find out what devices are supported by what version, and when the most recent patch was offered for a particular version. Search for 'ios version history'.
Yeah, I'm still looking for news of support for, say, the rest of this year, though that may be theoretical, i.e. if no updates are required then there will be none. What I can look up, that I know of, is which versions were supported recently, when an update came out.
It crosses my mind now that iOS 12 has been updated to include the SARS-CoV-2 exposure notification function: my phone says hello in Bluetooth to yours when we meet, then if I get sick, I tell my phone to broadcast that it and I have probably been infectious, and your phone remembers that we met. So while Apple and Google have been happy to say no to some of the more oppressive British government ideas about that, they probably wouldn't drop support for devices which people are using for that purpose, until SARS 2 is well dead, or has gone from "pandemic" (spreading everywhere) to "endemic" (just everywhere).
And we might have to patch the OS to stop a virus exposure notification virus.
(not quite the same scenario)
I create the ultimate 'uncrackable' encryption.
The feds want me to crack a file that used my encryption
Erm, just because I've created it doesn't mean I can crack it. I designed it to be uncrackable, and everything I, and others, have thrown at it so far shows it is uncrackable... so until someone announces they have managed it, you are out of luck, and a lawsuit won't fix your problem.
They didn't ask Apple to break RSA encryption or anything like that. They asked Apple to load new software, specially designed for the FBI, onto that phone, which would ignore the 10-try limit and let them make unlimited attempts at unlocking it until they succeeded. Which they would, because it was using a simple 4-digit PIN.
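To put that in perspective, here's a toy sketch of why removing the retry limit is game over for a 4-digit PIN. The hash check and the PIN value are made up for illustration; on a real device the check runs inside the Secure Enclave with escalating delays and a wipe-after-10-tries option.

```python
import hashlib

def unlock_attempt(pin: str, stored_hash: bytes) -> bool:
    # Stand-in for the device's passcode check -- the real one runs in
    # the Secure Enclave and enforces the retry limit this sketch ignores.
    return hashlib.sha256(pin.encode()).digest() == stored_hash

# A 4-digit PIN gives only 10**4 = 10,000 possibilities.
secret = hashlib.sha256(b"7295").digest()  # hypothetical target PIN

for attempt in range(10_000):
    pin = f"{attempt:04d}"
    if unlock_attempt(pin, secret):
        print(f"PIN found after {attempt + 1} tries: {pin}")
        break
```

With no retry limit, exhausting the whole keyspace is trivial even at the Secure Enclave's deliberately slow guess rate.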
Apple has always supported an alphanumeric passcode in addition to 4 and 6 digit PINs, so breaking the retry limit wouldn't have helped if the phone had used a passcode. I wonder what their demand of Apple would have been in that case? They couldn't ask Apple to create software to bypass the passcode because that WOULD have required them to break RSA, as the device key is encrypted by a key generated from the user's PIN/passcode.
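The keyspace arithmetic shows why the passcode option changes everything. The ~80 ms per guess figure is the commonly cited Secure Enclave key-derivation delay, used here as an assumption rather than a measured value:

```python
# Why a retry-limit bypass only mattered because the phone used a PIN.
PER_GUESS_SECONDS = 0.08            # assumed Secure Enclave delay per guess

pin_keyspace = 10 ** 4              # 4-digit PIN
passcode_keyspace = 62 ** 8         # 8-char alphanumeric (a-z, A-Z, 0-9)

pin_worst_case = pin_keyspace * PER_GUESS_SECONDS
passcode_worst_case_years = (
    passcode_keyspace * PER_GUESS_SECONDS / (3600 * 24 * 365)
)

print(f"4-digit PIN, worst case: {pin_worst_case:.0f} seconds")
print(f"8-char alphanumeric, worst case: {passcode_worst_case_years:,.0f} years")
```

Minutes versus hundreds of thousands of years, which is why brute force on-device is only viable against short PINs.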
The FBI was obviously trying to use this situation to force a court precedent requiring Apple's help. Once the court forced Apple to help in any way, the precedent would have been set for their next ask, which would no doubt have been a court order or law saying "you have to build a backdoor for us into all future software versions".
IMHO what Apple should have done (and maybe they have since then, I haven't really checked) is to prevent new software from being loaded onto a locked phone. That was possible at the time via "DFU mode", which is sort of equivalent to accessing the BIOS on a PC, used to recover phones that are bricked or otherwise screwed up. The marginal utility of recovering such phones (which has never happened to me, or anyone I know) isn't worth the risk of Apple leaving a way in for the FBI to try something like this again someday.
@DS999: "They asked Apple to load new specially designed for the FBI software onto that phone that would ignore the 10 try limit and let them make unlimited attempts at unlocking the phone until they succeeded."
Which is exactly what David Wang and Azimuth did, judging from the WP and El Reg's accounts.
The encryption was not "unbreakable". In fact, it was easy to brute-force. The only obstacle was an external limitation preventing brute-forcing. Once you got around that... No, it wasn't easy to get around it, but it does look like "real security, such as a secure password, would be too hard for an iPhone user..."
That last statement was probably correct, too, but then how can one complain?
"Apple also wants any vulnerabilities discovered in its software to be given to it, rather than sold to law enforcement and governments, so the super-corp can patch them."
As someone who has to use an iPhone for work, and who works in security, I sure as all effing heck want that system secure. LE might abuse cracking the phone for a date, but people in governments have sold, and will sell, exploits to criminals as well as abusing them themselves (Shadow Brokers).
Step it up a notch and let's say everyone now has a new zero-day exploit that requires a hardware change to fix (takes a year), and no iPhone is safe, and it could have been prevented. Doubt it could happen? I didn't know you would hide your face in public for years either.
>Apple also wants any vulnerabilities discovered in its software to be given to it, rather than sold to law enforcement and governments, so the super-corp can patch them.
Another way of looking at this could be: Apple wants all vulnerabilities to be given to it and be illegal to tell anyone else about them (copyright), so it can patch them or not bother.
Product security is a lot easier if it's illegal to publish anything about any flaws.
That's stupid, they know that people would still be finding and using vulnerabilities. It isn't as if laws against selling drugs stop people selling them, so Apple would still have the problem of vulnerabilities they needed to patch either way. It wouldn't make product security any easier at all.
Why the FBI would think that a terrorist would have incriminating data on a work phone that used the weakest possible password (a 4-digit PIN, instead of the alphanumeric passcodes which are available) and, most importantly, was NOT destroyed when their personal phone was, is a mystery.
I don't think the FBI ever believed they would find anything of value on that phone, but merely used the publicity around "yikes terrorists!!!" to engineer the best court case they could to raise public awareness about their problem accessing smartphone data hoping for court precedent / new laws mandating back doors.
I think they believed that after people so willingly accepted the TSA security theater after 9/11 that the general public would be overwhelmingly on their side, and were probably quite surprised when that was not the case.
Certainly it's far from the first time that the FBI or other law-enforcement organizations and representatives beat the "terrorists!" drum in an attempt to get backdoors. They're not going to pass up anything that looks like it might gain support for their case.
To make the improvement I really want to see: let me set an encryption key for iCloud backups on my phone. I can do that for iTunes backups, but Apple is apparently afraid of making this improvement for iCloud because it would raise the ire of the FBI and authorities in other countries who are able to subpoena Apple for iCloud backups to access stuff like iMessage history (some things in iCloud backups are encrypted in a way that leaves them unreadable to Apple, so it isn't as though they can't do this).
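The shape of that improvement is straightforward. Here's a toy sketch of a user-keyed backup, with the key derived from a passphrase the user controls so the provider can't decrypt. All function names are made up, and the SHA-256-counter-mode "cipher" is illustration only; a real design would use AES-GCM or ChaCha20-Poly1305.

```python
import hashlib
import hmac
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Slow derivation so guessing the passphrase is expensive.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Toy stream cipher (SHA-256 in counter mode) -- NOT real crypto.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

def encrypt_backup(passphrase: str, backup: bytes) -> bytes:
    salt, nonce = os.urandom(16), os.urandom(16)
    key = derive_key(passphrase, salt)
    ciphertext = keystream_xor(key, nonce, backup)
    tag = hmac.new(key, nonce + ciphertext, "sha256").digest()
    return salt + nonce + tag + ciphertext

def decrypt_backup(passphrase: str, blob: bytes) -> bytes:
    salt, nonce, tag, ciphertext = blob[:16], blob[16:32], blob[32:64], blob[64:]
    key = derive_key(passphrase, salt)
    expected = hmac.new(key, nonce + ciphertext, "sha256").digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("wrong passphrase or tampered backup")
    return keystream_xor(key, nonce, ciphertext)
```

Since only the passphrase holder can derive the key, a subpoena to the cloud provider yields only ciphertext, which is exactly why the authorities object.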
@DS999
Quote: "....let me set an encryption key for iCloud backups on my phone..."
Some of us use physical backups in two secure locations, one of them offsite. The spooks need physical access to get anywhere near these backups. Oh... and sensitive information is on air-gapped systems (separate CAT5 LAN, no WiFi, no internet connections). Now, this arrangement is somewhat inconvenient, and certainly has security issues around personnel. But compared with anything "cloud"?
Now, about this quote: how many entry points can one get into twelve words? "Phone", "iCloud", "key". Or implied entry points? "Bluetooth", "internet", "phone network", "server".
I suppose my definition of the word "security" is a bit different from the definition typically used on El Reg. A quote from 1999 seems apposite:
- Link: https://www.wired.com/1999/01/sun-on-privacy-get-over-it/