Apple to hand out limited-edition iPhones among 1337 h4x0rs because it wants more bug-hunters

Apple has announced the existence of a new and very limited-edition iPhone. The new "Security Research Device" (SRD) is a full iPhone that adds shell access so that security researchers can give it a thorough going-over. As Apple explains: "This program is designed to help improve security for all iOS users, bring more …

  1. Pascal Monett Silver badge

    That has to be a Good Thing (TM)

    Good on Apple for this initiative that will undoubtedly improve the security of their products.

    The FBI must be seething. On the other hand, the FBI can very well enroll in this program, either openly or covertly, and it probably will.

    I wonder how Apple is going to manage that?

    1. DS999

      Re: That has to be a Good Thing (TM)

      The FBI wants to be able to break into regular iPhones, getting a special one that's already broken into does them no good.

      What Apple is giving researchers is basically a phone that's already been jailbroken.

    2. arthoss

      Re: That has to be a Good Thing (TM)

      I came here to say exactly that.

  2. Charlie Clark Silver badge
    Joke

    We know where you live

    Given Apple's historical reticence when it comes to bug submissions, this is presumably just a ruse to find out where the people are before sending round the AET (Apple Enforcement Team)!

  3. Anonymous Coward
    Anonymous Coward

    If they want people to work at testing and improving their product, why don't they employ people to work at testing and improving their product?

    1. JimBob01

      Cognitive Bias?

      I’d bet that Apple already employs plenty of people for product security purposes, but any employee is subject to the many cognitive biases that come with being an employee.

      Handing low level access to “independents” is a way of reducing the role of cognitive bias in security assessments, basically getting input from another set of eyes.

      An obvious weakness in this approach is that you have to trust that the people you give low-level access to will report any interesting findings rather than keep them to themselves. Would the FBI report that they had found a useful backdoor, for example?

      1. Lee D Silver badge

        Re: Cognitive Bias?

        The last round of hacking on Apple's devices turned up a remote flaw in the browser's parsing of a simple website, caused by all kinds of things that just shouldn't be possible (not just technically, but procedurally), which allowed a browser compromise on Mac, iPhone and iPad to illicitly enable the camera.

        https://www.ryanpickren.com/webcam-hacking

        This included allowing websites to download arbitrary files, then treat those files as trusted local file:, about:, blob: or even data: URLs (!!), letting you load javascript from them and thus bypass security permissions. Along the way it was discovered that domains with .- or -. in their name don't appear in the permissions dialogs, that you can do popups and even force a browser password autocomplete, and that window history can be abused to play clever tricks.
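        The core of that flaw class, content gaining trust simply by being re-opened under a "local" scheme, can be sketched as a naive trust check. This is purely illustrative pseudologic, not Apple's actual code; the scheme list comes from the description above:

        ```python
        from urllib.parse import urlparse

        # Schemes the hypothetical browser treats as "local and trusted"
        # (the schemes named in the write-up above).
        TRUSTED_SCHEMES = {"file", "about", "blob", "data"}

        def is_trusted(url: str) -> bool:
            """Naive scheme-only trust check: the design flaw is that it
            never asks where the content originally came from."""
            return urlparse(url).scheme in TRUSTED_SCHEMES

        # A script fetched from the network is untrusted...
        assert not is_trusted("https://evil.example/payload.js")
        # ...but the same bytes, saved to disk and re-opened under a
        # "local" scheme, now pass the check and run with elevated trust.
        assert is_trusted("file:///tmp/downloads/payload.js")
        assert is_trusted("data:text/javascript,alert(1)")
        ```

        The fix for this class of bug is tracking provenance (origin) with the content, rather than inferring trust from the scheme it happens to be loaded under.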

        The problem is not the bug you suffer from. The problem is the CLASS of bugs you suffer from. Because they indicate the design of the system, rather than a tiny incidental oversight. It's not an oversight to do the above... it's a completely thoughtless design process. Which is the opposite of security.

        As with everything Apple that I've ever touched - design for them means "designer", not good design, not easy-to-use, intuitive, sensible, planned-out, functional, etc.

    2. FlaSheridn

      The More the Merrier

      > employ people to work at testing and improving their product

      I have been employed by Apple as a tester, but I also think that this is a good idea.

      1. gnasher729 Silver badge

        Re: The More the Merrier

        A tester is mostly there to check that software does what it is supposed to do. Checking that software doesn't do what it's not supposed to do requires a different mindset.

      2. gnasher729 Silver badge

        Re: The More the Merrier

        And developers want the software to be bug free, so they don't want to find bugs. Sometimes that makes them bad at finding bugs. They also know how the software is intended to be used, so finding bugs that happen when the software is used differently can be difficult for them.

    3. DS999
      Facepalm

      Yes, because other companies that are known to employ security professionals like Google have products without any security issues at all.

  4. Pete 2

    Do it yourself

    > The new "Security Research Device" (SRD) is a full iPhone that adds shell access so that security researchers can give it a thorough going-over.

    Surely the sort of "security researcher" who was any good at finding _real_ exploits would be able to gain shell access, without any help.

    1. Stuart Castle Silver badge

      Re: Do it yourself

      "Surely the sort of "security researcher" who was any good at finding _real_ exploits would be able to gain shell access, without any help."

      They would, but they'd be spending time looking for a jailbreak to enable a shell that could then be used to find other exploits. There is also the risk that the exploit they use to enable the shell changes something that is itself what enables the exploit they find. It's a long shot, but the TL;DR is that jailbreaking puts the device in a different state from the average device.

    2. DS999

      Re: Do it yourself

      The SRD is basically an iPhone that has been pre-jailbroken. Sure, security researchers can jailbreak an iPhone themselves, but what happens when Apple releases a newer version of iOS that patches the hole the jailbreak depends on? They'll have to stay on an older version until they or someone else is able to update the jailbreak with a new exploit, then rinse and repeat when the next version of iOS comes out. If you are researching an exploit on iOS 13.5 and then 13.6 comes out and closes off the jailbreak, how do you know the exploit you are trying to develop wasn't fixed or otherwise modified by the changes in 13.6 if you can't run 13.6?

      And as stated above, using an exploit to jailbreak already takes an iPhone out of its 'natural state', which makes security research just a little bit more difficult. This lets them install vanilla iOS but have shell access, and presumably some useful binaries that standard iPhones likely don't ship with, like a debugger.

  5. just_some_dude

    I guess I don't really understand the economics of security research. In exchange for the use of a phone, people will be entering into legal agreements with Apple that include nondisclosure, and will be expected to spend hours of their time searching for security flaws with no guarantee of compensation or even recognition? I don't understand why someone would agree to this.

    1. gnasher729 Silver badge

      Your mistake is your assumption of "no compensation".


Biting the hand that feeds IT © 1998–2020