back to article How to pwn phones with shady replacement parts

A group of researchers has shown how, for instance, a repair shop could siphon data from Android handsets or infect them with malware with nothing more than a screen repair. Omer Shwartz, Amir Cohen, Asaf Shabtai and Yossi Oren, all from Israel's Ben-Gurion University of the Negev, this week warn that smartphone makers are not …

  1. Paratrooping Parrot
    Mushroom

    Fantastic way to make phones unrepairable

    This seems to be the way to make sure that only the manufacturers can repair phones. No more cheap phone repairs, just make sure everyone chucks their old phones away to buy new ones, or pays over the odds so that Apple or Samsung or their partners can repair them.

    1. SuccessCase

      Re: Fantastic way to make phones unrepairable

      "This seems to be the way to make sure that only the manufacturers can repair phones."

      "The way" being reality. You have just anthropomorphised an inconvenient fact, presumably to make it sound like the manufacturer is an evil actor "doing this" to make us have to pay more.

    2. Blotto Silver badge

      Re: Fantastic way to make phones unrepairable

      @Paratrooping Parrot

      Ffs, do you want your portable device to be vulnerable to attack or not?

      It's not the phone manufacturers coming up with compromised replaceable parts.

      Get the genuine OEM stuff from the manufacturer or its approved resellers, not the shady, cheaper, no-recognisable-brand thing off eBay arriving from some unknown Chinese seller at a fraction of the price.

      It's like when a company outsources its IT to a developing nation: on paper the same tasks are being performed, but we all know the quality will be vastly different, and who knows what back doors are being left wide open, exposing company data to all.

      I'm reminded of that Golf advert: the dealer closing a car door and saying "see, it sounds just like a Golf", and the girl looking back as if to say "no chance".

      1. Anonymous Coward
        Anonymous Coward

        Re: Fantastic way to make phones unrepairable

        @Blotto

        Not all manufacturers will sell parts for repairs on the open market; some only permit authorised repairers to get their hands on them.

    3. macjules

      Re: Fantastic way to make phones unrepairable

      And talking of which, Samsung just re-re-released the Note 7, this time as the Note Fanboi Edition, or Note FE, made with salvaged parts from the Note 7. Hopefully not the battery.

  2. Anonymous Coward
    Anonymous Coward

    This is news?

    I remember working for PC repair shops back in the mid-2000s, and it was (and still is) a common request to have your hard drive checked for malware. But doing so also implicitly grants access to your files, and so shady repair shops could use that as a way to siphon your data.

    So why are we getting worked up over the same thing happening to smartphones? It's been common knowledge for a long time that on Android you can access the entire storage by plugging it into a USB port while in the fastboot or recovery modes, and every non-Google OEM does their firmware flashes over USB with similarly-designed proprietary tools. I believe Apple has similar methods to download firmware at the factory, but I don't use iOS devices and can't confirm.

    Physical access has always been a guaranteed method for pwning a device. This isn't surprising or newsworthy.
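    For the curious, the access route being described boils down to a one-liner against the stock tools. A sketch only: it assumes `adb` is installed and on PATH, and that the device's recovery actually exposes ADB with the shared storage mounted, which varies by handset.

```python
# Sketch of the USB access route described above, using the stock
# Android `adb` tool. Assumes a recovery image that exposes ADB at
# all -- many locked-down devices don't.
import shutil
import subprocess

def build_pull_cmd(src="/sdcard", dest="backup"):
    # /sdcard is the usual shared-storage mount point once recovery mounts it
    return ["adb", "pull", src, dest]

def pull_shared_storage(dest="backup"):
    """Copy the device's shared storage to a local directory."""
    if shutil.which("adb") is None:
        raise RuntimeError("adb not installed")
    return subprocess.run(build_pull_cmd(dest=dest),
                          capture_output=True, text=True)
```

    No exploit required: this is the documented behaviour of the tool, which is exactly the commenter's point.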

    1. Paul Crawford Silver badge

      Re: This is news?

      I am also thinking, why would they do this? As in, why would a cheap repair shop be using more expensive parts to compromise phones that are probably mostly used by customers on lower budgets?

      Sure it might make sense to do such a nefarious swap on some drug baron's phone to bypass security as part of a CIA sting operation, but I don't see enough general revenue for the risks to make a cheap repair shop go down that route. Not with Google already whoring most of your data from advertiser to advertiser as a "legitimate" business.

      1. Anonymous Coward
        Anonymous Coward

        Re: This is news?

        Surely any drug baron worthy of the name is going to be using burner phones and chucking them at intervals? Repair isn't something that would happen.

        Might be worth doing for the shop if they could extract identity/financial data to sell on.

    2. Anonymous Coward
      Anonymous Coward

      Re: This is news?

      Except connecting via ADB doesn't decrypt the volume containing personal stuff on most phones; it just allows low-level access to the OS.

      At best you could plant malware, but you'll struggle to siphon data off in a lot of cases.

      1. Ken Hagan Gold badge

        Re: This is news?

        "At best you could plant malware but you'll struggle to siphon data off in a lot of cases."

        That's what the malware would be for. After the user has done their thing to decrypt the drive, your malware can siphon whatever it likes.

  3. BebopWeBop

    Having bust the screen on my iPhone (OK....), before taking it to be repaired at the local moderately cheap screen repairer I was careful to wipe the phone (not that any dodgy images I might have had would titillate anyone other than a landscape fetishist). I was very pleased to see them ask whether I had, and suggest that if I hadn't I should do so and then bring it back. Now previous problems with their own (small) staff might have made them wary, but I like to think that they were simply covering their own, and their customers', backs.

    1. Anonymous Coward
      Anonymous Coward

      I would be out of business in two shakes of a duck's tail should I even suggest customers wipe their devices to get their screen replaced.

      1. Anonymous Coward
        Anonymous Coward

        "I would be out of business in two shakes of a duck's tail should I even suggest customers wipe their devices to get their screen replaced."

        And I would if I did not because my customers trust us to protect their privacy in every move we make - in other words, they would become very concerned if we did not ask that question. It simply depends on who your customers are and what they expect of you.

  4. redpawn

    Identify and Warn

    Let the user know that a part has been replaced and give them the opportunity to green light the part if they so desire.
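    Sketched as (entirely hypothetical) firmware logic, with made-up component names and IDs, the check could be as simple as comparing what's fitted now against what the user last green-lit:

```python
# Toy sketch of an "identify and warn" boot check. The component names,
# the ID strings, and the assumption that firmware can read a hardware
# ID off each part are all hypothetical simplifications.

def unapproved_parts(present, approved):
    """Return component IDs seen now that the user never green-lit."""
    return {name: hw_id for name, hw_id in present.items()
            if approved.get(name) != hw_id}

approved = {"screen": "LG-0420-rev3", "battery": "SDI-1187"}
present  = {"screen": "NONAME-9999",  "battery": "SDI-1187"}

# Only the swapped screen gets flagged for the user to approve
print(unapproved_parts(present, approved))  # -> {'screen': 'NONAME-9999'}
```

    Anything in the returned set would trigger the warning screen; hitting OK would write the new ID into the approved list.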

  5. John Smith 19 Gold badge
    Childcatcher

    sounds like a great way to extend the mfg ID chip on some printer cartridges to phones

    Y'know, for your own good.

    Yes I understand the theory.

    No I do not want.

  6. Updraft102

    So just pop out the Micro SD card containing all your personal stuff and...

    Ohh, right.

  7. mark l 2 Silver badge

    If someone has physical access to my device then I assume that someone with sufficient technical knowledge can gain access to the contents. The only thing I have on there that would be of concern to someone tampering is my banking app.

    I am more concerned with manufacturers fixing remote exploits that affect millions of handsets than something that will only affect a small number of devices.

    But then again I usually buy low-end phones that cost about 100 quid, so if the screen breaks it's probably cheaper to just go and buy a new phone.

    1. Trilkhai

      Or repair it yourself

      I usually buy low-end or used phones that cost a little over half that much, but I fix/replace any bad parts myself if at all possible, since most of the repairs are pretty simple. My logic is that if hardware is damaged enough that it needs a professional to work on it, then whatever problem I'm noticing is likely just the tip of the clusterfuck, and in that case I just sell it as-is for parts.

  8. Jonathan 27

    I really disagree with the conclusions here. Why is it the manufacturer's duty to guard against the possible evils of third-party hardware? It's the customer's choice to go with cheap knock-off parts, and guarding against them is explicitly anti-consumer; plus it costs the manufacturer more money for the additional components.

    For the vast majority of customers this would be a negative.

    1. Kanhef

      Most shops don't have the ability to fabricate or program components like this; I'd worry about problems starting much higher in the supply chain. I can see a (probably Chinese) component manufacturer being paid to include ad-injection code. Not terribly different in principle from the bloatware cruft that PCs come preloaded with so often, but much harder to get rid of.

      1. Charles 9

        If that were true, why isn't it happening already at the point of manufacture? Perfect and unavoidable point for hardware pwnage.

        1. Doctor Syntax Silver badge

          "If that were true, why isn't it happening already at the point of manufacture?"

          It would depend on the brand. An expensive brand has a reputation to protect and could be destroyed if it leaked out that they did this. However, a component manufacturer selling to repair shops is unknown to the public, doesn't have that reputation to protect and could do this without repercussions - just burn the brand and start again in the event of real trouble.

          1. Ken Hagan Gold badge

            Those expensive brands don't make all their own components, so they would be neither liable nor aware if they were fed dodgy components from somewhere. So, um, where do the big brands do all their manufacturing and component supply these days?

  9. Anonymous Coward
    Stop

    Error 53

    Perhaps Apple were actually doing the right thing here?

    1. bazza Silver badge

      Re: Error 53

      I think partly yes, and then again no.

      It's possible, so guarding against it is a good idea.

      On the other hand, the cost/reward ratio for someone doing this isn't that favourable. You'd have to do some serious bank account drainage to make it worthwhile I suspect. And if it became a common thing people would simply stop using the dodgy repair guys, lesson learned.

      I think Apple's reasons were more related to revenue "protection".

      1. Anonymous Coward
        Anonymous Coward

        Re: Error 53

        Why do people always assume Apple is doing this for revenue, as if repairing iPhones is a huge business for them? Considering the ridiculous rents where they locate their stores, the cost to have Apple replace your screen is higher than at the mall kiosk, but given the costs involved and the fact that you get genuine parts with a full warranty, they can't be making any more money at it than the mall kiosk guys.

        Apple has taken the blame for dodgy third-party parts before, so they need to protect themselves. Now maybe an "error 53" wasn't the right way, and instead the phone should raise a stink at you when it boots and require you to hit 'OK' to acknowledge the use of non-Apple-approved parts that may compromise functionality and aren't covered under your warranty, but letting people fit low-quality parts that weren't designed for the phone is undesirable for Apple and its customers.

        1. Trilkhai

          Re: Error 53

          Even if the income from repairing a phone is negligible, it gives Apple the chance to sell other stuff to the person while they're in their store, particularly things that would relate to whatever had been wrong with the phone.

          Also, it could be a pretty decent generator of sales of new iPhones. Apple stores aren't easily accessible everywhere (do they offer mail-in repairs?), so if the option of taking an iPhone to a third-party repair shop wasn't available, the user would donate or sell a broken one as-is and buy a replacement. The same outcome would be likely to result for people (especially non-technical types who got the news third-hand) who got the impression that all repairs lead to the iPhone being unable to tolerate upgrades.

        2. bazza Silver badge

          Re: Error 53

          I sometimes wonder if people ever stop and think about why phone manufacturers like Apple are fond of sleek, smooth materials like glass, used in places where glass is not required.

          Looks nice? Sure. Breaks easily? Fairly easily. Encourages you to buy a new one when the back of your old one is trashed? Yes.

          They're certainly not made for durability, which plastic is actually much better at.

          Not that durability requires plastic. When Apple had the opportunity to move over to sapphire glass, which is nigh on indestructible, they decided not to. Part of that decision might have been the motivation to not make a phone that really would last forever.

          1. Anonymous Coward
            Anonymous Coward

            Re: Error 53

            Where do they use glass where glass doesn't belong on the last few years' worth of models? You can't very well use plastic for the display, it scratches WAY too easily. Sapphire would have made the display scratch-proof (unless you carry loose diamonds in your pocket) but wouldn't have made it shatterproof. Corning even claimed a sapphire screen would be MORE prone to shattering than Gorilla Glass, but they're hardly impartial. At any rate, Apple didn't "decide" not to do the sapphire screens; the company they were working with to produce the screens was unable to produce them in sufficient numbers at sufficient quality, or so it was reported.

            Apple lost a few hundred million dollars on the whole fiasco; they obviously wanted the sapphire screens as a competitive advantage - that's why the exclusive relationship with GT, so competitors like Samsung would have to find their own supply. If they could make the 'holy grail' shatterproof and scratchproof screen they'd obviously do it - what a competitive advantage that would be! The loss of repair revenue and "unforced upgrades" would be chickenfeed compared to the Android users who would come over to Apple to get a phone that would never break when dropped.

            The glass backs are reportedly making a comeback with the iPhone 8, because a metal back doesn't work well with the wireless charging it will reportedly have. I guess the conspiracy theorists will claim it's because Apple wants more phones breaking... probably in many cases the same people who have been whining about Apple not having wireless charging!

  10. Anonymous Coward
    Anonymous Coward

    Not really, you can keyboard and mouse intercept a MAC just fine....

    1. Trilkhai

      Mac, not MAC

      The computer's called a "Mac" — a "MAC" is the unique network address that a device has to identify it on a network.

      1. Anonymous Coward
        Anonymous Coward

        Re: Mac, not MAC

        I think that was what was meant: use a rogue USB device to sniff out a system. Cloning a MAC allows for an impersonation attack.

  11. Anonymous Coward
    Anonymous Coward

    Do I understand correctly

    That the research illustrates that individual hardware components can subvert the security model of the phone as a whole and access not just already existing data, which may or may not be there depending on whether the customer took the precaution to wipe the phone before handing control of it to a third party, but also allows the rogue component to produce and manipulate new data, such as recording video and audio, logging user inputs, and tampering with arbitrary phone functions?

    Or is it my fault for having read the article?

  12. uncommon_sense

    >not doing enough<

    Those who have had an iPhone error 53 might think otherwise!

  13. patrickstar

    I totally fail to envision a scenario - any scenario at all - where the HARDWARE ITSELF wouldn't be considered trusted...

    1. Charles 9

      BadUSB? SMM pwnage?

      1. patrickstar

        My point is that you essentially HAVE to consider the hardware trusted. If it's compromised, game over.

        If an attacker can replace basic hardware components they have already won.

        1. Charles 9

          WHY do you HAVE to consider the hardware trusted? What prevents you from considering it UNtrusted? What about things like Protected Hardware Paths that require hardware authentication?

          1. nijam Silver badge

            > ... Protected Hardware Paths ...

            Oh, that bit's trustworthy, is it?

    2. Ken Hagan Gold badge

      If I may just butt in on your exchange with Charles 9, I think the issue is what you understand by the phrase "the hardware itself". The difficulty is in the first word: "the".

      Some hardware needs to be trusted. To my knowledge, no-one has found a way of building a trusted platform on top of an untrusted CPU. At some point, the data has to be processed. Building a transparent hardware encryption of memory is conceivable, but I don't know of anyone who has done it. I imagine the cost (in performance) is a worry and I imagine that replacing "needs to trust memory" with "needs to trust the memory controller" isn't reckoned to be worth the effort. You can, however, build a trusted data volume on an untrusted drive, and this is now commonplace.

      Once you get to "hardware that you plug in", like USB sticks and eSATA drives, there is an expectation that "the hardware" should not blindly trust "the peripheral", and some bus architectures have been criticised (well, actually, more like written off as "do not use, ever") on this site and elsewhere for allowing precisely that.

      With that context, I'd say it makes a big difference whether the hardware is outside or inside "the box" and that test should be interpreted as "end-user serviceable" rather than taken literally. So the SD card counts as "outside" even if you have to take the case off and remove the battery in order to get to it. The screen, however, is definitely "inside" for a phone or laptop, but would be equally definitely "outside" if it is a desktop machine with graphics card and a cable socket.

      There is no shame in building systems that trust the hardware inside the box. There is plenty of shame in trusting hardware outside the box. Vendors should probably design their boxes so that you just need fingernails to access the outside parts but you need a screw-driver (possibly one of those stupid ones that no normal person has) to access the inside parts. Then everything is clear.
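      The "trusted data volume on an untrusted drive" trick mentioned above boils down to keeping the key on the host and authenticating whatever the drive hands back. A toy stdlib-only sketch - the SHA-256 XOR keystream here is a stand-in, not real disk crypto:

```python
# Toy "trusted volume on an untrusted drive": the host keeps the key,
# the drive stores only sealed blobs, and tampering is caught on read.
# The hash-based XOR keystream is illustrative only, NOT real crypto.
import hashlib
import hmac
import os

def _keystream(key, nonce, n):
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def seal(key, data):
    """Produce the blob the untrusted drive stores: nonce + ciphertext + tag."""
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(data, _keystream(key, nonce, len(data))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def unseal(key, blob):
    """Verify the tag before decrypting; a tampered blob raises."""
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    good = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(good, tag):
        raise ValueError("drive tampered with the volume")
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))

key = os.urandom(32)
blob = seal(key, b"secret contact list")
assert unseal(key, blob) == b"secret contact list"
```

      The drive never sees the key, so the worst it can do is destroy data, not forge or read it - which is why the drive can stay "outside the box" in the sense above.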

      1. patrickstar

        Yes - good clarification.

        I think the general cutoff between untrusted/trusted should be something along the lines of "if the screen is locked / user logged out, can this reasonably be used to bypass that?"

        So USB sticks would be untrusted. The motherboard would be trusted even though you could theoretically hook up a logic analyzer and signal generator to it. Memory would be trusted as long as it remains in the box (you'd expect someone to be able to remove the memory and read it out so you'd scramble the contents, but not expect it to be under attacker control as long as it remains in place).

        Other considerations might apply if you have an advanced threat model, but then the answer isn't to attempt to build a box where nothing trusts anything else or even itself, but rather to prevent someone from getting in the box in the first place (tamper detection and/or filling the entire thing with epoxy and/or applying physical security like locks and safes around it).

      2. Charles 9

        "Some hardware needs to be trusted. To my knowledge, no-one has found a way of building a trusted plaform on top of an untrusted CPU. "

        But that raises a scary prospect. Given (1) that ARM CPU designs can be tinkered with at the licensee's discretion (which is how these SoCs come into being), and (2) that some State agencies are loony enough to want control at the hardware level, including hidden stuff in the CPU, doesn't this raise some serious DTA prospects?

        1. patrickstar

          This is, of course, not a new concept or fear.

          In the case of a pure CPU backdoor at the mask level, it would be pretty easy to insert a backdoor that for example would allow a local attacker full kernel compromise. For example "if certain conditions are true, then bypass all page protection checks". This would be very desirable for a TLA looking to compromise phones - then all they would need would be a single clientside exploit in any app (of which there are plenty), instead of the usual chain of clientside exploit -> sandbox escape / local privilege escalation.

          The same thing could be done with desktop systems of course, but somehow I imagine/hope it's more difficult to sneak a backdoor in at Intel than at some obscure SoC vendor...
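          The "bypass all page protection checks" backdoor described above can be caricatured in a few lines - the trigger value, the register, and the check are pure invention for illustration:

```python
# Caricature of a mask-level "if certain conditions are true, bypass
# all page protection checks" backdoor. The trigger value, register,
# and flag layout are invented purely for illustration.
MAGIC = 0xDEADBEEF  # hypothetical value an exploit plants to arm the backdoor
WRITABLE = 0x2      # toy page-table permission flag

def write_allowed(page_flags, backdoor_reg=0):
    if backdoor_reg == MAGIC:           # the silicon-level escape hatch
        return True                     # all protection checks skipped
    return bool(page_flags & WRITABLE)  # normal permission check

# A read-only page refuses a write under normal operation...
assert not write_allowed(page_flags=0x1)
# ...but any unprivileged code that can set the trigger writes anywhere
assert write_allowed(page_flags=0x1, backdoor_reg=MAGIC)
```

          The point of the sketch is how little extra logic is needed: one comparison buried in the permission path turns any clientside bug into a full kernel compromise.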

          1. Charles 9

            Intel's an American company. Most of the SoC makers are based in China. BOTH are known to be interested in such a thing, and doing it at the manufacturer level would be a win-win for them: ubiquitous so hard to avoid, relatively inexpensive, plus plenty of room for plausible deniability.

      3. Hugh McIntyre

        Re: memory encryption

        Re: "Building a transparent hardware encryption of memory is conceivable, but I don't know of anyone who has done it."

        AMD EPYC (Zen core) CPUs have hardware memory encryption, so someone has done it:

        "Secure Memory Encryption (SME) encrypts system memory. Secure Encrypted Virtualization (SEV) isolates the hypervisor and guest VMs to prevent access to data in shared guest data areas."

        More details are under http://www.amd.com/system/files/2017-06/Trusting-in-the-CPU.pdf. Some OS/hypervisor enablement is required, but no change needed for application software.

    3. Joe Gurman

      Come again?

      Why can't you envision that? It's exactly what the article was about. Given the vast supply of used parts, down to the chip level, in China and other countries, what is to stop criminal gangs or state actors from posing as low-priced sources of replacement parts?

      1. patrickstar

        Re: Come again?

        Again - my point is that you HAVE to consider the hardware trusted, not that someone can't actually compromise the hardware with physical access.

        If someone has access to your phone to the point where they can change the screen, it's game over.

        If you want to prevent that, you don't do it by putting some DRMish stuff in the screen to authenticate it (a la Apple and the fingerprint sensor). This is completely meaningless even if we assume there's no way to stick an evil screen in place considering that they have unrestricted access to literally everything.

        To prevent this, you simply don't allow untrusted parties to have that sort of access to the phone in the first place.

        It would be relevant if this was about connecting an external screen to a desktop computer, or perhaps some sort of Lego phone where replacing the screen does not involve taking it apart.

        It's not relevant here.

        1. Charles 9

          Re: Come again?

          "If you want to prevent that, you don't do it by putting some DRMish stuff in the screen to authenticate it (a la Apple and the fingerprint sensor). This is completely meaningless even if we assume there's no way to stick an evil screen in place considering that they have unrestricted access to literally everything."

          What about 4K BluRay Players and modern gaming consoles that use protected hardware paths (to prevent pirating)? Doesn't that work by using black-boxed keys inside each component so that every link in the chain is encrypted and authenticated to prevent tampering (replace even one component and you break the chain since you change that part's key which, being black-boxed, can't be extracted or copied)? Don't some Android devices use the same technique to prevent rooting and the use of custom builds? And don't these chains INCLUDE the CPU in having encryption keys (cryptoprocessors spring to mind)?
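          The challenge-response idea behind such authenticated paths can be sketched with generic HMAC - this is not any console's or handset's actual scheme, just the shape of it:

```python
# Toy challenge-response chain of trust. The Component class and the
# protocol are illustrative only; real HDCP/console schemes are far
# more involved (key hierarchies, revocation, link encryption, etc).
import hashlib
import hmac
import os

class Component:
    """A part with a factory-burned ('black-boxed') key it never reveals."""
    def __init__(self, key):
        self._key = key  # stays inside the chip; only responses leave
    def respond(self, challenge):
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

def authenticate(part, paired_key):
    """Host side: only a part holding the paired key answers correctly,
    so swapping in a clone with a different key breaks the chain."""
    challenge = os.urandom(16)
    expected = hmac.new(paired_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(part.respond(challenge), expected)

factory_key = os.urandom(32)
genuine = Component(factory_key)
knockoff = Component(os.urandom(32))  # doesn't know the factory key
assert authenticate(genuine, factory_key)
assert not authenticate(knockoff, factory_key)
```

          Because a fresh random challenge is used each time, the knockoff can't replay an old response; it would need the key itself, which never leaves the genuine part.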

          1. patrickstar

            Re: Come again?

            You don't need to replace any hardware in a phone to pwn it. You might simply add a bug - this has been done since the early days of telephony.

            Or you could replace the entire contents of the phone with something that just shows you a fake login screen and then errors out after entering the password/PIN code, sending it to the guy in possession of the real phone, if that's what you're after.

            Etc.

            Also, the threat models are radically different, but that's probably another discussion.

            1. Charles 9

              Re: Come again?

              "You don't need to replace any hardware in a phone to pwn it. You might simply add a bug - this has been done since the early days of telephony."

              But in a trusted hardware chain, that breaks the chain, resulting in a brick. And in the protected hardware path approach, even the wires are sending all-encrypted data. And the devices are designed to close up on a brick due to the encrypted links, meaning you can't take advantage of the brick to extract data. And I don't think the threat models are THAT different given that BOTH this and the movie/gaming companies are trying to prevent exfiltration of data that can in turn be used to exfiltrate other data.

              "Or you could replace the entire contents of the phone with something that just shows you a fake login screen and then errors out after entering the password/PIN code, sending it to the guy in possession of the real phone, if that's what you're after."

              If you're gonna go THAT far, it would be more trivial to switch out the entire phone with a replica.

              1. patrickstar

                Re: Come again?

                You simply need to add a small circuit board with a microphone (or other listening device - radio/EM fields/position/etc) on it. This is not stopped in any way whatsoever by any chain of trust.

                Rather it's stopped by tamper protection and physical security, both of which are, by definition, not relevant if you just handed your phone to someone and expect him to switch out the screen.

                There's no way to compare keeping secrets on a phone unreadable to preventing home users from pirating BluRay discs - there are simply no commonalities between the scenarios.

                Regarding switching out the entire phone - sure, but it might be a tad suspicious if you hand in your old worn thing (probably dinged up from whatever broke the screen as well) and get back a brand new phone. Just sayin'.

                1. Charles 9

                  Re: Come again?

                  "You simply need to add a small circuit board with a microphone (or other listening device - radio/EM fields/position/etc) on it. This is not stopped in any way whatsoever by any chain of trust."

                  Yes it is, as it still doesn't get you into the contents, which probably did not come in by speech. We're not talking bugging, we're talking pwning. And pwning is also a piracy path, which is why BluRay players and console makers are interested (as pwning the Wii allowed backups to be made using its own drive, which by design MUST be able to read them). And before you say "bug the touchscreen," the touchscreen itself would have an encrypted data path, just like ATM PIN pads.

                  "Regarding switching out the entire phone - sure, but it might be a tad suspicious if you hand in your old worn thing (probably dinged up from whatever broke the screen as well) and get back a brand new phone. Just sayin'."

                  So you swap out the used phone for one in similar condition. Shouldn't be that hard as long as most of the hardware is intact. If the phone's damaged enough to be unique-looking, then OK, you'll need another tactic; thus the repair shop front.

                  1. patrickstar

                    Re: Come again?

                    You can presumably sniff things like EMI, or otherwise detect hand movements. Lots of possibilities here, with interesting precedent in what's been done against PIN pads.

                    Plus your phone has other secrets to protect than just its contents. Like everything being said in the same room as the phone, even when it's off, if it's bugged.

                    Regarding PIN pads, the VISA EPP standard is not meant to withstand a day or so of unsupervised access, which is what handing your phone in for repair certainly does in a lot of cases.

                    Or protect against a rogue service technician at all, at least in more ways than having the keys split across multiple persons (which doesn't do you any good if the thing comes back from service trojaned to the hilt).

                    The scenario for DVD/BluRay/etc is to protect the actual digital data, to prevent an exact (high-definition, high-quality) copy, not to keep the contents per se secret. Their whole purpose is to do a very lousy job at that so you can actually watch the movie.

                    Same with games - you are SUPPOSED to be able to play the game.

                    The scenario of a phone is to protect many different secrets from getting read out in any way, or intercepted in the first place.

                    Plus the value of making a copy of a single BluRay disc is substantially lower than the potential value of getting the contents, or simply bugging the environment, of a single phone.

                    If you hand something in for service and don't trust the service techs, consider it pwnd. This is almost a basic law of computing.

                    1. Charles 9

                      Re: Come again?

                      "You can presumably sniff things like EMI, or otherwise detect hand movements. Lots of possibilities here, with interesting precedent in what's been done against PIN pads."

                      You'd still need context, though. Harder to get without access to the innards.

                      "Plus your phone has other secrets to protect than just its' contents. Like everything being said in the same room as the phone, even if it's off if it's bugged."

                      Still need a way to EXfiltrate those conversations, and if the radio chips are also protected, then you'll need a total package. Might as well use a specialized bug in that instance.

                      "Regarding PIN pads, the VISA EPP standard is not meant to withstand a day or so of unsupervised access, which is what handing your phone in for repair certainly does in a lot of cases."

                      ATMs have to sit by their lonesome for days at a time. Who within a location actually pays attention to the PIN pads during normal operation? As for techs, that usually points to inside jobs, meaning they have access to key chips. Rogue techs could use side channels like hidden cameras, but again that's close to insider status to get them clandestinely in the machines and outside this context.

                      "The scenario for DVD/BluRay/etc is to protect the actual digital data, to prevent an exact (high-definition high-quality) copy, not keep the contents per se seciret."

                      The reason being they have a perennial problem: the enemy only has to be lucky ONCE. Then sharing instantly nulls their economic advantage, and the human condition means people WILL cheat. That's why they've been working on this VERY hard for the last 20-30 years, coming up now with this chain-of-trust system for the 4K systems (as well as the consoles, which double as 4K players) based on what the phone makers have been doing (and some phones STILL haven't been rooted or custom-ROM'd at this point; ask xda). Similarly for pwning a device. ONE slip and it's Game Over. They have to hold that off for as long as they can.

                      "If you hand something in for service and don't trust the service techs, consider it pwnd. This is almost a basic law of computing."

                      But not COMPLETELY. Otherwise we'd have a formal proof by now, a la Turing's proof of the Halting Problem, since there ARE real scenarios where DTA must be assumed, so there IS a practical angle.

                      1. patrickstar

                        Re: Come again?

                        "You'd still need context, though. Harder to get without access to the innards."

                        Context here would be a password/PIN entry screen, or what's being typed in general. If you say "randomize the positions of things on the PIN entry screen", then you have suddenly slowed down the user and thus made shoulder-surfing/secret recording of the entry a lot easier. Tradeoffs and all...
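                        The randomized-layout idea is simple enough to sketch (hypothetical helper, not any real PIN-pad API): shuffle the digits each time the entry screen appears, so touch coordinates alone no longer reveal the PIN.

                        ```python
                        import secrets

                        def randomized_pin_layout():
                            """Return the digits 0-9 in a random order, as a
                            shuffled on-screen keypad would present them.
                            An attacker who only sees touch *positions*
                            (e.g. via a malicious screen) learns nothing,
                            but a shoulder-surfer watching the screen wins."""
                            digits = list("0123456789")
                            # Use a CSPRNG so the layout itself isn't predictable.
                            secrets.SystemRandom().shuffle(digits)
                            return digits
                        ```

                        The tradeoff mentioned above falls straight out of this: the user now has to hunt for each digit, entry slows down, and visual observation of the screen gets easier.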

                        And I don't sit around designing exotic iPhone bugs for a living, believe it or not. I'm sure that the people who actually do can come up with a myriad of other ways to haxxor you, given a day's unsupervised access to the phone, that don't involve a dodgy screen.

                        "Still need a way to EXfiltrate those conversations, and if the radio chips are also protected, then you'll need a total package. Might as well use a specialized bug in that instance."

                        The problems you encounter when making a small bug are the power supply and antenna. In a phone you have both - a miniature transmitter is not only readily available commercially but also trivial to build from parts.

                        "ATMs have to sit by their lonesome for days at a time. Who within a location actually pays attention to the PIN pads during normal operation?"

                        I can't find a public document with the whole standard (thank Jesus/Allah/Buddha/Kek I haven't had to deal with PCI standards in a good while), but the requirements are in the range of withstanding tampering for 10 hours or a budget of a couple tens of thousands USD. Solitary ATMs presumably have additional layers (as opposed to payment terminals or such) - the whole shell of the ATM itself, associated alarms, CCTV, etc.

                        "As for techs, that usually points to inside jobs, meaning they have access to key chips. Rogue techs could use side channels like hidden cameras, but again that's close to insider status to get them clandestinely in the machines and outside this context."

                        The EPP standards basically say that opening the thing (eg for service) should nuke the keys. They say very little about what's stopping someone from grabbing the keys as they are re-entered, because this is really difficult to do.
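                        The "opening it nukes the keys" behaviour is worth spelling out, since it's the same tamper-response pattern the epoxy-brick phone below would need. A toy model (class and method names are invented for illustration, not from any EPP spec):

                        ```python
                        class TamperResponsivePinPad:
                            """Toy model of an EPP-style key store: opening the
                            case (the tamper event) zeroizes the key material
                            before anything else can read it."""

                            def __init__(self, key: bytes):
                                self._key = bytearray(key)  # mutable, so it can be wiped in place
                                self.tampered = False

                            def on_case_opened(self):
                                # Tamper switch fired: overwrite the key bytes.
                                for i in range(len(self._key)):
                                    self._key[i] = 0
                                self.tampered = True

                            def key(self) -> bytes:
                                if self.tampered:
                                    raise RuntimeError("keys were nuked by tamper response")
                                return bytes(self._key)
                        ```

                        After the tamper event there is simply nothing left to steal, which is why the attack surface moves to the key re-entry ceremony instead.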

                        "That's why they've been working on this VERY hard for the last 20-30 years, coming up now with this chain of trust system for the 4K systems (as well as the consoles, which double as 4K players) based on what the phone makers have been doing"

                        Budget for copying a single movie: Small (price of movie for a home user or total sales for a commercial piracy operation)

                        Budget for pwning a single phone: Large (potentially millions)

                        It's even worse than that - stopping a phone from leaking data to a physical attacker would be like stopping someone from recording a movie by pointing a camera at the screen.

                        Plus, perhaps most importantly, 4K movies get pirated all the time - so either it's broken already (just not public), or there's no incentive to break it because they get out another way. Admittedly they're not as frequent on the torrent sites, it seems (I rarely watch movies and don't even own a 4K display so I don't keep track of the particulars), but this might just be due to lack of demand for the higher quality.

                        "(and some phone STILL haven't been rooted or custom-ROM'd at this point; ask xda)."

                        All of them can be and regularly are rooted... with a couple of million dollars worth of gear (scanning electron microscope, FIB workstation, high-freq logic analyzers, etc), knowledge and time/budget.

                        It's just meant to be unfeasible for the end user and lower-range attackers (and slow down higher-range attackers so they can't do it en masse).

                        If screens turned out to be a viable vector of pwnership and DRMish protection applied to them, that sort of budget would immediately start going towards breaking it.

                        Then the sort of attacker who would pwn your phone with a fake screen would ... pwn your phone with a more expensive fake screen.

                        So even if we don't consider all the other very viable (and far more likely) attacks that apply if you give someone a day of unmonitored fiddling with your phone, the most you have accomplished is shifting the attackers' budget bracket slightly upwards. I should remind you that a fake phone-pwning screen wouldn't exactly be cheap on the grey forensics/spook market in the first place - five or six digits most likely.

  14. Joe Gurman

    And people complain about Apple discouraging third-party repair shops

    (Not that it does any good.) A good way to jack up repair prices, but at the same time, also a way to insure the provenance of the parts.

    1. nijam Silver badge

      Re: And people complain about Apple discouraging third-party repair shops

      > ...a way to insure the provenance of the parts.

      (a) you mean "ensure" rather than "insure", I believe.

      (b) I deduce you're an optimist.

    2. patrickstar

      Re: And people complain about Apple discouraging third-party repair shops

      And really, do people actually want unrepairable phones?

      Today's smartphones can cost more than a decent desktop or even laptop computer. Do you really want them to be impossible to repair reasonably - or only repairable under the conditions and prices dictated by the manufacturer (if they're even interested in doing it at all)? Just to stop some attack scenario with dodgy parts that you'd expect in case of a nation-state level attacker and/or high-level industrial espionage, not someone out to empty random bank accounts or get ad clicks (taps?).

      Just look at the uproar a number of years ago when Apple started using Pentalobe screws to discourage fiddling with the phone internals. And that's something that's trivial to defeat even on a shoe-string budget...

      There could definitely be a market for phones that are essentially epoxy bricks riddled with tamper detection gizmos and severely paranoid hardware (TrustNo1, not even the screen), if there isn't already, but I doubt even the vast majority of security-conscious users would appreciate the tradeoff.

      Such a phone would presumably be subject to similar security testing/certifications to other tamper-protected devices (PIN pads for card transactions, for example... or good old-fashioned locks and safes) where you have a clear threat model - a certain amount of time and/or money needed to break it. Even though there certainly is some overlap in the technology employed, this is still very different from some ad-hoc DRM scheme on random components a bunch of leet haxorz at the manufacturer came up with.

  15. Kay Burley ate my hamster
    Big Brother

    This isn't news. It's a sales pitch.

    They'll be selling this 'tech' just as soon as the patents come through.
