Apple's on-device gen AI for the iPhone should surprise no-one. The way it does it might

Apple’s efforts to add generative AI to its iDevices should surprise no one, but Cupertino's existing uses of the tech, and the constraints of mobile hardware, suggest it won’t be a big feature of iOS in the near future. Apple has not joined the recent wave of generative AI boosterism, even generally avoiding the terms "AI …

  1. elsergiovolador Silver badge


    The AI features are so great.

    Imagine burglar goes on the usual sweep. Now they can take a selfie at someone's house, then use AI feature to remove themselves from the selfie.

    Then when Police catches them, they can show them the picture "Look guv, I wasn't there!"

    Nah I am hallucinating. Police won't be catching burglars, so there is no need for the AI nonsense.

    1. doublelayer Silver badge

      Re: Alibi

      Is this supposed to be a joke? I'm really not getting it.

      Imagine burglar goes on the usual sweep. Now they can take a selfie at someone's house, then use AI feature to remove themselves from the selfie.

      Then when Police catches them, they can show them the picture "Look guv, I wasn't there!"

      Look, you have a picture of the crime scene at the time of the crime, on my phone. I feel like there's something in your proposal that I'm missing. I'm going to need more explanation.

      1. Robin

        Re: Alibi

        I feel like there's something in your proposal that I'm missing. I'm going to need more explanation.

        Au contraire, I'd suggest that the proposal as it stands mentions "AI" enough to attract plenty of investors.

      2. elsergiovolador Silver badge

        Re: Alibi

        The burglars typically aren't very smart people, but neither are the police...

        1. Handlebars

          Re: Alibi

          Years ago there was a news report about some guys who stole payment card details from a remote database and then used them to order a load of things for delivery to their own homes. A cop was quoted as saying "most criminals I meet couldn't even switch a computer on, but they're still smarter than these guys"

      3. mevets

        Re: Alibi

        Ever feel a sort of whooshing over your head?

        1. doublelayer Silver badge

          Re: Alibi

          I considered it, but the sarcasm didn't make any more sense than the literal message did. None of the ways the message could be reversed made much sense either.

          1. mevets

            Re: Alibi

            Dissection, be it of a joke or a frog, seldom yields any insight; and in both cases, there is no hope for the patient's survival.

    2. katrinab Silver badge

      Re: Alibi

      Not in the photo doesn’t mean not there.

      Having the photo means you probably were there, and the most likely explanation would be that you were out of frame, perhaps on the other side of the camera.

  2. Ace2 Silver badge

    I use Bing all the time, because I loathe Google and wish they would collapse into a pile of smoking radioactive garbage.

    But the Bing AI stuff is getting stoooopid. They’ve started making some of the links on the page that look like search results actually call up the chat interface. It’s getting progressively harder to avoid.

    1. aerogems Silver badge

      You're aware of DuckDuckGo right? Basically Google, but it strips out all the privacy invasion garbage. Or at least most of it.

      1. Anonymous Coward
        Anonymous Coward

        It's not basically Google at all. Try looking up your favourite Japanese "actress" in DuckDuckGo and you will be sorely disappointed.

    2. W.S.Gosset

      Duckduckgo uses the Bing search corpus, if you'd like a different frontend to the same rankings. I can confirm it's not popping up Chat panels as of this afternoon (at least, not on Firefox).

      Caveat: he threw his business model in the bin last year, and is now Censoring results.

      1. Zippy´s Sausage Factory

        I'm looking for an alternative but now that DDG is mainstream, what is there?

        1. Shalghar

          Maybe metager ?

          When it comes to tech searches, DDG and metager have proven to be quite good. Despite the financial interest openly communicated by DDG, the results have been OK so far.

        2. andrewj

          I've been trying qwant, but it's early days.

        3. Handy Plough

          Kagi is awesome. Yes, you have to pay, but for the $9.99 per month you get no tracking, no ads and accurate results. It's an extremely fair trade, and it works really well.

        4. IGotOut Silver badge

          Startpage every time.

          Makes DDG look like a data hoarder and privacy nightmare.

          Also, I find DDG's results too US-centric. Startpage is European-based (but not headquartered there).

          1. Rich 2 Silver badge

            I used to use Startpage, but then it transpired they got sold or something and the new owners were definitely not too interested in your privacy.

            Try Googling for it. Oh the irony :-)

      2. cyberdemon Silver badge

        As I understand it, Brave search is like DDG but with Google as a backend instead of Bing.

        Google itself is becoming shit though, even without the privacy-invasive crap

        That and Brave still has a shitty gen-AI feature

  3. aerogems Silver badge


    I'm fine with more of the background type of stuff. Using NPUs to make Siri more independent of the data center pipeline it required in the early days would be nice. The more it can handle on-device the better, which they could even market as a privacy feature. All those random questions you ask don't necessarily get logged by Cupertino any longer. Personally, I'm not interested in a bunch of parlor tricks. Until someone can demonstrate an actual "killer app" that has real-world practical applications, I'm just not that interested.

    1. Michael Wojcik Silver badge

      Re: Meh

      Personally, I don't even like the "background stuff". When I want a computer to do something, I'll tell it to do something.

  4. DS999 Silver badge

    While I'm very skeptical of the AI hype compared to the reality

    I am interested to see what Apple does as far as running an LLM using NAND instead of DRAM. Flash keeps getting denser, and if the LLM could do something useful, devoting 256 GB of flash to an on-device LLM would become reasonable (at least providing a reason for people to upgrade to a 512 GB or larger iPhone to gain the ability to install it).

    I don't know anywhere near enough about the field to even look at Apple's paper and guess how well it could perform, but even if it couldn't handle complex queries in realtime there would be plenty of use for querying it and having it work on your query in the background while you're doing something else. Sort of like having a personal assistant across the room and giving them tasks related to your work that take a few minutes to complete and they feed you the results.

    The main goal of course is that by keeping it on device you avoid any worries about your corporate or personal info going into someone's cloud. So even if it wasn't quite as fast as connecting to a datacenter full of H100s, which can run your query at lightning speed with response time limited more by network latency than compute time, there would still be a market. And it would fit in with Apple's push that what you do on your iPhone doesn't have to leave your iPhone (other than backups, which can be done with an encryption key solely under your control).
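For what it's worth, the core trick in flash-resident inference can be sketched in a few lines (a toy illustration, not Apple's code; `numpy.memmap` stands in for whatever flash-mapping mechanism is actually used, and the sizes and names are invented):

```python
import numpy as np

# Toy sketch of weight streaming: keep model weights in flash and map
# them into the address space read-only, so only the slices actually
# used get paged into DRAM. Sizes here are tiny for illustration.
N_LAYERS, D = 4, 8

def load_layer(weights_path, layer_idx):
    """Map one layer's weight matrix from flash; reads only, no writes."""
    m = np.memmap(weights_path, dtype=np.float16, mode="r",
                  shape=(N_LAYERS, D, D))
    return m[layer_idx]          # paged in lazily by the OS

def forward(weights_path, x):
    # Activations (small) live in DRAM; weights (large) stay on flash.
    for i in range(N_LAYERS):
        w = np.asarray(load_layer(weights_path, i), dtype=np.float32)
        x = np.tanh(w @ x)
    return x
```

Note the weights are opened read-only: inference reads them; it never rewrites them.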

    1. Richard 12 Silver badge

      Re: While I'm very skeptical of the AI hype compared to the reality

      Flash is way, way too slow and wears out after a few thousand to maybe a million rewrites. Capacity has increased by making cells smaller and storing more bits per cell (TLC, QLC) - both of which trade endurance for capacity.

      AI inferencing works by updating the values of a few million to billion variables a few billion times for each input - that's a lot of writes.

      An LLM running in flash would take a really long time to execute, and physically destroy the hardware in a few hundred runs.

      1. gnasher729 Silver badge

        Re: While I'm very skeptical of the AI hype compared to the reality

        You better tell Apple that, in case they didn’t think of that. But I didn’t realise that you needed to _write_ multiple gigabytes for every use.

      2. DS999 Silver badge

        Re: While I'm very skeptical of the AI hype compared to the reality

        Well, maybe you should read Apple's paper about it and figure out what they are doing differently, assuming your understanding of how LLMs work is correct (writing millions or billions of variables a few billion times seems highly unlikely to me; that would take months or years to execute even in RAM).

    2. doublelayer Silver badge

      Re: While I'm very skeptical of the AI hype compared to the reality

      I'm worried about the wear that might cause the flash. Lots of people have decided to use the SSD when there isn't enough RAM, and in addition to slowing things down, it also starts using something that can't take that much writing and is required for the phone to function. It isn't going to bother Apple if using their software means phones start dying of SSD failures a bit faster, since they don't happen enough now to be much of a warranty issue and, when they do happen, most customers will just buy another iPhone to replace it. It does bother me.

      1. DS999 Silver badge

        Re: While I'm very skeptical of the AI hype compared to the reality

        Considering I've never heard of an iPhone failing due to flash wearout (or any smartphone for that matter) that may be an overblown concern. People are not writing tens or hundreds of gigabytes a day to their phone's flash like they'd need to wear it out in only a few years.
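A back-of-the-envelope check supports that (a sketch with assumed figures only; consumer TLC/QLC endurance is very roughly in the 500-3,000 P/E cycle range, wear levelling spreads writes across the device, and none of these are measured iPhone numbers):

```python
# Rough flash-endurance estimate. All figures are assumptions for the
# sake of arithmetic, not measured iPhone numbers.
capacity_gb = 256          # assumed device flash capacity
pe_cycles = 1_000          # assumed program/erase endurance per cell
writes_per_day_gb = 100    # deliberately heavy hypothetical workload

total_writable_gb = capacity_gb * pe_cycles      # ~ total writable data (TBW)
years = total_writable_gb / writes_per_day_gb / 365
print(f"~{years:.0f} years at {writes_per_day_gb} GB/day")  # ~7 years
```

Even at an implausible 100 GB written per day, the device outlives a typical phone's service life; at realistic write volumes, wear-out takes decades.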

        1. doublelayer Silver badge

          Re: While I'm very skeptical of the AI hype compared to the reality

          Whenever I've run models, which admittedly is not often, it writes to memory a lot. It loads a lot of assets into memory, runs conversions on them, and then thrashes the RAM for a while. That's fine when it's RAM it's using, but not so much if it's flash. I agree with you that people don't tend to wear that out now, but that could easily change if that's what the software is doing.

          It's not entirely new, either. Some Android phone manufacturers have started advertising massive amounts of RAM, for example 24 GB, which consists of a moderate 8 GB of real RAM, and then 16 GB of swap. There have been some reports of this causing flash-related failures, but that is much more limited use than running a model actively in it, since the swap space isn't constantly being loaded and rewritten. Just because it isn't a problem now doesn't mean it will never be one, especially if we change the reasons why it's not been a problem now.

  5. Jeff Smith

    Achieving a fully conversational voice assistant is surely the catalyst to mainstream adoption of AI. Being able to have a natural and detailed back and forth conversation with your computer to help it understand your requirements for complex tasks will be completely transformative, perhaps even more so than the adoption of the GUI.

    1. werdsmith Silver badge

      I have one that speaks French to me, for the purposes of maintaining my B2 level. It's really getting there.

      1. Anonymous Coward
        Anonymous Coward

        re. I have one that speaks French to me

        Trouble is, as we already know, the bots are very deceptive, i.e. they provide reliable information and bullshit with equal confidence, and unless you're advanced enough, you wouldn't know the difference. Obviously conversation is a different kettle of fish, but as for the factual info they provide, I've caught 'it' out a couple of times; I can't be on the lookout all the time, though, and 'it' is supposed to be a 'reliable enough' support tool...

    2. aerogems Silver badge

      I forget what company it is, AliExpress maybe... Anyway, some company in China actually has something quite a bit better than what we have pretty much everywhere else. It'll make automated calls to people to let them know a package or something is coming, and it can handle responses like, "who is this?" and respond appropriately. Sort of like how Japan has all kinds of fun little toys that never make it to the rest of the world. Like attachments for your Nintendo DS handheld console for use in museums to give more info about exhibits and whatnot, and they've had paying for things with your phone years before the likes of Apple/Google Pay came along to the rest of the world.

    3. Roland6 Silver badge

      That requires semantic analysis and inference; I don't see how an LLM is going to improve this.

    4. doublelayer Silver badge

      I think this is only the case when computers can actually do the things you're asking for. In short, the problem is a backend rather than a frontend one. People have been able to talk to their computers and ask natural language queries for some time, but it hasn't been very popular because they can only make it do a few things. People eventually run out of times when they want to ask about the weather, search for music, or have their emails read out to them.

      By bolting on an LLM, they can make the frontend more conversational and they've added one more command they can give: write something for me. However, none of the other functions of a computer will be available just because it sounds more like a person when you talk to it. I think it won't be popular for the same reason that existing voice assistants like Siri, Google Assistant, Alexa, and Cortana haven't proven massively popular. After a bit of novelty value, you realize that they can't actually do almost anything you want. I still use the one on my phone, but basically two commands "set a timer for x minutes" and "call y".
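The ceiling described here is easy to see in a toy intent dispatcher (a sketch with invented intents, not any real assistant's code): no matter how fluent the speech frontend gets, the assistant can only do what is in the fixed backend table.

```python
import re

# Toy assistant backend: however natural the frontend conversation gets,
# the set of things the assistant can *do* is just this table.
INTENTS = {
    r"set a timer for (\d+) minutes?": lambda m: f"timer set: {m.group(1)} min",
    r"call (\w+)":                     lambda m: f"calling {m.group(1)}",
}

def handle(utterance):
    for pattern, action in INTENTS.items():
        m = re.fullmatch(pattern, utterance.lower())
        if m:
            return action(m)
    return "Sorry, I can't do that."   # everything else falls off the table

print(handle("set a timer for 5 minutes"))      # timer set: 5 min
print(handle("book me a dentist appointment"))  # Sorry, I can't do that.
```

Bolting an LLM onto the frontend makes the matching more forgiving, but it adds nothing to the right-hand side of that table.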

    5. captain veg Silver badge

      Re: Being able to have a natural and detailed back and forth conversation with your computer

      Sounds dreadful, frankly.

      "I'm sorry Dave, I'm afraid I can't do that."


      1. Michael Wojcik Silver badge

        Re: Being able to have a natural and detailed back and forth conversation with your computer

        Agreed. It might make "AI" "mainstream", but personally I loathe conversational interfaces.

        See also Maneesh Agrawala's recent ACM Tech Talk, "Unpredictable Black Boxes are Terrible Interfaces". (I'd link, but the ACM links are to Zoom and Discourse, and I don't feel like spinning up a browser supported by either of those shambling horrors at the moment.)

  6. 45RPM Silver badge

    Surely this would wear out the SSD in double quick time and, since the SSD is soldered to the board, require the computer to be replaced? It doesn’t seem like the most sustainable plan - but I suppose it will be good for the shareholders.

    1. Anonymous Coward
      Anonymous Coward

      in apple context this is possibly THE plan ;)

  7. jmch Silver badge


    "we suspect Apple's next gen of phones may need more memory, or the models will need to be smaller and more targeted."

    Apple is notoriously stingy with memory on its hardware. Last time I bought a Mac (2009 iMac), it shipped with 8GB RAM, which seemed enough for a year or so, after which I had to add another 4GB (luckily this was the last model Apple shipped where users could add their own memory!!). 15 years later, the base model still ships with 8GB. Now, on the one hand it's great that MacOS and iOS aren't giant memory hogs like Windows (can't really comment about Android), but really 8GB is still bare-bones. And the markup to get a 16GB model (certainly on the new iMac, but I think also on the iPhone) is eye-watering compared to the real cost of an extra 8GB RAM. And it's the same with SSD on the iMac / flash memory on iPhones.

    1. 43300 Silver badge

      Re: Memory

      And in 2009 you could add memory to the machine later if you wanted. Not any more - all Apple devices now have soldered RAM (and it's becoming increasingly common on Windows laptops too, especially the thin and light ones).

      1. captain veg Silver badge

        Re: Memory

        > all Apple devices now have soldered RAM

        Soldered as in part of the CPU chip package.

        Good luck de-soldering that.


    2. Ace2 Silver badge

      Re: Memory

      Minor quibble: while what you say is true today, I had no issue adding my own RAM to a 2020 iMac. 2009 wasn’t ‘the last model’ you could upgrade.

    3. doublelayer Silver badge

      Re: Memory

      They don't have much choice for iPhones either. Each model only has one amount of RAM. The current models range from 4 GB (SE 2022) to 8 GB (iPhone 15 Pro/Max).

      1. 43300 Silver badge

        Re: Memory

        The bigger issue with them is the amount of flash as they have no SD card slot - if you have a lot of music, it's either going to be a case of being selective or paying through the nose for a higher capacity.

        1. Lord Elpuss Silver badge

          Re: Memory


          1. 43300 Silver badge

            Re: Memory

            Not much use if you want to have the music available in areas with a patchy phone signal (there are still a lot of those in more rural areas).

            1. Lord Elpuss Silver badge

              Re: Memory

              That's why you have onboard flash memory.

              1. 43300 Silver badge

                Re: Memory

                Err, yes, but Apple massively hikes the price for larger amounts of this (whereas a comparable-sized SD card wouldn't cost much at all).

                1. Lord Elpuss Silver badge

                  Re: Memory

                  Comparing the onboard flash to an SD card is like comparing enterprise storage to consumer. On paper they might do a similar job, but they're very different beasts.

                  Of course if all you're doing is storing music on it then the point is moot (and an SD card would do the job) but you can't expect it to cost the same per GB as SD storage.

                  PS if 'all' you want is SD storage for music, you can achieve this with an SD card > Lightning adapter. It's a little clunky, but it works fine.

                  1. 43300 Silver badge

                    Re: Memory

                    These are phones we are talking about, not anything normally used for tasks requiring fast memory. SD cards are perfectly adequate for storing music, which must be one of the main types of data stored on phones.

                    Carrying a phone in your pocket with an SD card attached via an adapter isn't likely to do the socket much good!

                    1. Lord Elpuss Silver badge

                      Re: Memory

                      If all you need is a 'phone', then an iPhone of any flavour is massive overkill. Better to go for a feature phone with an SD card slot.

                      You might not need anything requiring fast memory, but others might. iPhones these days are designed to be far more than just a phone; they're incredibly powerful pocket computers hence the fast flash. Personally I run my company through my iPhone; the highest workload I have is taking, editing and uploading 4K@60fps video, and even then I will freely admit that the specs of my iPhone exceed what I actually need. Others will have more intensive workloads, but I would suggest that most people who have an iPhone don't actually need it. Including me. I have it primarily because I want it.

  8. Dinanziame Silver badge

    Apple is not a software company

    And they are not aiming to become one. However, they feel the need to pretend.

  9. Anonymous Coward
    Anonymous Coward

    on-device gen AI for the iPhone

    to do what exactly? Other than 'because it's there'?

    1. A. Coatsworth Silver badge

      Re: on-device gen AI for the iPhone

      >>to do what exactly?

      It will implement a better sieve algorithm able to more efficiently uncouple persons with challenged mental processes from the currency they have so far accrued

    2. Roj Blake Silver badge

      Re: on-device gen AI for the iPhone

      Once it's there, they can sue other companies for having rounded AI on their phones.
