Intel offers desktop chip that can hit 6GHz if everything goes right, you can keep it cool, stars align, pigs fly

Intel delivered on its promise of a 6GHz Raptor Lake chip this week with the launch of its Core i9 13900KS. The Silicon Valley chip giant has teased the part since mid-2022, when it claimed the chip would run at 6GHz on default (stock) settings and reach 8GHz when overclocked — albeit only with the help of exotic coolants like …

  1. nautica Silver badge

    Intel could have been years ahead...

    Back in the days (8086➔80286➔80386➔...) when Intel was running as fast as they could (just as now) to continually increase the capability of their offerings, a wag was supposed to have said, "I don't understand. Just let me have an 8086 which runs at 20 GHz, and I can do anything Intel could ever conceive of."

    Well, if Intel had followed this gentleman's advice, imagine how much money (time, effort, manpower, facilities...) would have been saved by playing the same games they're playing now--as outlined in this article--simply by running a bog-standard 80286, but cooled with liquid helium at -452°F (4.2K).

    1. FeepingCreature Bronze badge

      Re: Intel could have been years ahead...

      Good question there. 8086s are comparatively tiny - how high could you clock one nowadays?

      1. Ken Hagan Gold badge

        Re: Intel could have been years ahead...

        It had a clocks-per-instruction figure rather than an IPC, so unless you can crank it to about 50GHz and your workload is 16-bit integer arithmetic fitting in 1MB, it's not as interesting as you think.
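
        A quick back-of-the-envelope check of that ~50GHz figure. The numbers below are rough assumptions for illustration (an 8086 averaging about a dozen clocks per instruction, and a modern core effectively retiring around four billion simple integer instructions per second on this sort of workload), not measurements:

          # All figures are ballpark assumptions, not benchmarks
          modern_effective_gips = 4.0          # assumed effective simple-integer throughput of one modern core, billions of instructions/s
          cycles_per_instruction_8086 = 12     # the 8086 averaged on the order of a dozen clocks per instruction

          required_clock_ghz = modern_effective_gips * cycles_per_instruction_8086
          print(f"8086 clock needed to keep up: ~{required_clock_ghz:.0f} GHz")
          # -> roughly 48GHz, and that's before 32/64-bit arithmetic, SIMD or anything beyond 1MB of memory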

        1. martinusher Silver badge

          Re: Intel could have been years ahead...

          We had 32 bit and 64 bit integers back then, honest.

          As for fitting in a 1MByte space, that's just a matter of discipline. You can do a lot of work in even a 64K (actually, more like 56K) space. You won't be able to create or display websites where 80% of the screen is taken up with videos, popovers and general crap, but then it's horses for courses.

          I've not noticed any increase in productivity from all this whirring and clanking. Maybe that's why Millennials have rediscovered things like flip phones? The purpose of a 'thing' is to communicate, add up, make coffee and so on. Not to serve endless ads and snoop constantly on the user. Space and bandwidth might be cheap but time is a limited resource.

    2. Sparkus

      Re: Intel could have been years ahead...

      NEC took their V20/V30 offerings down that road. Out of the box they were about 30% 'better' than an equivalent 8086, drop-in pin- and (generally) BIOS-compatible. In addition to the all-plastic DIP package, they were available with an 'exposed' core which was easy to cool and overclock.

      The NEC chips included their own maths processing opcodes but were also compatible with the 8087s of the day.

      That series of chips was very interesting. The follow-on V40/V50 chips evolved into one of the first full SoC implementations and could give you a mostly complete XT (and later AT) CPU and mobo on a single chip package.

      Sony OEM'd and built the V20/V30 designs for their own PCs and microcontroller projects.

  2. EricB123 Bronze badge

    It Computes OK.

    ...And doubles as a toaster.

    1. Kane

      Re: It Computes OK.

      "...And doubles as a toaster."

      It could probably warm my house at those temps.

      1. phuzz Silver badge
        Boffin

        Re: It Computes OK.

        Pretty much all the electricity you put into a computer is going to come out as heat, and if this chip is pulling ~300W, that's about a third as much heat as an electric heater. (Those are usually about 1kW).

        Of course, if you have a 300W CPU, the rest of your system is unlikely to be modest, so hitting a power draw of 1kW is possible, and all of that will end up as heat in your room.

        tl;dr: this CPU could heat a small room, not quite a whole house.
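
        A rough sanity check on the numbers. The wattages below are assumptions for illustration (a ~300W CPU, a ~1kW total system draw, a 1kW plug-in heater, and a very rough ~8kW whole-house heating demand), not measurements:

          # All wattages are assumptions, not measurements
          cpu_draw_w = 300            # headline package power of the CPU under load
          system_draw_w = 1000        # assumed total draw of a high-end system (GPU, VRMs, PSU losses...)
          small_heater_w = 1000       # a typical plug-in electric heater
          house_heating_w = 8000      # very rough whole-house heating demand

          # Effectively all of the electrical power ends up as heat in the room
          print(f"CPU alone vs a 1kW heater:    {cpu_draw_w / small_heater_w:.0%}")
          print(f"Whole system vs a 1kW heater: {system_draw_w / small_heater_w:.0%}")
          print(f"Whole system vs a house:      {system_draw_w / house_heating_w:.1%}")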

        1. the spectacularly refined chap

          Re: It Computes OK.

          I think I've mentioned here before that the University of Manchester's Computer Science building was supposedly built without a heating system, just a circulation system to move the heat from the machine room to the rest of the building. Certainly when I was there in the late 90s the building appeared to have no functional heating system - the idea was predicated on computers remaining valve-based.

          I have a mate who works there now; apparently the machine room is no longer even there - it's an office for IT support. And no, the heating still doesn't work.

        2. Trevor Gale
          Joke

          Re: It Computes OK.

          If such a CPU could heat a small room, but not a who[l]e house, that might be because it's running a word-processor with autocorrect set to 'on'. One can't help wondering how many would be needed with autocorrect set to 'off' to heat a who[*]e house....

    2. Farmer Fred

      Re: It Computes OK.

      The question is this: given that God is infinite, and that the Universe is also infinite...would you like a toasted teacake?

    3. Roj Blake Silver badge

      Re: It Computes OK.

      Does anyone want some toast?

      https://www.youtube.com/watch?v=LRq_SAuQDec

  3. sreynolds

    How the mighty have fallen....

    It takes the same exotic materials that so-called quantum computers use to make this even remotely newsworthy.

  4. Pascal Monett Silver badge
    FAIL

    "that test was running the 7zip benchmark"

    Oh come on Intel, the 7zip benchmark ?

    Are you that scared that your tech won't work ?

    Run the Crysis demo, for Pete's sake, and show us that your chip doesn't melt and keeps the game chugging along. That would be meaningful.

    1. Fruit and Nutcase Silver badge
      Joke

      Re: "that test was running the 7zip benchmark"

      Crysis? What Crysis?

      Says the bloke from marketing.

      A bit topical, with the current economic and political situation in Blighty....

      http://news.bbc.co.uk/1/hi/uk_politics/921524.stm

      Also, an album by Supertramp.

  5. Anonymous Coward
    Anonymous Coward

    New paradigm time

    It's all a bit reminiscent of the fading days of the vacuum tube - ingenious contraptions of electrodes and beam shaping to do complex tasks for "high performance" computing or the military, all swept away by the humble transistor: low power, low voltage, just lots of them.

    Time for Intel and the others to stop squeezing this lemon and find a better way to do it. What's the TDP of the human brain?

    1. Richard 12 Silver badge
      Pint

      Re: New paradigm time

      About 12W, give or take.

      Don't know if anyone has done good experiments for peak, but it must be under 20W or it'd cook itself without liberal volumes of coolant.

      That said, while brains are massively parallel, they can only really manage one foreground task at a time.

      1. Fruit and Nutcase Silver badge
        Joke

        Re: New paradigm time

        "...they can only really manage one foreground task at a time."

        That is on the single tasking model.

        The one with the multi-tasking core is the one with model designation "Woman"

        1. MrDamage Silver badge
          Coat

          Re: New paradigm time

          Who says men can't multi-task? I can watch porn, wank, and drink a beer at the same time.

          I'm just not allowed to do it at the pub anymore.

        2. quxinot
          Joke

          Re: New paradigm time

          Many bugs have been noted in testing, though.

          On both models.

    2. nautica Silver badge

      Re: New paradigm time

      "...Time for Intel and the others to stop squeezing this lemon and find a better way to do it..."

      The over-riding problem is one of big-business inertia. Intel has grown so huge by "...squeezing this lemon..." that it knows nothing else; it has no other options whose implementation would allow its corporate coffers to keep increasing--continually--at the rate which is (implicitly) demanded by its investors.

      I am reminded of a headline I read recently which (paraphrased) stated, "It has been proven that widening roads does not improve the flow of traffic; why do we keep doing it?"

      The answer to that headline is, "Because the powers-that-be know of no other way [which can demonstrate to the tax-payers, as quickly (!) as that solution, that the problem is being 'solved']."

      1. Jimmy2Cows Silver badge

        Re: widening roads does not improve the flow of traffic; why do we keep doing it?

        Nah, that's just what happens when you piecemeal-upgrade the road network slower than the traffic growth rate. And then cover the upgraded sections and the rest of the roads with random roadworks. Always building for what the traffic is when the upgrades are planned, rather than building for what the traffic will be 10 years after the upgrades are completed.

        [For any traffic planners out there, yeah I know it's more complex than that]

  6. Elongated Muskrat Silver badge
    Boffin

    So which is it?

    It needs liquid nitrogen (at -196°C) or a standard off-the-shelf CPU water cooler (at about £130)? It goes without saying that there's quite a bit of difference between the two, and this article does seem to contain quite a bit of hyperbole if it's suggesting they are comparable.

    1. nautica Silver badge
      Thumb Up

      Re: So which is it?

      "...this article does seem to contain quite a bit of hyperbole..."

      "...QUITE A BIT of hyperbole..."

    2. phuzz Silver badge

      Re: So which is it?

      It sounds like you'll need at least a basic AIO cooler to hit 6GHz even briefly on a couple of cores, a big custom watercooling loop should be able to boost to 6GHz for longer and on more cores, and you'll need liquid nitrogen to hit 8GHz (probably briefly, on a few cores).

      1. nautica Silver badge
        Happy

        Re: So which is it?

        Seems as though Intel have never heard the old adage, "If wishes were horses, beggars would ride."

        Or...considering the amount of puffery and dissembling coming from that particular quarter for quite some time now, perhaps they have.

  7. Sparkus

    assuming that the os needs to be.....

    Windows Workstation, Windows Server, or Linux to take advantage of those cores.....

  8. mtp

    Move work to the compiler?

    x86 CPUs are working as hard as they can on instruction ordering and branch prediction, and then there is the whole group of Spectre/Meltdown/... flaws - which should be grouped together and called precognition bugs - all of which consume transistors and power. Most of this logic is known to the compiler, so it should be able to take the one-off hit of creating efficient code instead of letting the CPU do it on the fly.

    Hmm - I think this has been solved before, at least twice (RISC and, dare I say it, Itanium).

    1. the spectacularly refined chap

      Re: Move work to the compiler?

      That was the philosophy behind Sun's Niagara processors, the best part of twenty years ago now. Instead of spending ever more transistors on ever longer pipelines, ever more complex branch prediction and ever larger caches: simplify the whole core, kit it out for a shedload of threads, stick as many cores on there as you can and crank up the clock speed.

      The result was the world's fastest processor bar none. Didn't stop them getting swallowed by Oracle a few years later.

      1. Paul Crawford Silver badge

        Re: Move work to the compiler?

        We had some of the first Niagara chip-based servers, or Viagra as we so wittily called them, for use as the web server of the day. That worked very well as the workload was also massively parallel/threaded; however, for more general workloads it had, AFAIK, a single shared FPU. Not an issue for our tasks, but it made them less useful than the later multi-core x86 beasts that came along by the late 2000s.

        Which ran Linux and eventually ate Sun's lunch. We had been fans of Solaris (plus ZFS) and kept both in use until Oracle took over; then it was painfully obvious what the future would be...

        Our Niagara servers continued in use until the facility was shut down in 2019.

  9. Binraider Silver badge

    One thing we've had to re-learn recently about both Intel and AMD: "TDP" has more in common with PR ratings than actual consumption.

    I'm glad a lot of the benchmarkers are now measuring at the socket, because the moment marketing are involved any semblance of fact is usually erased.

    Pentium IV is all the evidence you need to know that throwing more power consumption at the problem is not how you produce something that anyone actually wants.

    1. Elongated Muskrat Silver badge

      I suspect it's in the same way that people who write the marketing guff for storage use "metric" measurements for the capacity, so "1TB" is actually 1,000,000,000,000 bytes (931GB), and not 1,099,511,627,776 bytes, like it should be.

      And no, I refuse to use "TiB" and "GiB" just because marketing droids lie and have decided to repurpose a perfectly good, existing, abbreviation.
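
      Whatever you prefer to call the units, the arithmetic behind that 931 figure is simple enough (the variable names below are just labels for this sketch):

        # Decimal ("marketing") terabyte vs the binary unit many OS tools still report as "TB"
        decimal_tb = 10**12       # 1,000,000,000,000 bytes - what the box says
        binary_tb = 2**40         # 1,099,511,627,776 bytes
        binary_gb = 2**30

        print(f"1 marketing TB = {decimal_tb:,} bytes = {decimal_tb / binary_gb:.0f} binary GB")
        print(f"1 binary TB    = {binary_tb:,} bytes")
        print(f"Shortfall on the label: {1 - decimal_tb / binary_tb:.1%}")   # ~9.1%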

  10. Ghostman

    How about all of these running at

    8GHz and better. https://hwbot.org/benchmark/cpu_frequency/rankings#start=0#interval=20

    A lot of AMD FX chips have done it over the years, and a few Celeron chips.

    Using liquid helium under certain conditions makes chips go faster. That's been proven since the days we used to put motherboards into dorm fridges to get more oomph out of them.
