Linus Torvalds drops Intel and adopts 32-core AMD Ryzen Threadripper on personal PC

Linux overseer Linus Torvalds has binned Intel on his personal PC and hinted that he hopes to one day run an ARM-powered desktop. In his weekly State of the Kernel post Torvalds released Linux 5.7 rc7, said the development process has been smooth and commented “Of course, anything can still change, but everything _looks_ all …

  1. Anonymous Coward
    Anonymous Coward

    ARM wrestle

    Perhaps his remarks even mean that when the mythical Year Of Linux On the Desktop comes about it will be the Year of Linux On The ARM Desktop too.

    There's more likelihood of Linus switching to using OS X on an ARM-based MacBook than it ever being YOLOTD!

    1. Scotthva5

      Re: ARM wrestle

      Sadly this is true.

    2. Mystic Megabyte
      Linux

      Re: ARM wrestle

      You may be living on a different planet, it's been YOLOTD for over 10 years. Sorry to burst your bubble :(

      1. Anonymous Coward
        Anonymous Coward

        Re: ARM wrestle

        > You may be living on a different planet, it's been YOLOTD for over 10 years. Sorry to burst your bubble :(

        2% market share with a 1% gain over 7 years?

        https://www.statista.com/statistics/218089/global-market-share-of-windows-7/

        I guess that's considered a success in loony Linux land, but the average person in the street has never even heard of the OS which is pretty poor considering it was launched almost 30 years ago!

        1. DeathSquid

          Re: ARM wrestle

          This century, mobile is where the growth is. And Linux/Android holds the dominant market share globally. Windows isn't even a rounding error.

          1. werdsmith Silver badge

            Re: ARM wrestle

            Yes, when I get a desktop phone it won’t be a windows one.

    3. Anonymous Coward
      Anonymous Coward

      Re: ARM wrestle

      Are you seriously suggesting that someone who needs a 32 core CPU to get adequate speed on his regular tasks is going to use a MacBook?

      1. Anonymous Coward
        Anonymous Coward

        MacBook

        A pirated version is available with Ananas.com, the all new 64 core MacHook !!!! Watch that Hook ARM-wrestle !!

    4. Beeblebrox

      New PC

      "The Register feels Torvalds has probably acquired a whole new PC, as the Threadripper range requires a sTRX4 socket and those debuted on motherboards from late 2019."

      Same PC, just new CPU, RAM, Mobo and maybe graphics cards.

      1. AndyMTB

        Re: New PC

        Yes, my own desktop feels a bit like Trigger's broom. New everything (a few times over) apart from the case and the ethernet cable. At least I'm a whizz at getting it open these days.

        1. Steve Button Silver badge

          Re: New PC

          If that was Trigger's PC, it would have had several new cases as well.

        2. Captain Scarlet

          Re: New PC

          My own PC is the same (except the case and ethernet cable have been replaced several times). I bought an Evesham Vale Pentium Celeron (PII variant) and it has been upgraded in bits ever since.

      2. Peter2 Silver badge

        Re: New PC

        And power supply. I tried an upgrade of just the CPU, RAM, Mobo and graphics card and then discovered the existing PSU wouldn't provide enough juice.

        I then just gave up, added a new case (have you ANY idea how difficult it is to find a traditional, monotonously boring beige box these days?), a PCI-E SSD and a newer HDD, rebuilt a computer out of the scrap, and gave it to a friend's kids as a gaming box.

        1. RichardBarrell

          Re: New PC

          FWIW if you're ever having difficulty finding PC cases that don't look like a disco ball covered in neon lights, I can strongly recommend Fractal Design. They have a whole range that all look like solid black monoliths from 2001.

          1. MR J

            Re: New PC

            +1 for Fractal, on their Mid-Tower cases at least.

            There are a few things I don't like: the water-line grommets that come with it like to fall out, the "filter" for the PSU is generally crap, and the pads for the feet are usually poorly attached. I still have two cases here (older R4s, I think) but they have held up well through multiple builds and have done well for me.

            1. Richard Crossley
              Linux

              Re: New PC

              Another +1 for Fractal Design. R7 here with no windows either in hardware or software. Best case I've ever worked on.

              AMD Ryzen 3900X inside; it seems the Linux Lord and I share several preferences.

              1. LawAbidingCitizen

                Re: New PC

                Still using an i7-4790, but am waiting for my 3900x, which should arrive in about 2 weeks. Can't wait!!

          2. Peter2 Silver badge

            Re: New PC

            > FWIW if you're ever having difficulty finding PC cases that don't look like a disco ball covered in neon lights, I can strongly recommend Fractal Design. They have a whole range that all look like solid black monoliths from 2001.

            I'm sure they are nice boxes, but I just want the most painfully bland beige box going.

            My first boxes, some years back, had to be that way of course, back when black and white boxes were only really available to the likes of Compaq and Dell, and all that was available to us mere mortals in the enthusiast world was the beige box. These days I take a particularly perverse delight in having put all of my money into good components, and in having a really quite absurd amount of computing power in a painfully nondescript beige box.

            My friend's kids found it particularly hilarious when they were gifted my old gaming box. It is in a wonderfully bland, old, scratched and dented beige box, complete with missing blanking plates from where a Blu-ray writer used to sit before they got it (a second-hand thing picked up from CEX for £20 because the shop couldn't figure out why it wasn't working properly; i.e. old firmware...), and they set it up to play against the box that one of their friends had, which was a really, really expensive Alienware box that looked lovely with transparent bits showing flashing RGB lights on the memory modules, motherboard etc.

            I am told that my ancient Phenom box destroyed it on load time and performance in a way that was apparently quite embarrassing for the kid with the bright flashy Alienware box, which probably cost his dad something like twice what my Phenom box cost me over its ~13-year life, including the incremental life extensions (new drives, an SSD for a boot drive, and random mid-range graphics cards every few years instead of an absurdly expensive graphics card once).

            But then I'm an old-fashioned gamer who used to buy stuff at a good point on the price/performance chart, configure it to get the last erg of performance, and then run it into the ground until newer games wouldn't run on the minimum resolution/graphics settings anymore. ;)

          3. Claverhouse

            Re: New PC

            Very nice --- though I'm rather bored with black for design [ particularly most cabling being black... ] --- but, looking them up, their English website was far too modishly confusing.

            Almost impossible to compare the numerous options.

      3. Anonymous Coward
        Anonymous Coward

        Re: New PC

        Ok Trig.

        1. Brewster's Angle Grinder Silver badge

          Re: New PC

          Was that a sin off?

    5. Christian Berger

      Virtually all competent IT departments...

      ... offer Linux desktops, and particularly more technical departments readily gobble them up.

      It's just that incompetent IT departments are far more common than competent ones.

      1. Anonymous Coward
        Anonymous Coward

        Re: Virtually all competent IT departments...

        Word

    6. phuzz Silver badge

      Re: ARM wrestle

      The whole "year of Linux on the desktop" idea came about in a time when desktop PCs were the main interaction most people had with computers.

      These days most people don't own a desktop (although they might own a laptop); the main way people use a computer now is the phone in their pocket.

      These days the worldwide majority of phones are running Android, which is based in part on Linux, so you could say that the 'Year of Linux on the ARM Desktop' came about while no one was looking.

  2. Kev99 Silver badge

    One of these days I'll understand how someone creates a new operating system. Mr Torvalds, hot head that he is, seems to have hit the nut on the head. But he really should see a doctor about the huge zit on his ear.

    1. Anonymous Coward
      Anonymous Coward

      Well, he had Minix as a starting point.

      From what I recall (it was a *very* long time ago) the hardest part about booting an OS was dealing with the bullshit involved in switching from 16-bit mode into protected mode. (Yes, all x86-based PCs *still* start in 16-bit mode.) Not an impossible task, obviously, but it could have been easier.

      I did have fun writing the floppy (that's how long ago this was) boot loader though. You basically needed to read the root directory of the boot drive, find the secondary boot loader (or kernel), load it into memory, and start executing it, all within the 512 bytes available on the first sector of the disk. I managed it in 396 bytes (I just checked) including error messages and debug print functions. I think I was trying to switch to protected mode before loading the kernel in the remaining 116 bytes, before other projects, work, and having a life got in the way. :)

      1. dajames

        The hardest part of writing an Operating System is not writing an Operating System, it's getting other people to use it.

        > From what I recall (it was a *very* long time ago) the hardest part about booting an OS was dealing with the bullshit involved in switching from 16-bit mode into protected mode. (Yes, all x86-based PCs *still* start in 16-bit mode.) Not an impossible task, obviously, but it could have been easier.

        That's not that hard ... certainly not as hard as switching back again (without rebooting) used to be.

        > I did have fun writing the floppy (that's how long ago this was) boot loader though.

        The bootloader works the same way on a hard disk (on pre-UEFI machines); fortunately the BIOS does the actual reading and writing of the disk. The standard hard drive bootloader is only 512 bytes (including the partition table), but the rest of track 0 is unused, so you can use your bootloader to load a more sophisticated bootstrap from the rest of that track. I've worked on bootloaders that enabled booting from an encrypted drive, and they work like that.
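
        For the curious, here is a minimal sketch of the 512-byte MBR layout being described (my own illustration in C, not from the thread; the field names are mine, and it assumes a pre-UEFI BIOS boot):

        #include <stdint.h>

        #pragma pack(push, 1)
        struct partition_entry {
            uint8_t  status;        /* 0x80 = bootable */
            uint8_t  chs_first[3];  /* CHS address of the first sector */
            uint8_t  type;          /* partition type ID */
            uint8_t  chs_last[3];   /* CHS address of the last sector */
            uint32_t lba_first;     /* LBA of the first sector */
            uint32_t sector_count;  /* number of sectors */
        };

        struct mbr {
            uint8_t  bootstrap[446];         /* the first-stage loader code */
            struct partition_entry part[4];  /* 4 x 16 = 64 bytes */
            uint16_t signature;              /* 0xAA55, checked by the BIOS */
        };
        #pragma pack(pop)

        /* Everything, code included, must fit in one sector, which is why
           first-stage loaders chain-load a bigger stage from the rest of
           track 0. */
        _Static_assert(sizeof(struct mbr) == 512, "MBR must be exactly one sector");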

    2. Anonymous Coward
      Anonymous Coward

      Linus wrote a kernel, not an OS. The OS mainly used is GNU, however Linux will also run on BSD.

      1. Hans 1
        Boffin

        Linux will also run BSD, and a BSD kernel can run GNU.

        #TFTFY

      2. Joe Montana

        Actually the OS most often used with Linus's kernel is probably Android, followed by BusyBox...

    3. John H Woods

      zit

      Hurrah, I'm not the only one who detests flesh-coloured mics.

  3. Arbuthnot the Magnificent

    Quite an upgrade...

    I've got a photo somewhere of his old student 486, which is on display in a museum in Helsinki.

    1. RM Myers
      Thumb Up

      Re: Quite an upgrade...

      Even more of an upgrade over my first "PC" - https://en.wikipedia.org/wiki/KIM-1/

      1. Sebastian Brosig

        Re: Quite an upgrade...

        Luxury! We had to make do with an Abacus made out of spaghetti and lumps of mouldy bread.

    2. MacroRodent

      Re: Quite an upgrade...

      > I've got a photo somewhere of his old student 486, which is on display in a museum in Helsinki.

      Thanks. I did not know it was on display (and I live in Helsinki). Definitely on my places to visit list when museums reopen. Some info from the University of Helsinki web pages:

      "The Power of Thought" is a permanent exhibition of the University of Helsinki and its students, teachers and researchers. [...] Later objects in the exhibition include a computer used by Linus Torvalds and a student boilersuit from 2007. The exhibition is situated on the 3rd floor of the main building of the University (Fabianinkatu 33).

    3. Uncle Slacky Silver badge

      Re: Quite an upgrade...

      Didn't he start off with a Sinclair QL?

      1. DJV Silver badge

        @Uncle Slacky

        I think that came after the Vic-20.

        1. Anonymous Coward
          Anonymous Coward

          Re: @Uncle Slacky

          > I think that came after the Vic-20.

          To think that I, too, started on a VIC-20, then progressed to a QL. There, but for having dramatically less talent than Mr Torvalds' stand-up desk, went I.

      2. MacroRodent

        Re: Quite an upgrade...

        He had a Sinclair QL, but started writing Linux only after getting a 386-based PC. The man himself describes shopping for it:

        "January 2, 1991. It was the first day the stores were open after Christmas, and my twenty-first birthday, the two biggest cash-generating days on my calendar. [...] It was at one one of these small corner shops, sort of a mom-and-pop computer store, only in this case it was just pop. I didn't care about the manufacturer, so I settled on a no-name, white-box computer. The guy showed you a price list and a smorgasbord of what CPU was available, how much memory, what disk size. I wanted power. I wanted to have 4 megabytes of RAM instead of 2 megabytes. I wanted 33 megahertz. Sure, I could have settled for 16 megahertz, but no, I wanted top of the line." (from "Just for Fun", Chapter IV).

        1. P. Lee

          Re: Quite an upgrade...

          Did he get the 387 maths co-processor?

          1. MacroRodent
            Linux

            Re: Quite an upgrade...

            The book does not mention buying the 387, and at the time it was a rather expensive add-on, considered useful only for people with number-crunching needs. The Linux kernel has also included code for 387 emulation since quite early versions. So my guess is he did not have it in his first 386 computer. But further research is needed to be sure.

        2. hittitezombie

          Re: Quite an upgrade...

          Only a year later I purchased a 386DX40 with 1MB RAM, no hard disk, a 720KB floppy drive and a 1200 baud modem. Slowly I built it up to 4MB RAM and first a 10MB, then a 20MB, then a 50MB hard disk.

          Unfortunately my talent level is nowhere near Linus's.

      3. P. Lee

        Re: Quite an upgrade...

        Ahhh, the happy days of '80s computing, when computers were different from each other.

        We almost got a QL. Wait, wut? 8-bit data bus?

    4. redpawn

      Re: Quite an upgrade...

      With that upgrade, I could even run Windows 10!

      1. Anonymous Coward
        Anonymous Coward

        Re: Quite an upgrade...

        You don't multi-task then?? :-)

        1. Graham Dawson Silver badge
          Trollface

          Re: Quite an upgrade...

          Neither does Windows 10.

  4. man_iii

    AMD Dreams

    Once upon a time AMD had plans to launch an APU with an ARM core inside. Just like how they started to include the northbridge and southbridge chipsets on the CPU die, and now GPUs too.

    Soon I hope to find an AMD AP-AP-APU with a RISC-V or ARM big.LITTLE 4-core/8-core, dual GPGPUs, a 6800x or a StarCore or TI DSP, the regular x86_64 CPUs and maybe x86_128? even OpenPOWER?, and 64GB DDR7 RAM. All of it stuffed into a 1nm die process ;-) lolololol :-D

    1. Tomato42
      Boffin

      Re: AMD Dreams

      well, to be honest, current CPUs are RISC that emulate CISC, so to some degree his wish did come true

    2. Ben 56

      Re: AMD Dreams

      You're unlikely to see x86_128 because most common numbers fit within 64 bits. The only purpose I can see for it would be cryptographic usage.

      1. erikscott

        Re: AMD Dreams

        In a very real sense, MMX/SSE/SSE2/SSE3/SSE4/AVX/AVX2/AVX-512 does this. AVX-512 could be viewed as x86_512 for that matter. :-)

        The big gain in the move to x86_64 was the ability to directly address more than 4G of RAM. At the time ('05?) that was becoming important. IBM had it in POWER and it really made a difference in DB/2. An even larger address space, combined with some smart virtual memory, could make it easier to map very sparse, high-dimensional problems onto a (mostly?) linear address space. Which would be data warehousing and analytics jobs, so yeah... bring on x86_1024!
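
        To make the sparse-mapping idea concrete, here is a small Linux-specific sketch (my own illustration, not the poster's code; it assumes a 64-bit build): reserve a huge virtual region up front and let demand paging back only the pages that are actually touched.

        #include <stdio.h>
        #include <sys/mman.h>

        int main(void) {
            size_t len = (size_t)1 << 40;  /* reserve 1 TiB of address space */

            /* MAP_NORESERVE: claim address space only, commit nothing yet */
            char *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                           MAP_PRIVATE | MAP_ANONYMOUS | MAP_NORESERVE, -1, 0);
            if (p == MAP_FAILED) { perror("mmap"); return 1; }

            p[0] = 1;        /* touching a page faults it in...           */
            p[len - 1] = 1;  /* ...and only the touched pages consume RAM */

            printf("1 TiB reserved at %p, only two pages resident\n", (void *)p);
            return munmap(p, len);
        }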

        1. the spectacularly refined chap Silver badge

          Re: AMD Dreams

          64-bit will be big enough to memory-map anything you like for the foreseeable future. I did some calculations a couple of years ago and 2^64 bytes was equivalent to several months' global hard drive production. Just had a look for some current statistics and it still seems to be a couple of weeks' to a month's worth.

          1. Stumpy

            Re: AMD Dreams

            *AHEM* 640KB should be enough memory for anyone.

            1. Alan Brown Silver badge

              Re: AMD Dreams

              I was just thinking about that quote myself.

              Memories of sitting in computer class in 1982 and calculating that, at the then-current rate of memory increase, 1GB would be the norm by Y2K and systems would be pushing 64GB by 2010. Then thinking... "nahh"

            2. P. Lee

              Re: AMD Dreams

              True, but we're talking exponential increases.

              I think we're good for a few years. I'm less annoyed with the 640K barrier than with the refusal of ISPs to support IPv6.

            3. Anonymous Coward
              Anonymous Coward

              Re: AMD Dreams

              I would think 520KB quite suffices.

          2. toenail22

            Re: AMD Dreams

            Yeah, but x86_64 implementations use 48- or 42-bit address lines, which gives a limit of 256 terabytes of RAM. I can envision workloads that might need more in the foreseeable future.

            1. doublelayer Silver badge

              Re: AMD Dreams

              But that's not an intrinsic limit either of the instruction set or of memory. Should we need to remove that limitation, it can be done. It would require some OS code changes, but they get updated all the time so we can manage that. Current processors can't connect to that much memory anyway so any limitations would be removed by the time they can. The next limit would be 16 exabytes, which would be much harder to work around, but I figure that one is a long time off.
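
              For anyone who wants the arithmetic behind those limits, a quick sketch (my own, using the commonly cited figures: 48-bit virtual addresses with 4-level paging, 57-bit with 5-level paging, and the full 64 bits as the final ceiling):

              #include <stdio.h>

              int main(void) {
                  struct { const char *scheme; unsigned bits; } lim[] = {
                      { "48-bit (4-level paging)", 48 },  /* 256 TiB */
                      { "57-bit (5-level paging)", 57 },  /* 128 PiB */
                      { "64-bit (full word)",      64 },  /*  16 EiB */
                  };
                  for (unsigned i = 0; i < 3; i++) {
                      /* 1ULL << 64 would overflow, so express 2^bits in TiB
                         by knocking 40 off the exponent */
                      unsigned long long tib = 1ULL << (lim[i].bits - 40);
                      printf("%s = %llu TiB\n", lim[i].scheme, tib);
                  }
                  return 0;
              }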

        2. Gerhard Mack

          Re: AMD Dreams

          "The big gain in the move to x86_64 was the ability to directly address more than 4G of RAM."

          A larger gain was in having more registers. A lot of time gets lost to swapping memory into registers and back again, and x86_32 was notoriously short on registers.

          1. John Sager

            Re: AMD Dreams

            Still is short on registers. The register-ful architectures all have 32 now. However, x86 and x86_64 are so embedded in the computing space now that that argument is pretty futile. I really liked the DEC Alpha architecture when it first came out, and some heavy hardware built on it really served us well many moons ago, in winning one of the elliptic curve crypto challenges from that time. Sadly it went the way of all silicon, as did DEC ☹️

        3. eldakka

          Re: AMD Dreams

          > The big gain in the move to x86_64 was the ability to directly address more than 4G of RAM. At the time ('05?) that was becoming important. IBM had it in POWER and it really made a difference in DB/2.

          There was a delay in bringing 64-bit addressing to x86. As you noted POWER had it before x86, as did SPARC (1994), Alpha, and Intel's own IA-64 (Itanium). But that was intentional by Intel.

          Back then, as now, Intel loved to segment the market, to be able to push more SKUs with slightly different feature sets to better monetize their products (read: screw over the customers by ripping them off). Now there are different editions of essentially the same Intel CPU that have different memory support: not different types of memory, but different sizes of memory.

          Take for example the current 8280, 8280M and 8280L, which are essentially the same CPU. They are all 28c/56t, 2.7GHz-4.0GHz, 6-channel memory, 205W TDP processors. The only difference is 1TB, 2TB and 4.5TB RAM support respectively, at approximate list prices [1] of $10k, $13k and $17k. Purely so Intel can charge a premium for greater RAM capabilities; there is no technical necessity in the different RAM limits, nor cost to Intel in supporting 1TB vs 4TB [2].

          In the early 2000s Intel was intentionally segmenting the x86 market into low-RAM 32-bit x86, and if you wanted high-RAM 64-bit you'd go IA-64. It took AMD's AMD64 to break that, and Intel adopting AMD64 as x86-64 in its own processors to counter AMD was a major setback for IA-64 (among other things, like the concept never working, as no one could get a proper optimizing compiler to work well on it).

          __________________________

          1. No one ever pays list price. It's the starting point for negotiating on price. Anyone buying one of these processors will pay substantially less, like 20% less. Anyone buying a significant volume, say fitting out a data centre with scores or hundreds, will in all likelihood pay 50% or less. But the ratio between the different memory-support models remains in effect.

          2. As evidenced by AMD's server-processor model, where every server CPU can support 4TB of RAM. There is no RAM capacity segmentation.

          1. Brewster's Angle Grinder Silver badge

            Re: AMD Dreams

            Itanium was a mixture of second system syndrome and a genuine attempt to overcome the limitations of x86. But in the end, it failed even on technical grounds - it just didn't perform. It relies heavily on static analysis which is very inflexible. Run-time analysis enables you to have completely different cores optimising the code for the resources they have without recompilation.

            1. eldakka

              Re: AMD Dreams

              > But in the end, it failed even on technical grounds - it just didn't perform. It relies heavily on static analysis which is very inflexible.

              Which is the job of the compiler on IA-64, which is what I meant by:

              > (among other things, like the concept never working, as no one could get a proper optimizing compiler to work well on it)
              ;)

        4. John PM Chappell

          Re: AMD Dreams

          "The big gain in the move to x86_64 was the ability to directly address more than 4G of RAM. At the time ('05?) that was becoming important."

          No. It was already, and had been for a long time, possible to do that. Address lines rarely map directly to the instruction size (they don't for the current AMD64 architecture, either) and physical memory addressing is orthogonal to instruction size. There is no "4G (sic)" limit. It's true that you need more than 32 bits for a memory address higher than 4 GiB, but that's actually nothing to do with the processor's instruction size: 32-bit processors were addressing more than that long before people were worrying about it being a problem. PAE was the standard that addressed it, and it dates to 1995 (Pentium Pro); it also was directly extended to form the standard for memory addressing used in the AMD64 architecture.

          You may be confusing the issue with a Microsoft *licensing* decision to begin restricting 32-bit MS desktop OSes from addressing memory at addresses higher than 4 GiB. A detailed analysis of that can be found here: https://www.geoffchappell.com/notes/windows/license/memory.htm (no relation, as far as I know).

          1. Anonymous Coward
            Anonymous Coward

            Re: AMD Dreams

            It's misleading to talk about instruction size in the context of addressing, since direct addressing in the instruction is more relevant to small microcontrollers. Register size is more relevant. If the register size is smaller than the address width, translation schemes are always needed for indirect addressing. It is simply cleaner and more convenient to have registers that are equal to or wider than the address bus, even if the MMU is in there in the middle moving everything around. Although the performance of the CDP 1802 family was low compared to the NMOS ones of the era, it's interesting to compare its very elegant array of 16 16-bit registers (with 8-bit instructions), which meant that the PC and stack, along with indexed operations, could easily address a full 64K address space without all the messy workarounds of, say, the 6502. The 68000 implemented a similar approach at the 16/32-bit level which, as the Wikipedia article remarks, "was popular with programmers."

          2. eldakka

            Re: AMD Dreams

            > No. It was already, and had been for a long time, possible to do that. Address lines rarely map directly to the instruction size (they don't for the current AMD64 architecture, either) and physical memory addressing is orthogonal to instruction size. There is no "4G (sic)" limit. It's true that you need more than 32 bits for a memory address higher than 4 GiB, but that's actually nothing to do with the processor's instruction size: 32-bit processors were addressing more than that long before people were worrying about it being a problem. PAE was the standard that addressed it, and it dates to 1995 (Pentium Pro); it also was directly extended to form the standard for memory addressing used in the AMD64 architecture.

            PAE extended memory addresses to 36 bits, which allowed the processor to address up to 64GB of physical memory. However, individual processes were still limited to a virtual address size of 4GB. So with PAE on pre-64-bit processors you could have multiple processes that each required 4GB to run, but none of those individual processes could address more than 4GB. Therefore in the OP's example of DB/2, that is directly referring to DB/2 process image sizes larger than 4GB, and Java JVMs greater than 4GB, and so on. PAE didn't allow that, and being able to have an 8GB, 16GB or larger individual process image is a huge boon for workloads that need large data sets, such as databases or statistical software like 'R'.
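
            A tiny sketch of that distinction (my own illustration, not eldakka's): with PAE the machine as a whole can hold up to 2^36 bytes, but any single 32-bit process still sees at most 2^32 bytes of virtual address space.

            #include <stdio.h>
            #include <stdint.h>

            int main(void) {
                uint64_t pae_physical    = 1ULL << 36;  /* 64 GiB machine-wide with PAE */
                uint64_t process_virtual = 1ULL << 32;  /*  4 GiB per 32-bit process    */

                printf("PAE physical limit  : %llu GiB\n",
                       (unsigned long long)(pae_physical >> 30));
                printf("32-bit virtual limit: %llu GiB\n",
                       (unsigned long long)(process_virtual >> 30));
                return 0;
            }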

  5. tempemeaty

    ARM?

    Mr Torvalds seeing what's on the horizon?

    Maybe his next computer change...

    ┐(・ε・)┌

  6. Anonymous Coward
    Anonymous Coward

    mythical Year Of Linux On the Desktop comes

    It's not a myth, it's happening in 2021 next year.

    1. Doctor Syntax Silver badge

      Re: mythical Year Of Linux On the Desktop comes

      As far as I'm concerned it happened when laptops no longer had floppies to boot the SCO installer and Windows first wanted to phone home.

    2. Anonymous Coward
      Anonymous Coward

      Re: mythical Year Of Linux On the Desktop comes

      I ran Linux On The Desktop from 1999 - 2019, so I don't know why you are still waiting.

      1. Anonymous Coward
        Anonymous Coward

        Re: mythical Year Of Linux On the Desktop comes

        I have had my computer sit on the desktop all my life.

    3. Unicornpiss
      Linux

      Re: mythical Year Of Linux On the Desktop comes

      Personally I would love to see it, having been a devout Linux user by night for many years, and supporting Windows by day.

      Unfortunately it's not about the quality of the code, innovation, elegance, or even price. The year of Linux on the desktop will only come when Microsoft's marketing department has a bad day. The company I work for is so deep in bed with MS (SharePoint, AD, Teams, Azure, etc.) and its partner Dell, that it's hard to imagine it could ever reach escape velocity. We do run Linux for certain things, but everything else would need to be reinvented, and our IT dept. is spread so thin you could say it is already at the 7nm level, even if our processors aren't.

      1. Yet Another Anonymous coward Silver badge

        Re: mythical Year Of Linux On the Desktop comes

        Teams already works well on Linux

        Office365 is getting better; I don't use SharePoint anymore so I don't know how well it's supported outside IE.

        The bits of Microsoft that make $$$ don't care what OS you are using as long as you pay your subscription, and I think most of them weren't born when Ballmer went on his "Linux is a cancer" rampage.

        1. Snake Silver badge

          Re: mythical Year Of Linux On the Desktop comes

          > The bits of Microsoft that make $$$ don't care what OS you are using as long as you pay your subscription

          Exactly. With respect, I'm tired of hearing that Microsoft somehow pulls a magic hypnosis session on anyone and everyone who bothers to look, and that people buy/stay with Microsoft only because of the brainwashing.

          People and industries stay with Microsoft products because the products get work done. Be it Windows, Office365 or other systems, "The Year of the Linux Desktop" will *NEVER* happen until someone in FOSS stops concentrating exclusively on OS quality and brings the entire application ecosystem up to a level where not a single compromise in productivity needs to be made by switching over.

          That is the ONLY measurement that companies, and professionals using systems worldwide, care about: the fact that a piece of FOSS software is free is meaningless if it doesn't provide an efficient workflow to get their work accomplished.

          Go on to YouTube and listen to professionals talk about their jobs, people!!! They *don't* say "Oh, this free software is going to save me money and I'm helping support freedom!"; they ALL say "my workflow", "output quality", "difficulty in switching and impact on my productivity", "quality of tools", "features to get my job done"... etc etc etc.

          It's all they care about: time is money. And until FOSS delivers every bit of productivity that the quality paid solutions do, almost nobody is going to switch. DaVinci Resolve has done it: it offers an extremely powerful video system that is now putting Adobe Premiere in the boxing ring, and on the ropes, but (sadly) it seems to be an outlier. GIMP doesn't compare to Photoshop, forget the dreams: no hard-core paid imaging professional falls back on GIMP as their primary editor, unlike Photoshop. No industry is going to drop Microsoft unless, and until, every bit of productivity software that they depend upon daily is up and running on Linux, with absolutely no compromises in either performance OR support, because businesses only care about getting things done, not about taking lots of time to make it work to get to that point.

          1. Anonymous Coward
            Anonymous Coward

            Re: mythical Year Of Linux On the Desktop comes

            > ...until FOSS delivers every bit of productivity that the quality paid solutions do, almost nobody is going to switch...

            Most established companies won't switch regardless of what FOSS do, because it's their old way of doing business. If it doesn't "look" that broken, don't fix it.

            New businesses are trying FOSS and keeping it when it works. Frankly, they got nothing to lose and plenty to gain.

          2. hittitezombie

            Re: mythical Year Of Linux On the Desktop comes

            Linux support by Microsoft has been fantastic over the last two years.

            SQL Server on a Docker container running in Linux? Why not? Just pay the licensing cost.

            Teams? Runs 100%.

            Windows 365 in a browser? Why, it works as well as, and in some cases, like collaboration, way better on Chromium browsers on Linux.

            Visual Studio Code? Works brilliantly even on a Chromebook.

            I still hate Windows with a passion, but MS is not the old MS.

          3. Graham Cobb Silver badge

            Re: mythical Year Of Linux On the Desktop comes

            Most of the apps are done: most users aren't using special software, they are using office apps, and they just about work in the cloud today, and will improve further as that is where Microsoft's office product investment is going.

            My employer is a >100K-people organisation and pays a lot of money to Microsoft. Our IT dept are pushing Microsoft hard to make "dumb PC working with cloud apps and data" work well enough that they can switch 90% of users to that (the remainder are developers, who already use Linux). Mainly for two reasons: security (get all the corporate data off the user's device and strictly under their control), and cost of support (if something breaks, just give the user a new device and it immediately just works).

            Today the biggest issues with this model are that the cloud version of the Office apps don't work quite well enough for the power users (cloud Excel is too slow for finance, cloud Powerpoint is too restrictive for marketing, etc). The other issue is that the model still doesn't work for power-travellers (sales people working from trains or planes, mom-and-pop hotels with crap internet or customer sites with no guest wifi) who need all their data and apps locally.

            Once Microsoft fix those problems, our IT plan to stop supporting desktops/laptops except as cloud access devices. I am guessing 70% of users in our company will then move to a "chromebook" type of device, that is if they need a keyboard and mouse at all and can't just use a tablet.

            They already offer a desktop build for that but not many depts will take it up yet.

      2. AlbertH

        Re: mythical Year Of Linux On the Desktop comes

        The final death of Windows will come when MS fully moves to a "leased" model: the end user doesn't own anything in the way of software (and probably not even their own data, either!). Businesses will finally realise that the buy-software, upgrade-equipment-to-cope, buy-new-OS, upgrade-equipment... rinse-and-repeat cycle is costing them a fortune. Even government departments are starting to realise this!

  7. PhilipN Silver badge

    $$$

    Almost 2,000 just for the CPU. Nice .... but ...... **** me!

    1. PhilipN Silver badge

      Re: $$$

      P.S. Reminded me of what motherboards and CPUs used to cost 30 years ago.

      1. Sgt_Oddball

        Re: $$$

        You do get a lot of CPU for the money. Those bad boys are massive, with 4 times the core count of more standard chippage.

        But they are also very niche, bragging-rights processors.

        That said, I'd have one if I could get it past the wife...

        1. 9Rune5

          Re: $$$

          Buy a big thermos, create a secret compartment in the bottom, put the threadripper inside and I'm sure you can sneak it past the guar...I mean wife.

          The motherboard is going to be less easy. Delivery by drone might do the trick..?

          1. Tomato42

            Re: $$$

            and how do you hide it in the bank statement?

            1. Doctor Syntax Silver badge

              Re: $$$

              Wrap it carefully and tape the edges together.

        2. mihares

          Re: $$$

          I bought one before I married, but I guess it's too late for you.

          Utterly, utterly awesome bit of kit, BTW.

          1. Zimmer
            Linux

            Re: $$$

            The moths in the wallet quake at the thought of spending over £400 for a new rig (tightfisted? Moi?)

            But the first PC I purchased for my son in 1995 was a Gateway Pentium P75, at £1,595.00.

            (Still have the CPU in a box in the garage... upgraded later to a P120.)

            1. Duffy Moon

              Re: $$$

              £1595 in 1995 is equivalent to just over £3000 now.

            2. JimCr

              Re: $$$

              My dad spent something like that on our first PC, a P133 from Gateway. Plus about £400 on the printer.

              We went all the way to some showroom/store in Central London, and then I think UPS delivered the black and white boxes a few weeks later. I was about 14, so it was pretty exciting at the time... must have cost like two months' wages.

              1. Yet Another Anonymous coward Silver badge

                Re: $$$

                My first PC, an Olivetti from Morgans on TCR, cost me 500 quid = a whole summer's student wages.

                1. werdsmith Silver badge

                  Re: $$$

                  There’s a Johnny Cash song that describes how I got my first PC, for free. Complete with 20GB Seagate hard drive that I used run run a head parking command for before switching off.

                  “One Piece at a Time” 1976.

                  In all the years since, I've never purchased a PC for myself (excluding Raspberry Pis).

                  1. Anonymous Coward
                    Anonymous Coward

                    Re: $$$

                    “I never considered myself a thief”.

                2. Doctor Syntax Silver badge

                  Re: $$$

                  Upvote for Morgans of blessed memory. Before they went into computers I used to get Exacta stuff from them.

            3. James 47

              Re: $$$

              We had the same. There was a jumper on the mobo that allowed overclocking to 100MHz or 133MHz. Mine couldn't handle 133, but it was nice to get 33% more clocks for free.

      2. Pascal Monett Silver badge

        On August 24, 1996 I bought a Pentium I 133 for €250.

        That came with a Yellow Dragon P1 motherboard for €153, and 8MB of EDO RAM for €287.

        PS : prices not adjusted for inflation

        1. PerlyKing
          Happy

          Re: Prices not adjusted

          You may not have adjusted the prices, but the euro didn't come into use until 1 Jan 1999 ;-)

          1. A.P. Veening Silver badge

            Re: Prices not adjusted

            Small correction: that's the date it came into existence; it only came into (general) use three years later.

      3. NorthIowan
        Unhappy

        Re: $$$

        P.S. And also the cost of memory when you wanted another 128K. (Or was it 64K?)

        I think you could get a whole PC a few years later for what doubling my RAM in the first one would have cost.

        1. the spectacularly refined chap Silver badge

          Re: $$$

          I had a similar thought perhaps five or ten years ago upgrading the memory on my printer. 160MB RAM: that's eight times the hard drive on my first PC compatible.

      4. Alan Brown Silver badge

        Re: $$$

        For comparison: Intel were flogging 24-core Xeons for $5,000-$7,000 apiece this time last year. I drew the line on one system because the CPUs the user wanted were $13,500 apiece.

        Having convinced him he could survive on 8 cores fewer per socket, we were able to knock $12k off the total price.

    2. Anonymous Coward
      Anonymous Coward

      Re: $$$

      > Almost 2,000 just for the CPU.

      £2300 in Norway. As nice as it would be, I can't justify spending 50% more than my monthly mortgage payment on a processor. :/

      1. Anonymous Coward
        Anonymous Coward

        Re: $$$

        But you can offset it against your heating bill. That thing should have no trouble keeping you warm in the winter months.

        1. Anonymous Coward
          Thumb Up

          Re: $$$

          Hah, that's true. I have a quad-GPU mining machine in my hallway which does help a lot in the winter. (And during the "summer", if I'm being brutally honest.)

          My place is always cold (I'm pretty sure it was built on some ancient burial ground) so every little helps.

    3. Alan Brown Silver badge

      Re: $$$

      Price doesn't scale linearly with the number of cores: the 16- and 24-core parts are a relative bargain.

      1. Doctor Syntax Silver badge

        Re: $$$

        Often things do scale more or less linearly until you get to an inflection point. Rule of thumb: either buy the last thing before the inflection point or wait another year.

    4. Boothy

      Re: $$$

      The price was set to compete with Intel, who were at least 2 (and sometimes many more) times more expensive for a similar (and often lower-spec) machine.

      Intel did drop their prices in order to compete with AMD, but they are still far more expensive than an AMD equivalent. Plus, of course, Intel can't compete directly with the high-end Zen 2 parts, as they have nothing to put up against the 64-core, 128-thread part.

    5. ChrisC Silver badge

      Re: $$$

      You know, I look at the price of high-end CPUs today and start to think the same, but then I cast my mind back 25 years to being quite happy to spend £849 to upgrade my Amiga with a 50MHz 68060, which was then the fastest processor option available... In today's money that's around £1650-1700, which in turn translates into around $2000.

      Back then it was certainly a significant outlay, but I barely thought twice about making it - being a computer enthusiast back then was an expensive business anyway, so the relative cost of the upgrade didn't feel like such a leap - whereas these days I'd certainly need to take a long run up at the "Buy now" button in order to persuade myself to spend so much the next time I upgrade my PC, no matter how tempting it might be to have that much processing power sat under the desk...

  8. Maelstorm Bronze badge

    AMD vs. Intel: War Games v3.0

    I'm not surprised by Linus's statement. AMD has always been the underdog to Intel, but with a superior product. I remember when the Athlon processor came out and cleaned Intel's clock. Furthermore, AMD chips, in general, execute instructions faster than Intel at a lower clock speed, thereby reducing heat and power consumption. Since AMD bought ATI, AMD has been placing GPU cores on the same die as the CPU. This takes a byte out of nVidia's CUDA, because having the GPUs on the same die as the compute cores means that the GPUs can get their data from the same high-speed bus that the compute cores do, without the PCIe bottleneck.

    1. LeoP

      Re: AMD vs. Intel: War Games v3.0

      "without the PCIe bottlekneck"

      This made me again realize what an old fart I am.

      1. Anonymous Coward
        Anonymous Coward

        Re: AMD vs. Intel: War Games v3.0

        > This made me again realize what an old fart I am.

        You didn't <GASP> /understand/ microchannel architecture, did you?

      2. Anonymous Coward
        Anonymous Coward

        Re: PCIe bottleneck

        ISA old fart too.

        Last week I just turned 0x40 and I smiled when I realized what a nice number it is.

        1. Anonymous Coward
          Anonymous Coward

          Re: PCIe bottleneck

          It is; they even made a famous song about it.

    2. Anonymous Coward
      Anonymous Coward

      Re: AMD vs. Intel: War Games v3.0

      yes, not really a surprise ...

      Back in the early '90s, the years of mono-core CPUs, I used to "make -j 3" to compile, in order to accelerate the process.

      I suppose Linus is probably now doing a make -j 64 on his new rig :)

      1. Michael H.F. Wilkinson Silver badge
        Happy

        Re: AMD vs. Intel: War Games v3.0

        I have a 64-core Opteron compute server at work (512 GB RAM is also nice), and I tried make -j 64 on a big code base and was very impressed to see it fly. In practice make -j 32 generally maxes out the speed, simply due to IO limitations, but it is fun to watch things compile and install really fast. The machine is getting old, so I hope to wheedle out funds for a replacement, and maybe while I am at it try to slip a 32-core desktop machine into the budget. Maybe I should connect with my inner BOFH to make this happen.

    3. Charlie Clark Silver badge

      Re: AMD vs. Intel: War Games v3.0

      > AMD has always been the underdog to Intel, but with a superior product.

      This has often but not always been the case and, as a result, AMD has struggled to raise the capital for investment. Nevertheless, it seems to have got a lot of things right over the last few years, while Intel has made several missteps.

      1. ben kendim

        Re: AMD vs. Intel: War Games v3.0

        Ah, I've 'heart'ed AMD since the 2901/2910/2930 bit-slice days. Good times writing microcode....

        Mick & Brick was an excellent book as well.

      2. Maelstorm Bronze badge

        Re: AMD vs. Intel: War Games v3.0

        Eh...I'll agree with you there. The thing about AMD though is that they definitely keep Intel on their toes.

    4. Cederic Silver badge

      Re: AMD vs. Intel: War Games v3.0

      Hmm. I disagree firmly that AMD have always had a superior product.

      Athlon 64 was a magnificent beast, and also the first time AMD had a clear gap above Intel. They kept it only until the Intel Core 2 Duo series, which gave Intel a price/temperature/performance lead over AMD.

      Now the pendulum swings again, and I welcome this. My next PC may well be AMD and not Intel, just as I switched the other way over a decade ago.

      I'll still go with Nvidia for my graphics though.

      1. magicaces

        Re: AMD vs. Intel: War Games v3.0

        Agree. Intel were superior for a long time but have now been pegged back by AMD. I owned an Athlon back then, switched to Intel with the Core Duos, and have bought Intel since, but now I would seriously consider a Ryzen CPU as my next one. Also agree on GPUs: I just hope AMD can release something to compete with Nvidia at the top end, as Nvidia prices are crazy right now with no one to compete with them.

        1. Adrian Midgley 1

          Ryzen for a few years on my

          Linux desktop machine.

          Goes nicely.

        2. Boothy

          Re: AMD vs. Intel: War Games v3.0

          AMD are competitive on the mid-range GFX front, with their 5700XT being on average somewhere between a 2060 Super and a 2070 Super (which itself isn't far behind the 2080 non-Super). Although their drivers have been a bit meh for the last year or so (black screen issues etc.), they seem to be working to fix those problems.

          Hopefully RDNA2 (the current 5700s use RDNA 1), which is due out towards the end of the year, will help AMD on the GFX front. So far RDNA2 in the new Xbox and PS consoles seems to be comparable, at least on paper, to the RTX 2080. Silicon has already been demonstrated for the PS5 (look up 'unreal engine 5 tech demo'). The expectation is that in a PC GFX card this should be even better than the console versions, as you don't have the same power or heat limitations. So who knows, maybe AMD will beat the 2080Ti at that point!

          Although it's also worth noting that nVidia isn't sitting idle: they have their new 7nm Ampere architecture coming out, with some speculating that the new 3000 chips will be quite a bit faster than the 2000 range, so it will be interesting to see what the new 3080 Ti looks like. All just rumours currently, though, on release dates; many think next year, but some think they might try for a Nov/Dec launch for Christmas sales. Who knows!

        3. NetBlackOps

          Re: AMD vs. Intel: War Games v3.0

          The AMD Fire Pro VII has seriously caught my attention, in a dual-card configuration, for my CAE and AI/ML workloads. Performance/$ seems to have a nice leg up on nVidia, especially in DP TFLOPs, and a nice TDP.

      2. Maelstorm Bronze badge

        Re: AMD vs. Intel: War Games v3.0

        The problem that I've had with nVidia was not the performance, but the reliability of their products. Every card that I've owned with an nVidia chip on it failed within 3 years. I'll go 10 years without replacing or upgrading the video card if it does what I need it to do. So for reliability I go for ATI, and they have decent performance too. Hell, I still have some old 3dfx cards inside one of my server boxes. And one machine has a Hercules card in it.

    5. Pascal Monett Silver badge

      AMD was the first to have a 1 GHz CPU, with the Athlon in 2000.

    6. Irongut Silver badge

      Re: AMD vs. Intel: War Games v3.0

      I've not run an Intel chip in my main workstation for over a decade, and I recommend AMD at work and privately, but you're spouting an awful lot of inaccurate fanboi bullshit which really doesn't help them.

      > AMD has always been the underdog to Intel, but with a superior product.

      Always the underdog yes, always superior no. See Bulldozer.

      > Furthermore, AMD chips, in general, execute instructions faster than Intel with a lower clock speed thereby reducing heat and power consumption.

      No they don't. See all the articles about Zen and Zen+ having fewer instructions per cycle (IPC) than Intel and the articles about the 15% improvement in Zen2.

      > Since AMD bought ATI, AMD has been placing GPU cores on the same die as the CPU. This takes a byte out of nVidia's CUDA because having the GPUs on the same die as the compute cores means that the GPUs can get their data from the same highspeed buss that the compute core do, without the PCIe bottlekneck.

      Because PCIe 3 and 4 are sooo slow. AMD have botched several recent generations of GPU so nVidia are faster for running compute jobs. See any GPU compute performance comparison article.

    7. magicaces

      Re: AMD vs. Intel: War Games v3.0

      Umm, before Ryzen that was not the case, but thankfully AMD have released a very good product and now have Intel at a disadvantage, which they are utilising well. Very good for the market.

      For the previous 10 years, though, Intel were superior, so I don't think you can say AMD had the superior product during that time at all; they were just the underdogs.

      1. Yet Another Anonymous coward Silver badge

        Re: AMD vs. Intel: War Games v3.0

        The difference is that Intel used to own the market for all the connecting chips.

        So you could build a laptop around AMD, but then Intel wouldn't sell you the SATA / WiFi / Ethernet chips for your server product line, so only cheap, crappy companies could risk upsetting Intel.

    8. P. Lee

      Re: AMD vs. Intel: War Games v3.0

      The AMD and Intel products are different.

      AMD scales for IO and has better scale-out on the desktop with Ryzen.

      Intel scales vertically, providing better single- and low-thread-count performance, which is good for games. They also focus on the very high-end compute-server stuff and seem to be a bit better on power consumption for laptops.

      My desktop concern would be VMware: I heard it isn't great on AMD (happy to be corrected), even if your compile and transcode jobs are great. I suspect Linus doesn't have a problem with corporate Windows SOE compliance though.

      Which is better depends what you want. AMD as a fixed-desktop workstation seems sensible, though I'd have thought a stonking server in the garage and a laptop would have been the way to go. I guess if someone pays you to have their best kit... :)

      (Caveat: it's been a while since I looked.)

      1. Anonymous Coward
        Anonymous Coward

        Re: AMD vs. Intel: War Games v3.0

        TL;DR: AMD has scaled out by cores rather than frequency as it's where the majority of the market share is. Intel are scaling for frequency because that's the only easy road they have until 10nm is fully functional.

        I'm using VMware on desktop and servers and haven't hit any AMD-specific issues with the hardware mix I have - YMMV. For large VM environments you can't mix-and-match Intel and AMD, but that's well known.

        For power usage, AMD is more efficient across their 7nm range, which is primarily desktop and server at present - laptops will get 7nm GPUs/APUs next month and we should see them in volume in September. Understanding power usage by CPUs is left as an exercise for the reader - TDP is a guide and may not represent what you experience.

        For gaming, Intel is likely to be the best option if you need every single ounce of performance and cost is irrelevant. For midrange gaming, AMD will likely provide more bang for the buck, and you can use the money you saved on the CPU for a GPU and a larger SSD, which are more likely to have an impact.

        For corporates? I suspect availability of Intel/AMD products will be the major driver - Intel availability has been limited over the last 6 months and if that continues when better AMD laptops appear later this year, I suspect even corporates will change. Slowly.

  9. don't you hate it when you lose your account

    While it will play Crysis

    I doubt he will.

  10. Pete 2 Silver badge

    Minimum spec?

    > 32 cores and 64 threads at 3.7GHz

    The problem with giving developers (does Torvalds count as a developer? probably not) ultra-high-spec kit is that it disincentivises them from writing efficient code. I understand that resource consumption is waaaaaay down the list of priorities for code mungers - probably even lower than fixing bugs or writing documentation. However, when we are told that Linux is "lightweight" and will run on ½GB of memory and 1GHz of CPU, it would be nice to have some confidence that this sort of box could do meaningful work.

    And what could be more meaningful than the head honcho adopting it for their daily lambasting of all around them?

    1. garden-snail

      Re: Minimum spec?

      Running the kernel and building the kernel are two very different workloads, the latter requiring far more horsepower. Most Linux users will never build their own kernel.

      You can guarantee that Linus has other, lower-powered devices for testing purposes, if only to keep his primary development and build machine(s) stable and clean.

    2. jonha

      Re: Minimum spec?

      > it disincentivises them from writing efficient code

      Writing efficient code is a mindset and has got nothing to do with CPU power. Writing inefficient code might well be company policy (as it's presumably churned out faster than efficient code) but that's a different kettle of fish.

    3. 9Rune5

      Re: Minimum spec?

      I don't follow your logic of giving the craftsman inferior tools while expecting superior results.

      But certainly, for most of us the profiler is an underused tool. Ironically, one of the reasons is that a profiler tends to be resource-hungry on even the strongest of irons we throw at it (depending on what settings you run the profiler with, of course).

      A bigger problem IMHO is the way we organize work. Where I work, our new head of R&D feels that one-week sprints are the way to go. (Go where? Bankruptcy?) Manglement often gets IT horribly wrong and most devs just go with the flow (myself included at the moment).

      1. Anonymous Coward
        Anonymous Coward

        Re: Minimum spec?

        It was certainly true where I used to work. Large screens for devs when the users had tiny screens, and LAN bandwidth when the users were across a WAN, were two recurring issues when new code got rolled out.

        1. stiine Silver badge
          Pint

          Re: Minimum spec?

          re: bandwidth and screensize

          You have nearly nailed it, neglecting only NotInventedHere syndrome.

    4. Irongut Silver badge

      Re: Minimum spec?

      How does making the parts of my job where I'm doing nothing other than waiting on a compiler take longer make my code better?

      Does writing with a piece of charcoal on stone make an author's work better?

      Does watching the paint dry make an artist's work better?

      1. Adelio

        Re: Minimum spec?

        Oh, that reminds me of my COBOL days. Change the code and submit a compile, then get the compile printout and check for any errors. Miss a full stop and get hundreds of errors!

      2. Tom 7

        Re: Minimum spec?

        I must confess that when I had to wait 20 minutes for something to compile, I'd often thought of a better way of doing it. I could say that was because I was learning at the same time as doing - but I'm still doing that now, 45 years or so after writing my first line of BASIC on paper, to be taken away by the computer club leader who then took it 20 miles to the nearest available computer.

        People, like electricity, tend to take the path of least resistance. It's definitely not the best way to get power to where you want it. Sometimes that extra compile time takes the pressure off your brain and lets it do something the computer can't. I've seen people hammer out hundreds of lines of code a day, which impresses the boss and stops them breathing down your neck all the time, but it was often re-engineered later. You may not be in that category, but I'd advise you to at least enjoy the chance of a chill out.

        And I dare say a charcoal sketch on a piece of stone by Picasso will likely fetch more than you or I will earn in a lifetime! And letting the paint dry makes it less likely to smudge and become useless.

    5. Anonymous Coward
      Anonymous Coward

      Re: Minimum spec?

      I worked somewhere a long time ago (a games company) that gave all developers low-spec machines for testing. (We had slightly higher-spec machines for the actual coding and compiling, and secondary machines for remote debugging/testing.)

      It didn't work, though we stuck with it - we were a small company with a limited budget. It was a painful experience: unoptimised (by the compiler) debug code always runs slower than optimised code, and algorithm optimisation only ever happens towards the end of the project, unless something is abjectly too slow to be pushed.

      For maximum developer efficiency, you need to make daily workflows as optimal and as streamlined as possible. And that includes fast hardware for development and testing. You will need to test on slower hardware eventually, but that slow hardware should never be your primary test environment.
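
      To put a number on the unoptimised-debug-code point, here's a throwaway benchmark of my own (nothing from that codebase; the loop is an invented stand-in) - compile the same file twice and time both binaries:

        /* bench.c - the kind of inner loop a debug build leaves untouched.
         * Try: cc -O0 bench.c -o slow && time ./slow
         *      cc -O2 bench.c -o fast && time ./fast */
        #include <stdio.h>

        int main(void)
        {
            double sum = 0.0;
            /* alternating harmonic series (converges to ln 2); the data
             * dependency keeps -O2 honest without it deleting the loop */
            for (long i = 1; i <= 300000000L; i++)
                sum += (i & 1) ? 1.0 / (double)i : -1.0 / (double)i;
            printf("%f\n", sum);   /* print so the result stays observable */
            return 0;
        }

      The -O0 binary typically takes several times longer, and that tax landed on every single debug run - which is exactly why primary testing on already-slow hardware hurt so much.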

      1. IGotOut Silver badge

        Re: Minimum spec?

        "I worked somewhere a long time ago (a games company) that gave all developers low-spec machines for testing. "

        Did they make MDK by any chance? That was how they achieved such good graphics and motion for the time.

        1. Anonymous Coward
          Anonymous Coward

          Re: Minimum spec?

          Nope, not us.

      2. swm

        Re: Minimum spec?

        Xerox wrote their STAR office products using the XDE (Xerox Development Environment). The development environment was blazingly fast but the actual user code was abysmally slow because the developers never used it.

    6. Charlie Clark Silver badge

      Re: Minimum spec?

      Please define efficient. Modern operating systems are very different beasts from the kind of control systems that were used for, say, early space exploration programs, where everything was constrained by the available memory. Even for FORmula TRANslation (writing the code that let computers do complicated mathematical equations), the advice was to use as much memory as you could. And once we got beyond simple mechanical computations it soon started to make more sense to let compilers and profilers do the optimising. Of course, that doesn't mean there isn't a load of inefficient crap in modern systems, just that it's differently inefficient!

    7. TheSkunkyMonk

      Re: Minimum spec?

      Trainee developers? No, they most certainly should get old kit, preferably some dumb terminals and vi, to teach them to use what limited resources they have. When they graduate and get that dev hat, then they get the well-deserved fancy new machine. Linux has certainly graduated, degree or not.

      1. Teiwaz

        Re: Minimum spec?

        Not bad at all, some of those old mainframe screen editors.

        I've been just as productive on those as on yer fancy-ass new-fangled IDEs; more so, perhaps.

    8. Will Godfrey Silver badge
      Linux

      Re: Minimum spec?

      Even on the small project I work on, I use my fastest machine for actual development. Only when getting near to a release do I cross-check with older and lower-specced machines - these days including a Raspberry Pi 3B. Occasionally I'll run it on an elderly EeePC 900 - or, to be more correct, I'll walk it.

  11. Tubz Silver badge

    Intel stifled PC evolution and ripped off customers, happy to notch up the speed here and there by a couple of hundred MHz while raking in the cash with over-the-top prices, until AMD stepped up, kicked them in the nutz and said this is how you do it. Ever since, Intel has been playing catch-up. Proof: look how quickly they increased the core count and speed boost on their products within months of AMD's releases. AMD isn't perfect, but I will never buy another Intel CPU unless they come up with something magical at sensible prices!

    1. Mark192

      Yup

      Also not helped by Intel's problems with its next-gen tech meaning they're trying to compete with updates to last-gen tech.

      Intel are big enough to come back strong and, if they can win on performance, will probably also try to win on price to snuff out AMD's revenue enough to limit future AMD R&D investment.

      Of course the big competition will, in time, be the Chinese chips... that's gonna shake things up some.

    2. naive

      Intel did that in collusion with the major, also US-based, software manufacturers like VMware, Oracle, MS etc.

      The Xeon CPUs always have a handy advantage in the metrics on which these software companies base their licensing fees; this puts AMD at a disadvantage in the datacenter, since licensing costs often exceed hardware costs by multiple factors.

  12. Wardy01

    This isn't news

    Prominent tech people upgrade their computers all the time.

    1. Mark192

      Re: This isn't news

      Unfortunately, because you both clicked on and replied to an article you disapproved of, the Register will check their metrics and see that it was popular and generated reader engagement.

      i.e. they're responding to perceived demand and YOU are the problem.

  13. Altrux

    It all started with the Am386

    For me, it all began with the mighty Am386. My first PC had the 40MHz version of that, which Intel couldn't match at the time (1992). Later, I had a tasty Athlon XP and Athlon64, before reluctantly reverting to Intel in the Core2 era. But now, the Ryzens are looking very tasty, and are pretty sure to be in my next machine.

  14. Fred Flintstone Gold badge

    AMD's new marketing slogan: "Intel outside"

    Come on, this one was obvious..

    :)

    1. AVee

      Re: AMD's new marketing slogan: "Intel outside"

      I used to peel off those 'Intel Inside' stickers when I came across them and stick them onto the nearest trash can...

      1. GrumpenKraut
        Angel

        Re: AMD's new marketing slogan: "Intel outside"

        I had one of those stickers on my toilet lid for years.

      2. Doctor Syntax Silver badge

        Re: AMD's new marketing slogan: "Intel outside"

        I was working on one piece of kit and didn't notice the sticker had peeled off until I found it had stuck itself onto my screwdriver set. I just left it there for years.

    2. Doctor Syntax Silver badge

      Re: AMD's new marketing slogan: "Intel outside"

      Cyrix had "Cyrix instead".

  15. Netgeezer
    Happy

    How much?

    I can remember building an AMD-based system a while back - an Athlon XP 3000+ with the Barton core - all in the name of saving a few pennies.

    I don't think I'll be able to save much with a 3970x...

    1. GrumpenKraut

      Re: How much?

      > I don't think I'll be able to save much with a 3970x...

      Have you compared it to some (roughly) equivalent Intel offering?

  16. SuperGeek

    They say you can tell a man....

    By the PC he runs. And Linus's is just like him, and his attitude. Overpowered. Linus, you ain't THAT great. I hold Berners-Lee in a higher regard than you. At least he ain't an arrogant son of a bullying bitch. Hey Linus, where's your husband, Steve Jobs? Oh, right. Dead. Where bullying attitude deserves to be. 6 foot under.

    1. Boris the Cockroach Silver badge
      Windows

      Re: They say you can tell a man....

      Use your anger, let the hate flow through you and you will be my apprentice to the dark side forever

      1. Anonymous Coward
        Anonymous Coward

        Re: They say you can tell a man....

        Perhaps he had a pull request rejected at some point. :-)

        1. Anonymous Coward
          Anonymous Coward

          Re: They say you can tell a man....

          Probably suffers that too in RL...

    2. Doctor Syntax Silver badge

      Re: They say you can tell a man....

      There, there. Lockdown getting to you? There's a chance schools might reopen before the end of term.

  17. doggybreath

    ThreadRipper rocks?

    Sorry, "rocking" is from a different generation than Torvalds.

    But the 3970X is absolutely amazing nonetheless. (The 3990X even more so, but not as efficient with the $$)

    1. werdsmith Silver badge

      Re: ThreadRipper rocks?

      “Rocking” is an old, old expression. And Linus is an older sort of guy, being 50.

      But at least you apologised.

  18. TheSkunkyMonk

    Sure, they change the socket 90% of the time just to force us into that extra upgrade, not because some new chip model actually requires it :( They really should be pushed into giving us only one model a year, the latest and greatest, not planning out product lines 10 years in advance. Remember, Moore's law is a business model, not an actual scientific law. Breakthroughs happen overnight.

    1. Alan Brown Silver badge

      "Sure they just change the socket 90% of the time just to force us into that extra upgrade and not because some new chip model actually requires it"

      Intel: yes

      AMD: no

      Even to the point where the latest AMD APUs use the same AM4 socket but can't be run on older boards due to BIOS constraints (the older APUs can only address 16MB in real mode), and vice versa. There are some kludged workarounds coming to allow it on enthusiast boards, but they involve serious hackery of the underlying codebase to make it all fit.

      Intel would change the socket and force the issue

      1. Ropewash

        Socket != chipset

        AM4 is just a pegboard and the pinout circuitry.

        What you are thinking of is the chipset - B450 and X470, to be precise, are the ones that need some kludging to allow for the Zen 3 parts. It's not really the BIOS size in general, just that they will need to remove a lot of existing CPU definitions to make room for the new SKUs, which would result in market fragmentation - hence why they tried to avoid it by specifying only X570+.

        Also note that the A320-chipset AM4 boards cannot be used with Zen 2 parts.

        So you still need to swap out boards for AMD as well, but at least they tried to keep it to a minimum.

  19. Jolyon Ralph

    Ryzan Threadripper

    I'm sure Ryzan Threadripper was a character in Game of Thrones

  20. Teiwaz

    It's a sensible move

    Nvidia GPUs still suck on Linux, so if you're opting for a sensible AMD GPU, you might as well go for the AMD CPU too.

  21. jelabarre59

    why Arm?

    Not migrated to an ARM processor? Why not just move to a Power9 system?

  22. TeeCee Gold badge
    Happy

    The Anandtech review of that CPU is a scream. It opens with some blurb about how odd it is to see a premium-priced processor from AMD, then puts it up against Intel's top-end i9.

    The summary starts with: "I don't like to use the word 'bloodbath', but...", then goes on to rub salt into the wounds by pointing out exactly why you'd be mad to consider the Intel product. Finally, it closes with a message to Intel that, if they were thinking of cobbling something together and rushing it out to compete, tough shit 'cos AMD have a 64-core one about to ship.

    It probably sucks even more for Intel that their most puissant offering got its clock cleaned even when set up in the "please pwn me" mode of Hyperthreading on.....(!)
