At 140 Watts...
Could it be used to keep a supply of water hot for tea? Seems a shame not to use the considerable heat.
Intel's Core range of CPUs now comes in a new "family" and has a new upper limit. Unveiled at the Computex trade show in Taiwan, the new family is dubbed the “Intel Core X-series processor family”. One member of the family is the Core i9 Extreme Edition, which takes the Core range's upper limit from a “7” to a “9”. The …
The bad news is that you need Tcase for your Intel tea maker, not Tjunction. A reasonable guess is about 70°C, which is sufficient for green tea. Heating time for a well-insulated kettle is 4200 × (Thot − Tcold) × mass / power (seconds, with mass in kg). Cold water from the tap is about 10°C, so a 140W processor makes 1kg of hot water every half hour, or one 250ml cup every 7.5 minutes. Green tea is usually served in smaller cups, so you can have a fresh cup every 4.5 minutes.
It looks like Intel have found a killer app that totally trashes the Raspberry Pi, which would take over two hours to make a cup of green tea.
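For anyone who wants to check the maths, here's a quick back-of-the-envelope sketch; the temperatures, cup sizes and the ~4 W Pi figure are my assumptions:

```python
# Time to heat water using only a CPU's thermal output.
SPECIFIC_HEAT = 4200    # J/(kg*K) for water
T_COLD, T_HOT = 10, 70  # deg C: tap water in, roughly Tcase out

def heat_time_minutes(power_watts, mass_kg):
    # seconds = 4200 * (Thot - Tcold) * mass / power, converted to minutes
    return SPECIFIC_HEAT * (T_HOT - T_COLD) * mass_kg / power_watts / 60

print(heat_time_minutes(140, 1.0))   # 30.0  -> 1 kg per half hour
print(heat_time_minutes(140, 0.25))  # 7.5   -> one 250 ml cup
print(heat_time_minutes(140, 0.15))  # 4.5   -> one 150 ml green-tea cup
print(heat_time_minutes(4, 0.15))    # 157.5 -> over two hours on a ~4 W Pi
```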
It's not actually that bad; the worst recently were AMD's 9xxx series at 220W. These chips fit a lot of cores into a 140W thermal envelope and restrict the clock speed to stay within it.
You might like my new system: dual E5-2690s (v1, i.e. old (2012), quite fast, cheap-ish, lots of cores) at 135W each, with two GTX 480s flashed to Quadro 6000s. Those are 250W GPUs, so if everything is at full chat it'll be drawing over a kW.
Yes, I did buy them before Ryzen was out, in case anyone asks. Ryzen isn't as good as the Intel alternative, but at half the price it's far more than half as good (unless you're using VME, which is currently broken on it).
It's due to yield. AFAIK, when a chip is created, not all the PCIe lanes pass validation. If too many fail, that chip is bumped down a notch.
There will probably be another version of the 7740X, probably called the 7760X or some such, with the same core count and clock speed but full-fat PCIe.
For a similar historical situation, have a look at the 5820K and the 5930K: identical chips, but the former had failed validation on more lanes and so had reduced PCIe capability.
This is spot on. My game uses 1 (ONE) core of my expensive i7. The rest is idling around.
All these cores for desktops are nonsense. They'd do better to use the transistor space on the CPU die to make a superfast two-core chip with huge L1/L2/L3 caches, so memory accesses can be minimised.
Strange that nobody is thinking about this; instead they keep making CPUs built for parallel processing that no software maker can do anything useful with.
As do I; Handbrake will use every drop of processor you can feed it when transcoding a 25/50 GB Blu-ray rip down to something manageable in size and in a reasonable amount of time.
My current machine (i7 6700) manages it in ~20 minutes, whereas the last machine (an old Precision 5500) was more or less real time (2 hours plus).
I figure I should get ~8 years' service out of it, which is what I got from the last set of computing hardware I bought.
"This is spot on. My game uses 1 (ONE) core of my expensive I7. The rest is idling around."
Some games do use all cores; have a look at Hitman (2016), which can bottleneck at the CPU level even with a very good GPU.
And this is going to become more and more common, with more and more games going multi-threaded now that the GPU APIs (DX12, Vulkan) allow it.
Start up your PC and let it boot into the OS. Now, without manually starting up *any* apps, games or whatever else you might use your PC for, open up the task manager and see just how much stuff is already running in the background...
Your favourite game might be so badly coded that it genuinely can only use a single core, but even then your gaming experience will be enhanced by having additional cores available to handle all the other crap that a modern PC will want to be running at the same time. Oh sure, for each specific workload there'll always be a question over whether x cores at y GHz vs (n*x) cores at (y / m) GHz gives the best performance, but the long term trend seems to be heading straight down the road signposted "More Cores Please".
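To put very rough numbers on that x-cores-at-y-GHz question, here's a minimal Amdahl's law sketch; the 80% parallel fraction is a made-up assumption and real workloads vary:

```python
# Amdahl's law sketch of "x cores at y GHz vs n*x cores at y/m GHz".
# The 80% parallel fraction is an assumption for illustration.
PARALLEL = 0.8

def relative_throughput(cores, clock_ghz):
    # clock speeds up everything; extra cores only help the parallel part
    speedup = 1 / ((1 - PARALLEL) + PARALLEL / cores)
    return clock_ghz * speedup

print(relative_throughput(4, 4.5))   # ~11.3: four fast cores win here
print(relative_throughput(18, 2.6))  # ~10.6: eighteen slower cores
# at PARALLEL = 0.95 the 18-core part pulls well ahead instead
```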
Personally speaking, I can't wait to see these multi-core beasts hit the market, so long as the renewed level of competition between Intel and AMD keeps prices at a sane level - I could really do with refreshing my desktop system at some point in the next year...
"This is spot on. My game uses 1 (ONE) core of my expensive I7. The rest is idling around."
Quite - this seems mostly overkill. Scorpio's custom 8-core AMD CPU only runs at 2.3 GHz and can max out a 6 TFLOPS GPU (roughly equivalent to an Nvidia GTX 1070).
Haven't benchmarked Scorpio yet, but a six-core XB1 build roughly matched the same game running single-threaded on an FX-8370, without the benefit of DX12 parallelism. I'm expecting Scorpio to compete with two FX cores. So huge core counts aren't needed by game players.
Building games, though, I much prefer five-hour Ryzen clean-build times with 8 cores/16 threads to nine hours on FX with four core-pairs/8 threads at a much higher clock and power drain. A 16-core Threadripper will probably be the sweet spot before diminishing returns for my workloads.
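A crude Amdahl fit to those two build times (ignoring the clock difference between the FX and Ryzen boxes, so treat it as illustration only) suggests why:

```python
# Fit T(n) = serial + parallel/n to the build times above:
# 9 h on 8 threads (FX), 5 h on 16 threads (Ryzen).
t8, t16 = 9.0, 5.0
parallel = (t8 - t16) / (1/8 - 1/16)  # 64 "thread-hours" of parallel work
serial = t8 - parallel / 8            # ~1 hour that never parallelises

for n in (8, 16, 32, 64):
    print(n, serial + parallel / n)   # 9.0, 5.0, 3.0, 2.0 hours
```

Doubling from 16 to 32 threads saves two hours; doubling again saves only one. Hence the sweet spot.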
"Haven't benchmarked Scorpio yet but a 6 core XB1 build "
Microsoft have described Scorpio as a "full custom CPU design", so I would assume it's likely a fair bit faster than the 31% clock-speed uplift might indicate... Sony, meanwhile, went with non-customised Jaguars on the PS4 Pro.
Hear, hear! I would also be perfectly happy with a higher-GHz 4-core (8-thread) CPU with more PCIe lanes! 44 lanes? I can fill that easily (rough tally below). I would love to RAID5 four M.2 cards as my primary drive and RAID10 four SATA spinning-rust drives for document/data storage. Add another SATA port for a Blu-ray player. Add two GPUs in SLI with a full 16 lanes each. Of course you have all of those shiny new USB3 controllers, right? Ethernet. Audio. And... Gah! Never enough lanes.
Oh, sure, there are a lot of services running in the background and blah blah blah, but most things don't really use much CPU. I have plenty of processing power for everything I do. A "standard" gaming rig is bad enough, but hard drives are depressingly slow, and as soon as you try to counteract that, splat: faceplant into the old PCIe wall.
For too long I've been designing systems around which compromises I'm willing to suffer. For once I'd love to build a system where everything I want just works the way it should. No compromises.
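Here's the rough tally I mentioned; the per-device lane counts are typical figures I've assumed, not any particular board's allocation:

```python
# Hypothetical PCIe lane tally for the wish list above.
wanted = {
    "2x GPU in SLI @ x16":       2 * 16,
    "4x M.2 NVMe (RAID5) @ x4":  4 * 4,
    "SATA for RAID10 + Blu-ray": 4,
    "USB3 controllers":          4,
    "Ethernet + audio":          2,
}
print(sum(wanted.values()), "lanes wanted vs 44 from the CPU")  # 58 vs 44
```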
Intel trickle-feeds us a few megahertz here, a few there, a couple of cores here, a couple there, & charges handsomely for the "privilege".
AMD tosses us a CPU with more cores & arguably better performance at a far lower price, & suddenly Intel scrambles to offer "something better".
Coincidence? I think not.
I love the fact that the big-name computer vendors either currently offer, or "plan on offering soon", AMD-based desktops & laptops for customer choice. I wonder what price differences we can expect to see in such systems, given Intel's propensity to make our megahertz mega-hurts?
(I'm not sure about that pun, but I'll leave it because it's poking fun at chipzilla.)
Competition: it's the lube the customers use to get corporations to get up off their ass & start offering REAL value for our money!
Every time I have looked at a laptop with AMD inside, it has been underpowered and overpriced, and the desktops have been underpowered and overpriced too. That is not AMD's fault, but if manufacturers and resellers fumble the ball again it will be difficult for AMD to get the market share their products deserve.
"That is not AMD's fault but if manufactures and resellers fumble the ball again it will be difficult for AMD to get the market share their products deserve."
Tbh, it really was AMD's fault. Prior to Ryzen, their most recent competitive offering in the desktop space was probably the Athlon 64 from 2003. They ended up in marginal desktops for the last decade or so because desktop Opteron and onwards genuinely were marginal processors, running years behind Intel's Core series. Performance-wise, a high-end machine had to have an Intel chip.
It's just the pattern in the processor market, really; Intel overwhelmingly dominant at almost all times but with AMD injecting a tiny bit of competition every 15 years or so.
Well, not entirely. AMD have had several brief tilts at superiority, just as you say, but they also had one very long period of clear superiority on almost every metric. This was back when clock speeds were moving through (roughly) the 800MHz to 3000MHz range. Intel's Pentium III was reasonably competitive but way too dear; the Pentium 4 was hopelessly outclassed for its entire market life, and as for Intel's wrong-headed fetish for the disaster called Rambus, the less said the better.
Intel's then-new Core chips levelled the playing field, and the Athlon's replacements were pretty sad efforts. As you were.
@Nauelus This laptop retailed for £399 when it was new:
http://www.currys.co.uk/gbuk/computing/laptops/laptops/hp-14-an060sa-14-laptop-silver-10156921-pdt.html
and if you look at notebookcheck:
https://www.notebookcheck.net/AMD-E-Series-E2-7110-Notebook-Processor.144996.0.html
That CPU is only fit for a netbook. I've seen A4 laptops for £350 and £399 as well, which is just crazy money.
@James 51 - not sure what your point is? That AMD were able to churn out shit processors for shit notebook computers while Intel were dominating all the actually profitable spaces quite totally? You can pick up an i3 laptop for £400 which will comfortably outperform the E-series on more or less every metric.
Don't get me wrong - when AMD do deliver, they produce great equipment at amazingly low prices. But the general rule for probably 25 of the last 30 years has been that an AMD processor line is inferior to an equivalent-spec Intel processor line (with the occasional very honorable exception). They ended up in low-end kit because by definition they cannot be present in high end kit - a laptop with an AMD processor in it is a cheap laptop with low-end hardware because the AMD processor IS cheap low-end hardware.
It strikes me that "Core i11" is far too close (depending on font) to "Core ill", and Intel would want to avoid sick jokes. Similarly, "Core i13" is going to trigger too many superstitions to be a good name. If they do introduce a new level, either they'll have to switch to hex ("Core iB") or, given the random fits of pointy-headedness that tend to strike marketing departments, they'll go for a complete image change, call it something like "Thrasher 42", and baffle everybody.
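Purely for fun, here's what the hex route would look like (illustration only, obviously not Intel's plan):

```python
# Hex tier names sidestep both "ill" and unlucky 13.
for tier in range(9, 14):
    print(f"Core i{tier:X}")  # Core i9, Core iA, Core iB, Core iC, Core iD
```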
"an 18-core, 36-thread beast may well excite some workstation buyers"
Probably not, given that it's been possible to get 22-core Xeon processors for some time now. It will be nice when Kaby Lake finally comes to Xeon properly, but I can't imagine anyone buying a single-socket-only gaming CPU that supports a maximum of 64GB of non-ECC RAM and has a quarter of the cache, and actually considering it a workstation.
Speaking of which, I'm pretty sure these are Kaby Lake, not Skylake as stated in the article. Certainly Intel's website suggests that's the case, and I can't imagine why they'd be releasing new Skylake parts at this point.