At this point, if you're in the market for a new PC/server, your best option is AMD, and that will remain so until Ice Lake is released.
> "I don't talk to customers about nanometers," said Shenoy. "At the end of the day what they care about is delivered system-level performance...Our roadmap and the products we're putting forward gives us confidence we'll continue to win."
But customers do care about TDP, price/performance ratio, platform features (like PCIe lanes and memory speed) and the number of cores per socket, and, sorry, Intel, you're royally f*cked: there's this interesting Threadripper 2990WX, and AMD Epyc Rome is on its way. No wonder you fired Brian Krzanich, but that was too little, too late. The ship ran aground. Luckily for you, it's only temporary.
Releasing the fourth iteration of the Skylake uArch (the Intel Core i9-9900K, Core i7-9700K and Core i5-9600K CPUs) would have seemed like a crazy idea several years ago; now it's reality.
I applaud the researchers, but there's one thing I cannot understand at all: Intel was informed about some of these vulnerabilities (Meltdown and Spectre v1/v2) at least a year ago, and to this date they don't have a single CPU where these vulnerabilities are mitigated in hardware. And according to rumors and leaked documentation, they'll soon release yet another Skylake iteration.
Re: This is why science rocks
However much you're upvoted, you must accept the scientific fact that placebo ... works. And if placebo works, homeopathy "works" too. That doesn't mean I condone the people and companies selling homeopathic drugs for obscene sums of money.
I could also talk about how the brain directly affects how the body functions, how some people can raise their body temperature at will, and many other still poorly understood things:
OK, I should have been more specific: we created computers resembling modern PCs roughly 30 years ago. To be precise, the first PC was released in 1981. And, of course, the first digital computer was created shortly after WW2, but its performance and usability were so lacking that I wouldn't call it a "computer" ;-)
Can we please drop the "I" from "AI", because there's little if any intelligence in any existing AI algorithm. I'd really love "AI" to be replaced with "automation" or "fuzzy algorithms", because that's what it is.
People are throwing "intelligence" around as if we understand what intelligence is. We don't. There isn't even a universally accepted definition of it.
Worst of all is, of course, the fact that people believe the brain is a computing device which processes information. We do NOT know that. It's akin to medieval people saying that the Sun is burning. Yes, the Sun is "burning", but it's not a chemical reaction; in fact it has nothing to do with chemistry.
Likewise with the brain: we see what it does, we created computers roughly 30 years ago, and there's some resemblance between what we do and what computers do, therefore the brain must be a computational device. That inference is false.
Um, that's gonna be:
* Intel Core i5 2500
* 16GB DDR3 1600MHz RAM in dual channel mode
* Kernel built in a RAM disk (which pretty much negates any I/O-induced slowdowns for this particular use case - we're talking about compilation)
which shows that performance in my case is limited only by the CPU.
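To get a feel for why a compile job like this is so sensitive to the patches: KPTI (the Meltdown fix) taxes every user/kernel transition, and compiling thousands of small C files means constant syscall traffic. Here's a minimal Python sketch (the helper name `syscalls_per_second` is my own, not any standard tool) that measures cheap-syscall throughput; comparing the number with mitigations on and off gives a rough feel for the tax on a given CPU:

```python
import os
import time

def syscalls_per_second(duration=1.0):
    """Count how many cheap getpid() syscall round trips complete per second.

    KPTI adds overhead to every user/kernel transition, so
    syscall-heavy workloads (like compiling lots of small files)
    are hit hardest; this loop isolates that transition cost.
    """
    count = 0
    deadline = time.perf_counter() + duration
    while time.perf_counter() < deadline:
        os.getpid()  # a near-trivial syscall round trip
        count += 1
    return count / duration

print(f"~{syscalls_per_second(0.5):,.0f} getpid() calls/sec")
```

It's a crude proxy, not a benchmark of the compiler itself, but the relative drop between a patched and an unpatched boot is what matters.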
And this "new" "full" info adds almost nothing to my initial configuration, because the only thing which severely affects performance here is your CPU and its architecture. I like how people give me thumbs down - it shows how little they've read about Meltdown & Spectre and how differently CPU generations and vendors suffer from the fixes.
This performance "review" is worth shit without knowing the exact hardware configuration.
Let me give you some really worthwhile data: I have an Intel Core i5 2500 CPU, and Linux kernel compilation (lots of small C files) slowed down by at least ~35% after applying the Meltdown/Spectre patches (I'm running Linux kernel 4.11.14).
Let me remind everyone here that if your CPU is older than Intel Skylake (Kaby Lake and Coffee Lake are the same uArch) / AMD (Ry)Zen, then your performance might suffer a lot, a whole lot.
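If you want to check what your kernel thinks about your CPU, here's a small Python sketch (the helper name is mine) that reads the sysfs vulnerability reports. Note that this interface only appeared in Linux 4.15, so on older kernels like the 4.11 above the directory simply doesn't exist and you get an empty result:

```python
import glob
import os

def mitigation_status():
    """Return {vulnerability_name: kernel status line} from sysfs.

    /sys/devices/system/cpu/vulnerabilities/ appeared in Linux 4.15;
    on older kernels the directory is absent and this returns {}.
    """
    status = {}
    for path in glob.glob("/sys/devices/system/cpu/vulnerabilities/*"):
        try:
            with open(path) as f:
                status[os.path.basename(path)] = f.read().strip()
        except OSError:
            pass  # unreadable entry; skip it
    return status

for name, state in sorted(mitigation_status().items()):
    print(f"{name}: {state}")
```

Entries read "Not affected", "Vulnerable" or "Mitigation: ...", which is exactly the per-uArch difference being argued about here.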
While I agree that current AI is hundreds of light years away from AGI, I hate it when people throw mind, consciousness, thought and knowledge into the mix. Let's be honest and admit that we don't have the slightest clue what these things are, or whether they're even required for AGI to exist.
And to all those people who keep fearmongering about AGI or being scared of AGI: please read this article https://en.wikipedia.org/wiki/OpenWorm
After studying the simplest nervous system on Earth (302 neurons) for several years we still don't understand how it works.
Seriously, who cares?
Windows Mobile, along with its ugly 8-bit tiles, has been an abomination from the get-go and deserved to die. Too bad parts of this abomination found their way into Windows 8/10, and now people have to "enjoy" moronic apps on their desktops, including the UWP'ized Start Menu and the maimed, ugly, cr*ptastic Control Panel, a.k.a. PC Settings. And Microsoft keeps persisting with UWP despite failing spectacularly in mobile.
I've been talking about that ever since Windows 8 was released.
Design created for the sake of extreme simplicity without also being functional leads to disastrous results.
More on that here: https://tech.slashdot.org/story/17/01/27/1425205/ask-slashdot-a-point-of-contention---modern-user-interfaces
Re: What's the answer?
There are better ways to fight malware which still allow you to run any software you want or need. For instance, do it the way iOS/Android does: every app runs in its own sandbox. Of course, sandboxed apps can still interact with the kernel and penetrate it, but that's relatively rare and can be fixed fast. When that's not enough, you can run every app in a VM (though that's not a complete panacea, since hypervisors also contain vulnerabilities).
And if that's still not enough for you, you can run a potentially hostile app in a VM on a separate PC in a separate network segment, while you access this VM only via RDP/VNC, which is 100% secure.
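As a toy illustration of the confinement idea only - real sandboxes (the iOS/Android app model, VMs) also isolate the filesystem, network and kernel attack surface, and the helper name and limit here are mine, not any particular sandbox's API - here's a Python sketch that at least caps what a child process may consume:

```python
import resource
import subprocess
import sys

def run_confined(cmd, cpu_seconds=5):
    """Run a command with a CPU-time rlimit applied in the child.

    A crude, POSIX-only illustration of per-app confinement:
    the limit is installed in the child just before exec, so a
    runaway process gets killed by the kernel after cpu_seconds.
    """
    def apply_limits():  # executes in the child process
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))
    return subprocess.run(cmd, preexec_fn=apply_limits,
                          capture_output=True, text=True)

result = run_confined([sys.executable, "-c", "print('hello from the cage')"])
print(result.stdout.strip())
```

Resource limits are only one layer; a real sandbox would add namespace/filesystem isolation on top, which is exactly what the iOS/Android model provides out of the box.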
I'm with Zuckerberg here. Currently we have no freaking idea what mind, consciousness and intelligence are, and the algos we've come up with so far are very advanced algos for solving very narrow tasks. And they often suck at what they do - much-touted image recognition requires terabytes of data and fails at recognizing very unusual things toddlers have no trouble identifying right off the bat. Many computer scientists reckon we'll soon have another AI research winter.
There's very little if any progress towards AGI. We may all sleep well for at least 200 more years.
This is ridiculous.
Go and look at C:\Windows\WinSxS, which is filled with literally _gigabytes_ of libraries you'll never need (like localized versions of many core libraries, even though you don't intend to install any language pack), while a lightning-fast, super-easy-to-use application which weighs less than 1 megabyte gets removed.
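If you want to put a number on that bloat yourself, here's a quick Python sketch (the helper name is mine) that totals a directory tree, skipping anything it can't read:

```python
import os

def dir_size_bytes(root):
    """Sum the sizes of all readable files under root.

    Point it at C:\\Windows\\WinSxS (run elevated) to see how many
    gigabytes the side-by-side store reports. Caveat: WinSxS makes
    heavy use of hard links into System32, so a naive per-file sum
    like this one overstates the true on-disk cost.
    """
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root, onerror=lambda e: None):
        for name in filenames:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # vanished or permission-denied file; skip it
    return total

print(f"{dir_size_bytes('.') / 2**30:.2f} GiB")
```

Even accounting for the hard-link caveat, the store routinely reports multiple gigabytes, which is the point being made here.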
Microsoft has lost touch with reality.
> "despite Firefox getting much better during the same time window"
Over the past two years you've killed more add-ons than over your entire prior history. Many people used to choose Firefox solely for its add-ons, and now a lot of those add-ons have ceased to exist thanks to Mozilla's relentless efforts.
The multiprocess conversion should have started at least 14(!) years ago, just when efficient multicore CPUs became a commodity (Athlon 64, 2003). Had you done it back then, you wouldn't have let down the thousands of add-on developers who are now simply fed up and have parted ways with you.