
No thanks. Still no reasonable Linux driver for the original one.
Doubt they'll get round to a decent Linux driver for the new ones for at least two or three years.
Intel is getting increasingly serious about integrated-graphics performance, and to prove it they've done what any self-respecting marketeer would demand: they've rebranded their top-performing parts. Meet "Intel Iris Graphics", slated to appear in the high-end versions of Chipzilla's soon-to-be-released "Haswell" chippery …
Would be nice to have decent Windows drivers too for most of their silicon: at least some that support OpenGL and similar stuff.
And where are the actual specs for these new chips, rather than meaningless graphs without any context, detail or background? Then proper comparisons could be done, instead of this all-singing fashion show of dressing up old stuff in new clothes.
Ever since the i740, Intel has promised "amazing oomph", and each time they have failed to come even close to the competition in anything related to 3D (gaming or design), due to both poor performance and poor driver quality.
Their products are perfectly fine for regular desktop usage, and I'm typing this on a machine with Intel graphics.
Intel has a workforce many times bigger than AMD and Nvidia combined, and they're working on bleeding-edge CPUs, NAND, lithography and so forth, yet they are constantly years behind AMD/NV products if you measure the gap in pure performance. Why is this?
Why? Simple: they don't have to be better to win, just good enough. More cash comes from the large majority that just want a display, not to play video games. Also, don't forget that their competition is casting a wicked eye at them by pushing GPU cycles, cores, and math. If Intel sits still, they could lose the desktop/mobile market once and for all, and we could all be whisked away to CUDA-ville.
No matter. If the "Intel HDA" driver or its proprietary cousins actually performed in a non-ass way, I'd care about this news. However, given that the driver might as well ship blacklisted, Intel is wasting its time.
What I am going to be interested in is the bang per buck. How much would an equivalently performing combo of an Intel CPU plus an Nvidia or AMD GPU cost for a laptop? If the Intel chip is cheaper, then it becomes very interesting for users who don't do any serious 3D gaming or major CAD work.
If it is more expensive though, or years behind, then I agree that something is amiss.
Why would they bother? Desktop users with graphics performance as a primary concern have plenty of Nvidia and AMD video cards to choose from.
Intel is much more interested in design wins where power and physical volume are the driving factors. The return on investment is far better for enabling better graphics performance with decent battery life in a notebook than for doing anything on the desktop beyond cutting video. And as long as the corporate sector is satisfied with Intel's latest, which is still an improvement over the Ivy Bridge GPUs, they will continue to own more desktops than AMD and Nvidia combined, by a huge margin. If a cubicle drone can get Skyrim to play decently on his Intel-only box, bonus!
Silicon Graphics, commonly abbreviated to SGI for as long as I can remember, may have a case there. Given that these chips will be featured alongside Apple's Retina displays, expect Rackable's baby to get sued by Apple any time now over its previously infringing workstations.
Seriously, who are they kidding? Why not claim they are seventy-five HUNDRED (7,500) times faster than the ViRGE, the 3D decelerator? While they're at it, why not point out that their Core i7 CPUs are several hundred thousand times faster than the 8088?
75× rubbish is still rubbish, just more of it. Their drivers are bad, and the performance is in the basement compared to integrated GPUs from AMD. They could be on to something with Iris, but only if the competition had stood still for the last five years. Wake-up call, Intel! You are NOT competing with a 2006 chipset-integrated Radeon or GeForce! You're going to compete with 2014 APUs, which are going to include hUMA (which for most users will mean PS4-like GDDR5 system memory). Your GPU may well be 75 times faster than in 2006, but AMD's GPUs made more improvement over the last 7 years, and you are not going to fool anyone.
> you are not going to fool anyone.
Er, who do you think they are trying to fool? HD 4000 is already plenty good enough for anything other than more recent games and CAD work, handles transcoding quickly enough, and is happy to run a few monitors and decode some full HD video.
Gamers and CAD users know their own needs, and will usually buy a machine with discrete graphics hardware, after having researched benchmarks, game frame rates and any reports of driver issues.
Seconded.
In any case, AMD checkmated everyone in the GPU integration game by making it cache coherent in the announcement of their next GPU. That is not just "faster", it is differently faster: GPU ops no longer carry the data-copy latency usually associated with them, and the GPU becomes one enormous co-processor.
Everything else (including what Intel does) is bundling and bill-of-materials savings. A 75-times-faster snail is still a snail.
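To make the "differently faster" point concrete, here is a minimal sketch of what coherent shared memory looks like to a programmer, using OpenCL 2.0 fine-grained shared virtual memory as a stand-in for hUMA-style hardware. It assumes ctx, queue and kernel are an already-initialised OpenCL 2.0 context, command queue and built kernel on a device that supports fine-grained SVM (error checking omitted); the point is simply that the CPU and GPU dereference the same pointer, so the explicit staging copies and their latency disappear.

#include <CL/cl.h>

/* Sketch only: assumes an OpenCL 2.0 platform with fine-grained SVM support,
   and that ctx, queue and kernel have already been created elsewhere. */
void run_on_coherent_gpu(cl_context ctx, cl_command_queue queue,
                         cl_kernel kernel, size_t n)
{
    /* One allocation visible to both CPU and GPU; no separate device buffer,
       no clEnqueueWriteBuffer/clEnqueueReadBuffer round trips. */
    float *data = (float *)clSVMAlloc(ctx,
                                      CL_MEM_READ_WRITE | CL_MEM_SVM_FINE_GRAIN_BUFFER,
                                      n * sizeof(float), 0);

    for (size_t i = 0; i < n; i++)               /* CPU writes straight into the buffer */
        data[i] = (float)i;

    clSetKernelArgSVMPointer(kernel, 0, data);   /* hand the same pointer to the GPU */
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &n, NULL, 0, NULL, NULL);
    clFinish(queue);

    /* CPU reads the GPU's results back through the same pointer; the hardware
       keeps the caches coherent, so there is no copy step and no copy latency. */
    float first_result = data[0];
    (void)first_result;

    clSVMFree(ctx, data);
}

On a discrete card the equivalent code needs a device-side buffer plus explicit copies in both directions, and it is that round trip, not raw ALU throughput, that the "enormous co-processor" argument is about.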
A simple glance at the diagram shows that they're comparing it to the 2006 series, which was the first Core 2 Duo (codename Conroe). Reading graphs is not rocket science :/
What is amazing, though, is that the units have decreased TDP yet increased overall performance, meaning more performance per watt. From the demos I've seen, these processors are serious about gaming and any 3D application (they play BF3 and MoH just fine). You can watch ~20 720p videos at once without breaking a sweat (under 30% CPU load), if your SSDs can push the data through. Given that Intel didn't buy ATI like AMD did, they do have a steeper learning curve...
In any case it will be interesting to see just how well they fit into the tablet market, where you don't need a high-end workstation but a combination of on-demand performance and long battery life.
@Steve.T: Reading comprehension, man. It was obviously irony. Should I have used HTML5-compliant <sarcasm> tags?
They can be used for light gaming, assuming you're happy with 1366×768 resolution at the absolute lowest settings (some games provide an Intel-specific setting, which offers quality even below the basic one).
Oh, and funny you should mention that AMD bought ATi. Remember the Intel740? Thought not. Intel bought Real3D and released their GPU in 1998, eight years before AMD bought ATi. They had EIGHT MORE YEARS to develop the (admittedly rubbish at the time) solution into a solid product. When AMD bought ATi, ATi were struggling with their lineup, slowly recovering from the 2000-series debacle with the notably improved 3000 series, and they weren't well entrenched until they released the 4000 series and Evergreen. Integrated GPUs from ATi were already vastly better than Intel's at that time, and those GPUs became excellent without much prior support from AMD. Intel's GPUs continued to lag behind AMD's, and when AMD integrated them into APUs, Intel was outstripped again.
Between 1998 and 2006, Intel had time to improve their GPUs. They failed. They had eight years of opportunities to integrate the CPU and GPU in hardware, even while the GPU resided in the northbridge, but they didn't care about it. Since 2006 they have slowly improved, roughly doubling the performance with each generation, but it is still way behind the curve. Seeing Intel's lack of initiative, I have to call bullshit on this 'Iris'. Maybe Haswell is not going to bring anything new to the table in terms of graphics (aside from increased clocks), and Iris is just a way to counter the lack of performance by doubling the number of GPUs.
As for playing video streams: Intel's CPUs DO NOT use the GPU portion for decoding the stream. The CPU has a dedicated processing unit for this, and although it is impressive in its own right, it is supposed to play large numbers of video streams without breaking a sweat.
As for your last paragraph: as long as Intel keeps trying to stick x86 in everyone's face, they will continue to fail. And it's funny how Intel continuously claims that their target is just ahead of them. When the i740 was released, high-performance GPUs were their future. They failed. Then they said their goal was the best integrated graphics. They failed. Then they were supposed to release Larrabee, which was meant to introduce Intel to the enthusiast GPU market. When that failed, they said Larrabee had been intended for heterogeneous computing all along and that they had never meant it to be a GPU. Now you are saying their goal is the best performance in tablets? Ain't gonna happen. Iris isn't going to convince anyone, either.
Agreed. They are leaps ahead of where they were before. It's no longer a case of "argh, an integrated Intel chipset with an obscure set of digits that'll take a day to track down" and more a case of "that's not so bad, it works OK now".
They're not speed demons, but at least now they are perfectly adequate for the majority of computer users' needs, i.e. using a computer in place of a typewriter and browsing the web a bit.