I just love the Xbox 360
I really do. I've given up on the PlayStation 3, as it's really just no good.
I never liked Blu-ray anyway.
I think the 360 should stop getting such a hard time. Let's get together and have a nice picnic.
The Xbox 360's infamous Red Ring of Death problem was the price Microsoft paid for attempting to save money by designing its own graphics chip for the console rather than buying one from a specialist supplier, it has been claimed. Speaking at a chip conference in California, Gartner chief analyst Bryan Lewis said the software …
As I understood it, ATI designed the graphics chip but didn't manufacture it; they gave the design to MS, who had to get it fabbed themselves. This was always the plan: with the original Xbox, Nvidia manufactured the graphics chip and chipset and refused to give MS a price cut, so this time MS cut the graphics company out of the loop to avoid being shafted again. Presumably, though, they didn't have ATI's expertise in the long-term effects of high temperatures on system components and have had to go back to them to get the problems fixed.
It's hardly surprising the graphics chip is responsible for the failures: all modern GPUs pump out large amounts of heat, and anyone who has dismantled their 360 can tell you the GPU's heatsink is tiny - it's hidden under the DVD drive and attached to the main CPU's heatsink. No wonder it can't dissipate heat efficiently in that setup.
Wasn't the whole point of consoles that they didn't need upgrading or replacing? Okay, maybe you need a new peripheral to play a racing game, or perhaps a larger storage card for all those saved games. But overall, no change. At this rate you'll end up with Xbox owners in the classic PC upgrade scenario. OS upgrades are bad enough, but hardware changes as well?!
Yet more reason why I'm strictly a PC man now.
So why buy hardware from them? (A mouse is a lot simpler than a 21st-century console.)
For more techno detail see: http://www.ps3news.com/X-Box360/the-truth-about-xbox-360-hardware-failures/
@Mark - you should go on Jerry Springer. I saw a bloke marry a horse once. You could marry your Xbox.
The whole point of consoles is that you can play in the comfort of a massive sofa with mates and a big-screen TV.
<rant>You only need to buy a new version of the hardware if you bought something that was designed badly and has an inherent fault. If you really want to, of course, you could buy one again after a few revisions so that you can have a slimmer case, lower power consumption, etc. The key fact is that a game made for the 360 or the PS3 will work on any revision throughout the life of the console, which is not the case with the myriad of options (constantly being replaced by manufacturers) in a PC build.
Although there have been several versions of the PS3, you won't need to buy another version (in theory) for up to 10 years. My guess is that you'll have upgraded your PC significantly by then and probably replaced it - which is fine if you can afford to do that and don't mind doing it every couple of years.
I've not yet made the jump to a next/current-gen console and am still using an original PS2 - the first revision - which makes it almost 8 years old, and it still plays all the new games coming out for that platform. 8 years later. Can the same be said for a Pentium 3 at 766MHz with an Nvidia GeForce 2 or Radeon DDR (DX7!) and Windows Me? No?
Kinda makes 300 quid seem like a good deal.</rant>
Software firms should stick to software.
Hardware firms should stick to hardware (plus firmware / drivers - done by SW dept).
And there should never be any crossover.
It's like buying an ATI OS.
@Christian Clark, who customised it?
Okay, I didn't read the link (I have work to pretend to be doing), so the answer may already be there. Apologies if it is.
The article makes it sound like Microsoft designed the graphics core in-house and had it fabbed somewhere cheap... and then when that didn't work they went to ATI and bought a design? Wouldn't that make the two generations of chipset incompatible?
As one of the posters above mentioned, Microsoft licensed a design (which is pretty common) and then got a cheap vendor to produce the silicon - which is a world apart from Microsoft cracking out the VHDL and whipping up a GPU.
Steps to repeat:
1. Turn on Xbox 360 Elite
Result: "wrrryingyingyingyingyingyingyingyingyingyingying..."
2. Turn off Xbox 360 Elite
Result: "yingyingpoeoww...." (then silence)
3. Turn on PS3
Result: (sound of string orchestra tuning up, then silence)
4. Turn off PS3
Result: Silence.
Conclusion: For that reason alone, I opt for the PS3 version of any title if I can. Maybe I'm getting old?
Totally agree. God forbid they ever got together to build something where the hardware and the software were designed together, from the ground up, as an integrated package. (Yeah, I know. Pigs fly, hell freezes over, etc.)
What would happen to all the support line people without the ability to pass the buck and blame it on the other company? They might actually have to solve the problem! Oh, the HORROR!!
doh!
>>For more techno detail see: http://www.ps3news.com/X-Box360/the-truth-about-xbox-360-hardware-failures/
It may well be true, but www.PS3news.com (note that the 'PS3' bit in there is significant) is hardly likely to be objective or positive about the Xbox now, is it? LOL
ATI designed the GPU in the Xbox 360 - to Microsoft's specifications.
What seems to have happened is that ATI did the basic design, with MS providing technical specs for what they wanted. MS may have taken that ATI design and added or modified something at the fringe, but does anyone seriously believe that MS modified the ATI GPU core? One thing that comes to mind is the inclusion of the so-called embedded RAM (eDRAM). It's not so much embedded as it is on the same daughter card; it's not part of the GPU core. Was this one of the key revisions to the design that Microsoft requested? Did Microsoft, in their arrogance, further tweak the design before manufacture?
Well, in any case, the damage is/was done, whatever causes the heat within the GPU package on the 360.
So now MS go back to ATI and ask for help to fix the heat and move to 65nm. Anyone want to guess whether that eDRAM makes it onto the same die as the GPU core this time? Perhaps with some ever-so-slightly redesigned custom gate work? I sure hope so; otherwise those Jaspers are going to be hot as well.
Going back to the story, I'd have to guess that this was indeed the order of events: ATI provides a core to MS, MS tacks on some extras, and the GPU core ends up too large to include the eDRAM on-die and still be feasible to build, so a daughter-card arrangement is hastily created. Heat problems are not seen in initial tests, which are performed in nice controlled environments with good airflow and air conditioning. Eventually, real-world use by consumers shows that in anything less than ideal conditions, or with extended play of games that push the video hardware, heat is a problem. Cue Microsoft going back to ATI with a tale of woe and asking for help; cue Jasper and a die shrink.
Unanswered questions that need answers: How many of the Xbox 360s that Microsoft claims have been sold are replacements for systems that failed? Do they count the refurbs that are sent to consumers when their original system fails? Why does Microsoft think it can do hardware? When will people learn that Microsoft has high ambitions but always delivers buggy versions that require at least one major revision to work properly? What in the GPU package is causing the meltdown? Who designed that? Who tested it? Is it really fixed in Jasper, or are we just kidding ourselves again?
I have two 5-year-old gaming notebooks, both with identical ATI chips (Radeon 9000), and both have had their graphics cards replaced in the last 18 months after they simply blew up. Heat could well be a factor, as the buildup is considerable; the design uses the keyboard baseplate to dissipate heat. The new cards seem to have a different type of thermal pad on top of the GPU.
I'd love to hear why "in theory" you'll not need a new PS3 for 10 years... I'm sure it's got a good graphics chip and CPU, but 10 years ago I had a state-of-the-art Voodoo 2 setup in SLI configuration, with a whopping 8MB of video RAM, that was able to play any game at full resolution, frame rate, etc... so I'm guessing technological advances in the next 10 years may well render the current PS3 worth nothing more than a doorstop (which, incidentally, it does resemble rather well!)
You are clearly a very astute man. I mean, I only ever make technology purchases based on how they will affect my ambient room noise.
Do you apply this logic to your love life?
For example, say Jessica Alba, Kate Moss, (insert your ideal woman here) was a noisy minx in the bedroom, but David Blunkett sang you some soothing tunes.
Does that mean you'd pick him?