... if this late switch was part of the cause of the infamous ring of death.
One of the developers of the original Xbox has apologised for jilting AMD at the altar ahead of the games console's launch 20 years ago. Seamus Blackley took to Twitter where he alleged that a phone call from then-Intel CEO Andy Grove to Bill Gates ushered in a change in CPU choice. His thread revealed that the decision was made …
As other people have said, the RROD was specific to the Xbox 360 - and that machine also used an IBM triple-core PowerPC CPU to boot, rather than anything from either Intel or AMD.
I'm honestly not sure what difference it would have made if the original Xbox had retained the AMD chip.
At a glance, it looks like the plan was to use AMD's budget CPU - aka an 800MHz Duron.
So it would have been faster (both in terms of architecture and clock speed) than the 700MHz Celeron which was eventually used, but it would also have run hotter.
So swapping the Duron out for a Celeron may have actually improved the reliability of the OG Xbox; with the Duron, we could have ended up with the RROD appearing a generation earlier!
I guess the big question is: would that extra processing power have been put to good use? The OG Xbox was already more powerful than the Gamecube or PS2, and we were still mostly using analog TVs with SCART or component out; a few people may have plugged their console into a VGA monitor, but HDMI wasn't released until late 2002.
So arguably, it wouldn't have made much of a difference from a gaming perspective, especially for third-party cross-platform titles which would have primarily targeted the PS2. Though I do wonder what it might have meant for first-party titles like Halo, since that was definitely Microsoft's killer-app.
That said, the OG Xbox did become something of a media appliance in its later years, once people figured out how to mod it and get things like XBMC (still going strong as Kodi, 20 years later) and arcade emulators running on it. And there were quite a few people who tinkered with upgrading it to a 1GHz Pentium III or similar, since this made it capable of playing HD media...
So it would have been great from a hobbyist perspective, though whether Microsoft would have viewed this as a good thing or not is debatable!
Beer mat time!
Microsoft managed to shift about 25 million OG Xboxes over 4 years, and at retail in 2001, a Duron 900 was around $115.
As such, and given the razor-thin margins for console hardware... let's say $80 for the CPU and ~6 million units per year.
Which comes to around $480 million in revenue per year. And at a glance, AMD generally earned around $4 billion in revenue at the time, so it would definitely have been a nice lump to add to the bottom line.
But with the aforementioned razor-thin margins, I don't know if it would have given them much more cash for R&D - even if they earned $10 profit per chip, that's still only around $60 million per year.
Which I'd love to find down the back of the sofa, but which doesn't go too far in the chip-fabrication industry!
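For anyone who wants to check the beer mat, the arithmetic above can be sketched like so (every figure is the rough guess from this comment - the $80 bulk price and $10-per-chip margin are assumptions, not real financials):

```python
# Beer-mat estimate: hypothetical AMD Duron revenue from the OG Xbox.
# All inputs are guesses from the comment above, not actual figures.
units_per_year = 6_000_000     # ~25 million consoles over ~4 years
bulk_price_per_cpu = 80        # assumed bulk price, down from ~$115 retail
profit_per_cpu = 10            # generously assumed margin per chip

revenue_per_year = units_per_year * bulk_price_per_cpu   # $480 million
profit_per_year = units_per_year * profit_per_cpu        # $60 million

print(f"Revenue: ${revenue_per_year:,}/yr, profit: ${profit_per_year:,}/yr")
```

Which bears out the point: even on generous assumptions, the profit is pocket change by fab standards.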
OTOH, there would have been a lot of "soft" benefits from being in a Microsoft-branded machine, and it could have opened the door for a lot more design wins from Dell, Compaq and the like.
Alas, the OG Xbox was internally controversial at Microsoft for several reasons - and over budget to boot - and Intel decided to play hardball by calling Bill Gates and offering significantly cheaper CPUs.
And the rest, as they say, is history...
No. I remember a box from LSI called the Octopus. It would run 8 and 16-bit CP/M executables both on the main console and on as many dumb terminals as you had serial ports for. Oh, and you could switch virtual consoles on the main console too. There was, eventually, a PC "compatible" add-on card but we saw that as a huge downgrade since it turned a multi-user computer into a single-user device. It'll never catch on :-)
A Z-80 for the software you know and which works, and an 8088 so you can say you're up to date.
Main problem, apart from the price, was that the 8088 hardware wasn't PC-compatible: so while it ran MS-DOS, programs had to go through the OS for everything rather than hitting the hardware directly.
I love 0:46:
"As I said, the design here was driven by spending time with gamers, and actually putting the control in their hands. We tried out over 100 different form factors - y'know to find what was the most comfortable, and what would give them the best gameplay."
It boggles the mind how godawful the 100+ other controller prototypes must have been for the final monstrosity they selected to have been the best of the bunch.
I just realized he didn't say they tried them out *before* selecting the monstrous disaster that shipped. My 'weasel words' filter must have been glitching.
What he was probably saying out the other side of his mouth was that they found out too late, so they hurriedly put a bunch of razorwire-wrapped joysticks in their hands so that they could spin it as having been tested and approved by gamers.
Or maybe they gave them to a bunch of infants and selected the ones that they chewed on the longest.
Those are far more plausible scenarios than believing that a gamer would have selected that over 100 other designs.
> It boggles the mind how godawful the 100+ other controller prototypes must have been for the final monstrosity they selected to have been the best of the bunch.
It looks like Microsoft fell victim to two things when designing the original Fatty controller.
The first was that the designers were perhaps a bit too giddy about the creation of a new console, and borrowed lots of features from other controllers, including the "dual cartridge slot" mechanism from the Dreamcast.
Which to my mind was a daft idea - where Sega used their controller slots for rumble packs and the overly ambitious mini-Gameboy memory cards, the Xbox controller already had vibration capabilities, so the only thing the slots ever got used for was memory cards. Which themselves weren't particularly popular, since you could save games to the internal HDD.
The second problem was that for whatever reason, Microsoft wasn't able to get their supplier to provide them with high-tech circuit boards. Which put further constraints on how small they could make the controllers.
So, yeah. Given the design and technology constraints, it probably was the best option ;)
Personally, I never had an issue with the size of the original controllers - though I do remember the various digs Penny Arcade made, including the comic where one of the characters is all but invisible as he sits under a coffee-table sized controller...
There's a reason most console controllers now look something like the original Playstation controller (itself an updated SNES controller):
"In development, we simulated every possible joypad situation. We imagined what it would be like to have to continually put the pad down while mapping a game, or playing while lying on the floor, and many other cases. After that we had to decide on the weight of the buttons and the pad itself. We adjusted the weights one gram at a time and eventually we found the correct balance. We probably spent as much time on the joypad's development as we did on the body of the machine."
I think at this point, we've got two main evolutionary paths.
In one corner, we have Nintendo, who took the distinctly "functional" NES controller and rounded it off to create the SNES controller. Sony took that, and added wings to make the original PSX controller, after which it's evolved to include analog sticks, vibration motors and much more besides, such as the motion controls, touchpad, speakers, lightbar, microphone, variable-resistance triggers, etc.
Then we have Sega, who started with something similar to the NES controller on their Master System, which then morphed into the original three-button Megadrive/Genesis controller, before mutating into the six-button version (as needed for Street Fighter II, the big killer-app of 90s arcade and console gaming). And from there, it got tweaked for the Saturn, before Sega decided to start dabbling with the newly rediscovered concept of analog controllers with the massive UFO-shaped controller which they released for Nights.
And it's from that controller which the Dreamcast controller was derived, and in turn, the Dreamcast controller was the key inspiration for the OG Xbox controller, which then slowly evolved all the way up to the current Xbox Series X.
In the meantime, and perhaps oddly, Nintendo has done what it always does, and gone off to one side with its own unique interpretations; the SNES controller was effectively completely discarded in favour of the "trident" controller for the N64, only for the Gamecube to bring in a mutant hybrid of the two designs.
Things then got a bit weirder with the Wii - the Wii Remote was little more than a wand with NES-style buttons stuck on one side, which made it distinctly unergonomic as a controller. And their standard Switch controllers are pretty much a straight clone from their Gameboy/DS systems.
Though interestingly/ironically (for given values thereof), Nintendo's "Pro" controllers for the Switch and Wii U more closely resemble Microsoft's controller designs than Sony's.
For me, I think Sony has gone down a bit of an unwise path with their controller design. A lot of the newer stuff built into it was a reaction to the wild success of Nintendo's "motion control" stuff on the Wii, but once the novelty of waving stuff around wore off, few if any players retained any interest in it. E.g. Sixaxis flopped miserably, and the number of games which make effective use of the touchpad as anything other than a Big Red Button is pretty small.
As a result, much of the tech in the controller is hideously underutilised and is arguably just a huge waste of resources on all levels...
I actually love the OG Xbox Duke. I have fairly large hands, and the chunky controller sits really, really well in them. I also have the Controller S, and while its connection to the 360's controller is obvious, I still prefer to use the Duke.
Also, the story of the Duke is that the board manufacturer screwed up, so they had to design a case big enough to package the huge board.
Machiavellian and Microsoft? Yup. Too many people today (not here) have forgotten BillG's old ways because he retired from Microsoft, gives to charity, and participates in Reddit's secret santa, but his actions definitely earned him his original reputation as an exceptionally ruthless cutthroat in a field full of ruthless cutthroats.
The image rehabilitation he's had over the last 20 years is unfortunate.
The millions of people whose lives he has saved from Malaria, HIV and other horrible diseases may beg to differ. How many millions of people you've never met are alive today as a direct result of your actions?
Yes, Bill Gates is a dick, but show me anyone who has built an era-defining company who is not ruthless? Of course, he still uses his money to wield power and influence (and you wouldn't?), and he may even cause harm. But these negatives are vastly outweighed by his philanthropy and the millions of lives he has saved.
I wouldn't worry; AMD also plays last-minute games, and has been known to cut developer support without warning, making it difficult to get your product running well on their hardware. And when their users notice even marginally sub-Intel performance, they kick off - and that's when AMD's PR publicly dumps on your company and your render coder. So classy, AMD.
Microsoft actually got punished for this; one of the major exploits used to access the Xbox bootloader depends on the memory-address wraparound behaviour of the Intel CPU, which the AMD chip didn't share. If they had stuck with the AMD chip, it's likely that the Xbox wouldn't have been hacked so early on.
Biting the hand that feeds IT © 1998–2021