What exactly is the objective for getting Linux to run on PS4? Is there a benefit?
Ever since fail0verflow first told the Chaos Computer Club that Sony PS4 machines could be persuaded to run Linux, a fair amount of work has gone into replicating the demonstration. The problem for other PS4 hackers: to avoid take-downs and other sueballs from Sony, fail0verflow published only a Linux-to-PS4 port, leaving the rest of …
Back when the PS3 was current, its Cell processor was useful and cost-effective for computation, as it provided more performance than contemporary CPUs (at least for certain HPC tasks) and predated the APIs for programming GPUs.
Now that the PS4 contains a rather ordinary x86 chip and what is, by today's standards, a midrange GPU, it is no longer cost-effective in any way to buy PS4s for computation. The only reason to run Linux on your PS4 is to say you can; there is no benefit to being able to do so, as far as I can tell.
"The only reason to run Linux on your PS4 is to say you can, there is no benefit to being able to do so..."
Sorry to dissent, but being able to use a gaming console as a general-purpose Linux computer has several clear advantages. At the very least, being able to double your kid's bedroom console as a Linux PC saves you from purchasing a dedicated PC, so the kids can e.g. surf the web and do their homework in a safe(-ish) environment. Other uses include media center, graphics and sound editing...
Not everything is HPC! :-)
At the very least, being able to double your kid's bedroom console as a Linux PC saves you from purchasing a dedicated PC, so the kids can e.g. surf the web and do their homework in a safe(-ish) environment
While I agree with your thinking, the fact that it relies on an exploit means it won't be suitable as a proper tool or kids' PC, as the exploit will almost certainly get patched pretty soon.
I'm not sure that "I couldn't do my homework until someone hacks the latest PS4 firmware" will get much sympathy.
Yes, you could just not install firmware updates that block the exploit, but then, assuming Sony are consistent with how they handled firmware updates on the PS3, games and things like Netflix will refuse to work until you update the firmware, turning the machine back into a single-use box again.
If and when Sony block this, it will just fuel the creation of custom firmware which will allow the playing of back-ups. I look forward to the new arms race. IMO if the customer BUYS something, they should be free to mess about with it any way they see fit. Sony is free to ban network access as that part belongs to them.
"Yes, you could just not install firmware updates that block the exploit, but then, assuming Sony are consistent with how they handled firmware updates on the PS3, games and things like Netflix will refuse to work until you update the firmware, turning the machine back into a single-use box again."
It should be worth noting that the article notes this is only confirmed working on a PS4 with firmware version 1.76. Last I checked, the PS4's most up to date firmware is version 3.15, released at the end of January. Now, if they can get this running on Sony's latest firmware, or even make it firmware-agnostic, then I'll be impressed.
Ah, the age-old myth from the bedroom experts that the PS4 has an off-the-shelf GPU.
The PS4 is far removed from PC architecture, dropping all the legacy buses and bridges that constrained PCs for years. Most importantly, the GPU and CPU are on the same die, sharing the same 8GB of insanely quick GDDR5; there is no external bus between the CPU and GPU, unlike on your PC.
This makes PC comparisons worthless and extremely naive.
"Ah, the age-old myth from the bedroom experts that the PS4 has an off-the-shelf GPU."
It IS an OTS system, with some minor adjustments. AMD has been selling their APU CPU/GPU combos for years before the PS4 came along, and GDDR5 was nothing new, either. Thing is, GDDR has a graphics-oriented performance optimization which is why it isn't used with standard DDR memory channels. Both the PS4 and Xbox One are customized to some extent, but neither really use cutting-edge hardware (that still belongs to the realm of gaming PCs) that could be considered novel or revolutionary. Even the PS3's Cell architecture got leapfrogged early into its working life with the GPGPU push culminating with the release of OpenCL.
So it's a slightly non-standard PC then? Maybe a Mac comparison would be better.
Note that CPU/GPU shared-memory setups have been common on low-end PCs for years, although having both on the same die is a recent thing in the x86 world (AMD APUs and now Intel's Haswell playing catch-up). Having only 8GB to satisfy both seems a little low in this day and age.
An 8-core CPU at 2.5GHz with actually quite good floating-point performance (the FPU is significantly upgraded over a standard Jaguar core), plus 1280 compute cores on a 256-bit GDDR5 bus at 5.5GHz, all for around US$399? Not cost-effective for computation? You cannot be serious.
You simply cannot build an equivalently powerful Linux box for twice that much! It is still a very 'useful and cost effective' proposition for computation, as well as a bargain desktop PC if Linux can be installed and if a fully hardware accelerated graphics driver can be written. Those are the big 'ifs' at this stage, but the value proposition here is potentially very large.
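Taking the figures quoted above at face value, a quick back-of-envelope sketch in Python shows the scale of the claim. Note that the GPU core clock is not stated in the comment, so the ~800MHz figure and the two-FLOPs-per-lane FMA counting are my assumptions, not the commenter's:

```python
# Rough peak-throughput estimate from the figures quoted above:
# 1280 GPU compute lanes; a GCN-era core clock of ~0.8 GHz is ASSUMED here,
# as is counting a fused multiply-add as 2 FLOPs per lane per cycle.

def peak_gflops(lanes, clock_ghz, flops_per_cycle=2):
    """Theoretical single-precision peak in GFLOPS."""
    return lanes * clock_ghz * flops_per_cycle

gpu_peak = peak_gflops(1280, 0.8)       # ~2048 GFLOPS theoretical peak
per_dollar = gpu_peak / 399             # using the quoted US$399 price

print(f"GPU peak ~= {gpu_peak:.0f} GFLOPS, ~= {per_dollar:.1f} GFLOPS per dollar")
```

Around 5 theoretical GFLOPS per dollar was hard to match with an off-the-shelf PC build at the time, which is the value proposition being argued here.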
This actually piques my interest from an engineering signal-analysis point of view. The biggest impediment to doing FFTs on CUDA is getting the data in and out of the card fast enough. If this thing has the lot on-chip and you can get data in and out of the box at USB3 speeds, then I can see serious use for this in the engineering community.
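To put rough numbers on that bottleneck, here is a toy Python estimate comparing the time to move a buffer over a host-to-device link against the time to compute its FFT. The 6 GB/s link bandwidth and 1 TFLOPS sustained throughput are illustrative assumptions, not measurements of any real card:

```python
import math

def fft_flops(n):
    # Standard operation-count estimate for a radix-2 complex FFT.
    return 5 * n * math.log2(n)

def transfer_vs_compute(n, link_gb_s=6.0, gpu_gflops=1000.0):
    """Seconds to move n complex64 samples in and out vs. seconds to FFT them."""
    bytes_moved = 2 * n * 8                       # 8 bytes each way per sample
    transfer = bytes_moved / (link_gb_s * 1e9)
    compute = fft_flops(n) / (gpu_gflops * 1e9)
    return transfer, compute

t, c = transfer_vs_compute(1 << 24)               # a 16M-sample FFT
print(f"transfer {t*1e3:.1f} ms vs compute {c*1e3:.1f} ms")
```

Under these assumptions the transfer takes roughly twenty times as long as the FFT itself, which is why keeping CPU, GPU, and memory on one die is attractive for this workload.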
So where are all the hacks for running Linux on your smart TV ? Your fridge ? Your washing machine ? Your [insert any device you "own" with embedded firmware that limits you to using the device for the manufacturers intended purpose which does not include general purpose computing] ?
If this is a question of "principle" and "rights" then these should be out there too, right ? How very dare I be prevented from using my Bendix Twintub to do whatever the hell I want and be forced to use it only for doing laundry!
imho it takes a special kind of person ("special" as in "needs") to buy something and then complain that it doesn't do what something else they could have bought instead can do and set about making that "right".
You want a general purpose computer... ? Go buy one.
You wanted a general purpose computer but bought a games console instead ? See previous suggestion.
"So where are all the hacks for running Linux on your smart TV ? Your fridge ? Your washing machine ? Your [insert any device you "own" with embedded firmware that limits you to using the device for the manufacturers intended purpose which does not include general purpose computing] ?"
The smart TVs are definitely being worked on, the justification being that support gets dropped and the internal software goes obsolete well ahead of its time. Thing is, this software is usually made by security-savvy companies like Samsung and LG, who tend to sign their code and employ lockout mechanisms, so progress is there but very, very slow.
As for other home appliances, that's generally tinker territory so you have to look into real hobbyist boards to learn more about what's happening there.
So where are all the hacks for running Linux on your smart TV ? Your fridge ? Your washing machine ?
Smart TVs, at least some of them and maybe even most of them, already run Linux*. As does any internet-connected fridge. (Don't ask why. I'm still trying to figure out why internet-connected fridges EXIST, let alone why they run Linux). As for washing machines, I've yet to see one that has an actual OS at all.
Your [insert any device you "own" with embedded firmware that limits you to using the device for the manufacturers intended purpose which does not include general purpose computing] ?
I've heard about someone getting Linux onto a programmable coffee maker, but I remain skeptical about that one. I'm fairly certain the toasters are a joke and I know the dead badger is, though it's one that I quite enjoy to this day.
*That's Linux as in "Android is Linux". In other words, just the kernel and no GNU. For that matter there are smart TVs that run a fork of Android.
The hope of piracy for cheapskates, same as PS3 Linux.
Nobody was interested in Linux there. Anyone who used it will know it was a total turd that took ages to boot and had very limited memory to run apps; Firefox was unusable under PS3 Linux.
Its stupidly quick computational abilities were its only use, but no person who bought a PS3 for gaming (and thus set their career destiny to burger flipping) would even know where to begin with that.
"Repurpose it to run Steam?"
A thought, but I think it needs some better specs to be better suited to the job. And I'm not going to continue the joke, as I'm pretty sure a 2015-spec device like this should be readily able to handle a game that is by now eight years old. Perhaps it's time to refresh the joke with something more modern, like perhaps "Caffeine".
What I'd be curious to know is some more meat-and-potatoes stats: for instance, how well such a converted PS4 would handle itself under a 1080p or 4K H.265 encoding load, as a benchmark.
I can see two reasons. First and foremost there's a camp of Linux geeks who want to see Linux running on everything from toasters to dead badgers. If you give those guys a device they'll hack Linux onto it just because they can, whether it makes sense to do so or not.
Second, and more practically, if you can put Linux on a console then you can use it as a PC as well as a gaming console. Which effectively means that you have two devices with widely differing capabilities in one box. Which, I think, is pretty useful.
Exactly, now that you can run Ubuntu binaries on Windows why would you want to run Linux on anything.
I know you're trolling, but I feel like this is a question worth answering anyway.
The ability to run Ubuntu binaries on Windows doesn't have any impact whatsoever on the factors that drove me to choose Linux. The biggest factor is that Windows doesn't give me the level of control over my system that Linux does. No matter how much you tweak it or deactivate services, there are still black boxes running in the background doing who-knows-what and taking up system resources to do it. On Linux I can (and do) know exactly what everything running in the background is doing. If I don't approve of one of those things, or don't feel it's worth the system resources it's taking up, I can get rid of it. It's also really telling that Windows takes up 16GB of hard drive space all by itself, whereas you can easily get a fully functional modern desktop in under 4GB with Linux. Adding Ubuntu compatibility to Windows doesn't change that.
The other big factor is that I usually build my own PCs. Unlike with prebuilt machines, Windows represents an extra cost when you're building one yourself. That really hasn't changed: I still can't legally download Windows for free, and being able to move my current install to a new computer is a huge advantage when upgrading to a new machine with Linux.
Plus I've been running Linux as my primary OS for somewhere over a decade. At this point I'd need a pretty compelling reason to switch back to Windows. Being able to run stuff I already run doesn't cut it for that.
And at the most basic, just as Linux isn't for everyone, neither is Windows. Over the years I've come to the conclusion that Windows is not for me. I can run it proficiently (and have to for work), but for what I do and how I use my computer Linux is just a better choice.
I seem to recall that the PS4 is more or less a standard x64 PC and GPU, but with a bit of AMD's shared memory gubbins or something...
I have read Linux users lambasting AMD on forums for not releasing great graphics drivers, I don't know what the current state of affairs is in this regard.
I don't know what the current state of affairs is in this regard.
It's not, at least not as far as I can tell. Certainly my graphics performance on Mint 17 beats out my wife's on similar (not exactly the same, but pretty close) hardware running Windows 10. If AMD's drivers are in some way sub-par then they still shine by comparison to the train wreck of NVidia's drivers.
Nothing like a PC, not even remotely similar. The only people thinking that are plebs that don't understand hardware architecture designs.
PS4 has significant changes that mean you can't compare the APU to similar components in a PC.
Almost all the comments here are from wannabe experts copy and pasting things they read somewhere else.
GeoHot gave them an excuse, nothing more. His crack never posed the security risks Sony claimed it did. All it did was allow Linux to fully utilize the hardware. Since it did not affect game mode, it couldn't possibly have been used to pirate games. In fact, I'm fairly certain that had they not yanked OtherOS entirely over it, and thus given the Linux hackers a reason to look for exploits, there still wouldn't be pirated games on the PS3.
It's really quite simple. The first link in the chain that leads to pirated games on a console is always an exploit used to install Linux. Don't give the Linux guys a reason to hack the platform and they'll never find an exploit for the less talented w4r3z dud3z.
All we need is a PS4 with Linux on it to run office software, Linux software, and Linux updates to the PS4, and to search and browse online via Linux and download drivers; not just one Linux OS but lots of other variations we desire. We really are bound to computing, to our beloved PS4, and to our soulmate Linux. We have nothing else in this world. Nobody's asking for a pirated game, but lots of emulators for older, forgotten video game systems would be a tribute to the entire world, and scientific software would make our kids at the PS4 bright and welcome to the future world. The truth-bound Linux is open source, and the PS4, once bought, is certainly inside our home. I wish Sony would provide a version too, along with the rest of the world and anybody who's willing to help, with affection for the Linux world and gaming. Why hate one another when we can love one another and hold our hands together?
>Sony also advertised that it had a PS2 emulator, something that was also silently dropped in later iterations.
From what I understand, that was actually hardware emulation: basically a PS2 built into the fat launch PS3s, which is part of why they were so expensive at launch and so fat. Later, cheaper console releases dropped the hardware to reduce cost. There is a whole lot to hold Sony's feet to the fire over through the years (they have invented just about every one of the most draconian DRM systems on media ever), but this one may get a pass, as the market spoke.
You may want to file away that referring to something in terms of male genitalia can be considered a negative in America (as in "That's balls!" when something isn't right). So your description of "Dog's bollocks" was misconstrued as something a dog would urinate on.
The only real issue with gaming on the PC is the lack of titles from certain genres that only see console releases.
My last console was a PS2. I found the PS3 obnoxious, not all that powerful, and the games expensive; I never bothered with another console again after I learnt that you could copy the disks to the console's HD but not play without the disk.
Also on the PC you can mod the games to your heart's content.
"...gaming on actual PCs is still the dog's bollocks."
I'm quite puzzled at the downvotes your comment received. That PC gaming is far better than console gaming in most respects is self evident to anybody who has used both. Would any of the downvoters care to elaborate?
Or perhaps it's just a misunderstanding caused by differences between Leftpondian and Rightpondian dialects.
That's idiotic, who else could have put the equivalent of a mid-range discrete GPU on the same chip as a x86 CPU equivalent to the one in the PS4?
Oh right, no one. And your idea that AMD hasn't done anything interesting since the K6-II is just as stupid. AMD led Intel in performance from the launch of the Athlon until the launch of the Core 2.
Maybe next time you should mention some of the stupid things AMD has done recently, like selling the same "high-end" chip series for 4 years despite it not even being competitive at launch. Or how they've now lost the lead they had on Intel in iGPUs.
I said exciting. Not interesting.
Quite a lot of AMD kit is interesting...until you buy it.
The Phenom was OK... I had one for a short while as a stopgap.
I tend to judge processors on their usability years after launch. For example, I consider the Core 2 Q6600 to still be viable today, even in a gaming machine. There's no AMD CPU from that period that I consider still worthwhile.
Hell my two main machines use older Intel stuff (i5-2400 and Xeon X5690).
The 5690 is 7+ years old and still a beast. The i5 is 3 generations old.
Neither machine is sluggish, and both are very cheap to maintain. I've never been able to do the same with AMD kit.
Hands up if anyone is still running a 7 year old AMD CPU.
"Hands up if anyone is still running a 7 year old AMD CPU."
I use a laptop with an old Athlon 64 X2. It may not win any speed awards, but with the RAM maxed out it gets the job done for office-related work. It can still handle Chromium and LibreOffice with few complaints.
I also have a similar CPU in an HP mini-PC. It's currently in mothballs but I have used it in the past as a TV computer. I think I'll take it back out of mothballs once I check the RAM in it and get a fresh Mint image.
I'm going to share some engineering knowledge about why AMD is vastly superior for consoles and why their hardware was chosen. This is very well known by engineers who work with these things. In very simple English, AMD is the best manufacturer for high-performing, low-level graphics API gaming. When the operating system steps out of the way and drivers are a non-factor, AMD hardware outperforms Nvidia hardware, and Nvidia has a lot of work to do before their hardware will be competitive in this market. AMD is generally thought to have excellent hardware engineers and very bad software engineers, because their drivers are bad on "high-level" APIs but extremely fast on "low-level" APIs. I'll explain what that means here.

Between a graphics card and a game sit two layers of software: (1) the hardware driver, and (2) the graphics API. Drivers are not magic and are something we're mostly familiar with, but the graphics API might be new terminology, so I will briefly explain what it is. An API might be DirectX 10 or OpenGL 3.1; these are different graphics APIs. An API is the final interface between a game's actual code and the driver. Drivers are written to support very specific versions of graphics APIs: while a single driver from a graphics card manufacturer might support DirectX 11 really well, it might also have very poor support for OpenGL.

The story gets more complicated when we factor in the difference between "high-level" and "low-level" APIs. A game written for a high-level API requires the driver to be very intricately programmed. A game written for a low-level API requires the game itself to directly use and manage hardware resources, with the driver reduced to a simple pipeline to the raw hardware. AMD's hardware engineering is superior to Nvidia's under a low-level API, so their hardware gets chosen for gaming consoles.
AMD's software drivers for a high-level platform like Windows DirectX 9, 10, or 11 are inferior to Nvidia's, so they do not outperform Nvidia on high-level APIs. DirectX 12 and Vulkan are low-level APIs, and AMD outperforms Nvidia on those, as expected.
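The driver-overhead argument above can be caricatured with a toy cost model (Python, purely illustrative; the unit costs are invented): a high-level API pays per-call validation on every draw, while a low-level API pays validation once when a command buffer is recorded and then just replays it.

```python
# Toy cost model in arbitrary units; NOT a benchmark of any real driver.
DRAWS = 10_000
VALIDATE = 50      # assumed cost of driver-side state validation
SUBMIT = 1         # assumed cost of pushing one command to the GPU

# High-level API: the driver re-validates state on every draw call.
high_level_cost = DRAWS * (VALIDATE + SUBMIT)

# Low-level API: the application records a command buffer once
# (validation paid up front), then per-draw submission is nearly free.
low_level_cost = VALIDATE + DRAWS * SUBMIT

print(f"high-level: {high_level_cost}, low-level: {low_level_cost}")
```

The model only captures the shape of the argument (fixed cost paid once versus paid per call), which is the same shape DirectX 12 and Vulkan exploit by moving validation to command-buffer recording time.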
Well, here's an interesting thing to consider. Graphics depend a lot on floating-point math, and an interesting trend has been emerging in AMD vs. nVidia regarding that: AMD cards can handle double-precision floating point with only a modest performance penalty, compared to nVidia, where performance is pretty much cut in half. Plus current DX12 benchmarks show AMD's R9 290x performing comparably to the nVidia GTX Titan X, which is about twice the price.
Exactly. Microsoft is brilliant; they are not someone who'd hurt the needs of kids, and they know how to love one another. Aren't you noticing what the Microsoft Xbox One is giving to everybody? Linux is going to be provided, I'm sure of that. I consider good men the men who are divine and allow love to happen.
But these days, it does seem a bit redundant.
Getting Linux running on the PS3 was interesting at the time, as it was pretty powerful for the price in some number-crunching scenarios, thanks to the Cell architecture. But Moore's law had already marched on a fair amount by the time Sony withdrew support for Linux, thanks in no small part to the rise of the GPU as a device for massively parallel processing.
These days, "consumer" hardware is very much a commodity. Android-based USB-powered thumbsticks can be picked up for less than 15 quid - or, if you want to build something for scientific purposes, for the same price as a PS4, you could pick up ten Raspberry Pis and slap them together into a cluster.
Or you could nip onto eBay and pick up an OEM small-form-factor PC; at a glance, there are plenty of multi-core, 3GHz machines with 8GB of RAM available for less than a third of the price of a PS4[*]
And with all of the above, you don't have to worry about the functionality vanishing if/when Sony patches the exploit.
It's still an interesting experiment, but it's definitely of limited use in the real world!
[*] This is exactly what I did a while ago; said box fits comfortably under the TV and does a good job of running Windows 10 with Kodi, Steam, iTunes and a few other bits and pieces. Plus, it's all controllable from my phone - including the TV itself!
But what about something performance-intensive, like a game or perhaps media encoding at high-def (1080+) or latest codecs (HEVC)? Most of those tiny PCs use Intel Atom-class CPUs that are known to be a bit skimpy on the power (IOW, it may have to answer, "But can it run Crysis?" in the negative) whereas we know the APU in the PS4 has to be able to crank out SOME level of performance in order to play games like Fallout 4. I'd like to see some comparisons about its number-crunching performance.
Performance-intensive stuff: most of this comes down to the GPU these days. The Pi itself is a key example of this; the fairly underpowered ARM chip (at least in the original iteration) relied heavily on the Broadcom GPU.
A fairly quick glance online shows the PS4 GPU to be roughly equivalent to a Radeon 7850 (http://wccftech.com/playstation-4-vs-xbox-one-vs-pc-ultimate-gpu-benchmark/). These look to be available for around 75 quid online, and come with 2GB of dedicated RAM.
Admittedly, there's something of an apples/oranges comparison here, since I'm looking at second-hand prices. Then too, the PS4's custom-tuned architecture may well have some speed advantages - though conversely, GPU performance under linux is still generally behind that of Windows, and that's even assuming a hack like this is able to get access to all the hardware, and that drivers are available to take advantage of it.
Still, for around £150, you can get a quad-core machine with 8GB of RAM, 2GB of dedicated GPU RAM, and a GPU equivalent to the PS4's. And generally, that'll include a Windows 7 licence which can be upgraded to Windows 10, or junked and replaced with Linux.
And then you can spend the rest in the pub ;)
Media encoding was, is, and will for some time remain a CPU- rather than GPU-intensive job, due to the potentially divergent process of motion estimation as well as the need for good memory throughput. So I'm looking for something with a good CPU at a decent price, and last I checked, none of the mini-PCs on offer have an octo-core CPU, let alone one with the decent floating-point performance needed to do media work at any proper speed.
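Motion estimation is the branchy, data-dependent part being referred to: for every block, the encoder searches many candidate positions in a reference frame and keeps the one with the lowest error. A minimal one-dimensional Python sketch of that search (real encoders use 2-D blocks and much smarter search patterns than this exhaustive scan):

```python
def sad(a, b):
    # Sum of absolute differences: the usual block-matching error metric.
    return sum(abs(x - y) for x, y in zip(a, b))

def best_offset(block, ref_row, search_range):
    """Exhaustively search candidate offsets; smallest SAD wins."""
    best = None
    for off in range(search_range):
        candidate = ref_row[off:off + len(block)]
        if len(candidate) < len(block):
            break                         # ran off the end of the reference
        cost = sad(block, candidate)
        if best is None or cost < best[1]:
            best = (off, cost)
    return best

block = [10, 20, 30, 40]
ref   = [0, 0, 0, 10, 20, 30, 40, 0]
print(best_offset(block, ref, 5))         # exact match at offset 3, cost 0
```

The early exits and data-dependent comparisons are exactly what GPUs handle poorly and CPUs handle well, which is why the search stays on the CPU.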
Why are you all trying to justify why you would want to run linux on a PS4? It's just throwing up straw men for the critics to burn down.
Why would we do this? Because we're geeks, it's fun, and it's what we do. That's the only explanation needed, and it's the most important explanation in the world.
Oh and to the guy who commented on gamers ending up flipping burgers, we bought my son an Xbox 360 in 2010 and he hardly came out of his bedroom for 5 years. Now he's 18 and he's got an apprenticeship with a major UK pharmaceutical company. He's pulling down £15k for working in their IT dept, stroking the mainframe, which is good because he would be hopeless at flipping burgers.
W00t. Finally I have a reason to get one, if it checks out.
But it's a bit late in the product cycle... so I need to check the competition carefully.
How does the compute capacity of a PS4 compare to modern 1RU servers? Mac minis? In its own rack it may be quite effective for running KVM, etc. Are Sony still selling them below cost? (NOT!)