
And I feel slightly sad if this IS indicating...
that we are moving from creation devices to consumption devices. Woe betide the users of the future.
PCs, once the final destination for almost all of the world's memory chips, now consume less than half of the world's DRAM shipments. During Q2, only 49 per cent of DRAM shipments ended up in desktop and laptop PCs, down from 50.2 per cent in Q1, market watcher IHS iSuppli said this weekend. This at a time when new machines …
Nothing is changing, it's just that many PCs in the past were [mis]used as consumption devices, and vast quantities of silicon and power were meaninglessly wasted in the pursuit of trivia.
Now at least the mindless consumption is a bit more efficient.
Why?
In the past people bought devices capable of creation with all the overhead that they didn't need.
Now there are products that suit them better. As far as I can see, no creation device has disappeared, so for those who want to create, nothing has changed.
Why is it so worrying that additional, more appropriate devices are finding favour?
Presumably you'd have us hark back to the "You can have any colour so long as it's black" days.
Posted from a creation-centric device by the way....
Depends on what you are creating... a multi-point touch interface is better for some creation tasks than a mouse and keyboard. Writers will continue with a keyboard, but musicians already find the iPad useful for emulating an audio mixer, or for displaying sheet music.
There is no reason why a mature version of, for example, Photoshop for touch-screen devices couldn't work. CAD too, in the workshop and on site, entering measurements into your model as you go, or using the tablet instead of a printed drawing. Video editing, likewise.
The segment is still in its infancy, and we still have an arbitrary idea that touchscreen=ARM and mouse=x86... a distinction that is being rapidly eroded from both sides (add a keyboard to an ever more powerful ARM tablet, add a touchscreen to an x86 PC).
I would like to see more tablet-as-content-creator applications... I'm disappointed that no-one has really pushed the idea of using a tablet as an extension of the desktop PC's interface: definitely more ergonomic than reaching out to poke your main monitor.
Nice to see the 'death of the PC' mistake wasn't made here. Yes, PCs use more chips per device than the tablet/phone competition. We just don't replace PCs very often; they've been more than powerful enough for almost all tasks for 5+ years, so the only reason most need replacing is actual catastrophic failure.
DDR3 in particular hasn't been around long enough to be in many PCs ready for replacement.
...and the RAM in PCs also comes in convenient DIMMs you just move over to the next machine/mboard, further reducing demand for new RAM (albeit not by much I'd guess - most probably ends up landfilled).
Your PC probably doesn't need more than 4GB. This amount has been a sensible size to handle just about everything you are likely to throw at it for some time. You can stuff in more, but it's unlikely to make any perceptible difference for general use.
Meanwhile, the amount of memory being crammed into servers is heading into TB territory.
The "consumption devices" tend to be heavier on flash memory than traditional DRAM.
Yeah, much as I hate to say something akin to "640KB should be enough for anyone", unless they're doing heavy audio or video work, the typical home user is not really going to need instant random access to more than 32 billion bits of information until there's a huge change in the way we use computers (à la our Microsoft holodeck friend's patent in another article...)
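For what it's worth, the "32 billion bits" figure checks out as a back-of-the-envelope sum if you take decimal gigabytes; the binary (GiB) figure comes out a bit higher. A quick sketch:

```python
# Sanity-checking "32 billion bits" as a description of 4 GB of RAM.
# Decimal interpretation: 4 × 10^9 bytes; binary interpretation: 4 × 2^30 bytes.
decimal_bits = 4 * 10**9 * 8   # 4 GB  × 8 bits/byte
binary_bits = 4 * 2**30 * 8    # 4 GiB × 8 bits/byte

print(decimal_bits)  # 32000000000  -> "32 billion bits"
print(binary_bits)   # 34359738368  -> closer to 34 billion
```

So the commenter's number assumes decimal gigabytes, which is fair enough for a rhetorical flourish.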
I think most devices are heavier on non-volatile storage than volatile. In fact the RAM-to-storage ratio is usually much lower for "consumption devices" than "creation devices", which is why they use flash memory rather than (say) magnetic disks.
When we got to 4GB memory growth in desktop PCs pretty much stopped, although a few boards support more.
The thing is, the OS makers never set a definite 64-bit goal and dropped support for 32-bit.
Consequently, developers for the most part never made the move to 64 bit, comfortable in the little 4GB world where they could feel accomplished making use of *all* that memory.
I have a nasty feeling that as a consequence we are living in a stunted world of software that would be very different, if only the OS guys had mandated 64-bit desktops instead of leaving them to a relative few heavy media types.
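The "little 4GB world" comes straight from pointer width: a 32-bit address can distinguish at most 2^32 bytes, so a 32-bit process simply cannot see more than 4 GiB of address space (in practice less, once the OS reserves its share). A quick sketch of the arithmetic:

```python
# Why 32-bit software is stuck at 4GB: a 32-bit pointer addresses 2^32 bytes.
address_space_bytes = 2**32

print(address_space_bytes)          # 4294967296 bytes
print(address_space_bytes / 2**30)  # 4.0 GiB

# A 64-bit pointer, by contrast, addresses 2^64 bytes -- 16 EiB on paper,
# far beyond anything a desktop will hold for the foreseeable future.
print(2**64 / 2**60)                # 16.0 EiB
```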
And of course, with no reference, we will never know for sure that this isn't true.
But if you extrapolated the growth of PC memory size before we got to 4GB (probably exponential growth too!), I wonder where we should be now?
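For the sake of argument, here's that extrapolation sketched out. Both the starting point (4GB as the norm around 2007) and the doubling period (every two years or so, roughly tracking the historical trend) are my assumptions, not figures from the article:

```python
# Hypothetical extrapolation: if typical desktop RAM had kept doubling
# every two years after hitting 4GB around 2007, where "should" it be?
# (Start year, start size, and doubling period are all assumed figures.)
base_year, base_gb, doubling_years = 2007, 4, 2

for year in range(2007, 2013):
    gb = base_gb * 2 ** ((year - base_year) / doubling_years)
    print(year, round(gb, 1))
```

On those assumptions you'd expect 2012 desktops to carry 20-odd GB as standard, rather than the 4GB that actually plateaued.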
Your point seems to be that software could be much better if it used more RAM, but that the software that needs more RAM is already 64-bit. Umm... I don't quite understand what is being stunted here. I must be misunderstanding you.
I think it's good that developers aren't trying to use all my memory; isn't that the definition of 'bloatware'?
I've got 4GB in my desktop, and it's a Socket 775 Core 2, so it's been around for some time, and I still have little real need of upgrading.
While the enterprise is doing stupid things with server virtualisation, desktops are found to be adequate because (shhh!) not many people do much with them.
Still, it will be 8GB next time around for me.