Torvalds
" I despise widescreen displays" yeahmmm good at linux, not so much on a logical display.
Linux Lord Linus Torvalds is thinking about making Google's Chromebook Pixel his main computer – once he installs a proper Linux distribution on the machine, that is. Posting on Google+, Torvalds lauded Google's newest creation, writing "... the screen really is that nice" [his emphasis] and that "I think I can lug around this …
Other than watching video (although I can ignore black bars), in what other respects are widescreen displays superior or more logical?
Gaming? Well, my old 20in Samsung running 1600x1200 is much better than my old 21in Samsung running 1680x1050 for gaming, although my preference was for 1920x1200. Display aspect matters to me when it comes to gaming, and the only reason I don't hate my new monitor is that 1440 vertical pixels is quite a lot.
Office tasks? Displays with proper depth are much preferable to widescreen for office tasks. The only advantage of widescreen is multiple windows open side-by-side, and that's hard to do on 1366x768, wouldn't you say? One of my greatest joys when I was programming was to turn my Sammy into portrait 1200x1600 to review the code. Perfect.
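As a very rough sketch of why the vertical pixels matter for code (the 16-pixel line height and the allowance for toolbars are just guesses, nothing official):

def visible_code_lines(vertical_px, line_height_px=16, chrome_px=120):
    # Rough count of code lines visible at once, after subtracting a
    # guess for menu bars, tabs and the status bar (chrome_px).
    return (vertical_px - chrome_px) // line_height_px

for label, v in [("1366x768 widescreen", 768),
                 ("1600x1200 4:3", 1200),
                 ("1200x1600 portrait", 1600)]:
    print(label, "~", visible_code_lines(v), "lines visible at once")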
16:9 was forced on us by the screen and device manufacturers because it was cheaper to produce and they were/are in a race to the bottom (both in price and in quality IMO), not because it was better or because customers liked it. Customers liked it because it was cheaper. Cheaper > Better when the masses are buying.
So why was 16:9 cheaper than 4:3? Apart from TV panel production helping things along, I believe it was because panel manufacturers could get more 16:9 panels out of the process than the other ratios. 16:9 panels fit better on the substrate and resulted in less waste around the edges. Therefore more 16:9 panels could be made, they were cheaper than 16:10 and 4:3, and therefore more price-conscious consumers bought them.
To reiterate: they were cheaper and not better, it was economics and not technology.
Icon: Paris thinks size matters as well.
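One back-of-the-envelope way to see the glass economics (the 22-inch diagonal is just an example, and this ignores cutting losses and yields): for the same advertised diagonal, a wider panel simply contains less glass.

import math

def panel_area(diagonal_in, aspect):
    # Area in square inches of a panel with the given diagonal
    # and width:height aspect ratio.
    h = diagonal_in / math.sqrt(1 + aspect ** 2)
    return aspect * h * h

for name, ar in [("4:3", 4 / 3), ("16:10", 1.6), ("16:9", 16 / 9)]:
    print('22"', name, round(panel_area(22, ar)), "sq in")
# roughly 232, 218 and 207 sq in: about 11 per cent less glass per
# "22-inch" monitor going from 4:3 to 16:9, before you even consider
# how the rectangles tessellate on the mother glass.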
"Other than watching video (although I can ignore black bars), in what other respects are widescreen displays superior or more logical?"
Umm ... anything where width would be an advantage..?
I do a lot of photo processing (3:2 DSLR images, typically in landscape), and the extra width over a 4:3 display is useful - it matches the native aspect of the image and allows room for the application's toolbars and palettes (even when the image is in portrait, as the proliferation of palettes sits out of the way to the side).
I also do application development - both Microsoft SQL Server Management Studio and Visual Studio have palettes that sit to the side, and encroach by an annoying amount on the area of the screen where the code goes (yes, I could minimise them, but I use them enough for that to be annoying) - on a 19" panel, this is not a pleasant experience. I typically need multiple windows open, so I can't span the program across my pair of monitors.
For Microsoft Office & web browsing, I can't see either being better than the other, though.
Personally, I have a 20" widescreen monitor at home, and a pair of 19" 4:3 monitor at work, so the above is a direct comparison of my usage of identical applications in both environments.
Look at most computer applications and operating systems, they have toolbars top and bottom. So if anything, a computer display should be taller not wider so as to maximise the working space on screen.
Obviously there are plenty of exceptions. IDEs often have panels on the left and right, but they still have toolbars too.
Dunno, not needing a microscope to see your screen is a good one. I'm sure you can get lots of pixels in a few microns if you're that dumb.
Some of us work at a desk and play.. games.. the last one being a novel idea to Linux kernel people I know. I'm sure Valve are super-chuffed by his machinations for one.
I'm with you streaky, have an upvote. What the hell is with 6 downvotes on streaky's post? He's absolutely right: many of us use desktops for work and play, and while I do have a Sammy Slate and it's very useful, I also find my desktop PC just as useful. And there's obviously a market still there, otherwise they wouldn't still be making motherboards, graphics cards and hard drives.
I really don't get this "tablets are the ONLY thing now, PCs are so dead" craze. You'd think owning a desktop is like smoking in public, the way people are carrying on! Tablets are great, yes - as an adjunct to the desktop workstation. But they don't replace the workstation. Try using Photoshop or Cinema 4D on a tablet sometime. Or playing games. Sometimes you just need a keyboard, mouse and a big fat monitor in front of you, and that isn't going to change.
I think you need to check again; there are already some rebuttals. Streaky will do fine with a 10 x 6 pixel display, it seems. You and he do not understand interpolation and the other ways of making the image sharper. Someone has explained it already.
"Some of us work at a desk and play.. games.. the last one being a novel idea to Linux kernel people I know."
Eh? What has the kernel got to do with display resolution? (Aside from general issues like scheduling, multi-threading etc., for which the Linux kernel is already perfectly fine for gaming.)
Morons like streaky are why screen resolutions haven't increased in 10 years.
The point of having so many pixels is so you do not see individual pixels. That's what Apple was advertising with the Retina branding, and what Google is now calling the Chromebook Pixel.
I don't want to see pixels. I've seen plenty of pixels. I want the pixels to be so small that I see smooth fonts and sharp pictures. That's the point of having such sharp screens.
.. and then people go and play classic arcade games..
There's a Nyquist element to it; the lower the resolution, the lower the maximum spatial frequency an image can contain - in layman's terms, lower density means less fine detail. You can antialias so that the pixels aren't obvious, but there's a physical limit to the amount of information you can present. When you step up to a display that includes all that extra information you probably still can't see the individual pixels, but you can tell that edges are sharper and more lifelike, on text, on images and everywhere else, and you can then perceive a certain lack of sharp focus when you go back to the old display.
So it's really nothing to do with whether you can see the individual pixels or not, it's about how much information (in the digital signal processing sense) can be packed into an area and therefore how close an approximation a screen can be to actual printed text.
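To put rough numbers on that, here's a small sketch; the 20-inch viewing distance and the ~30 cycles/degree figure usually quoted for 20/20 acuity are assumptions, while 239 ppi is the Pixel's published spec.

import math

def pixels_per_degree(ppi, viewing_distance_in):
    # Pixels subtended by one degree of visual angle at this distance.
    return ppi * viewing_distance_in * math.tan(math.radians(1))

ACUITY_CPD = 30  # often-quoted limit of 20/20 acuity, in cycles per degree

for name, ppi in [("Chromebook Pixel, 239 ppi", 239),
                  ("11.6in 1366x768 panel, ~135 ppi", 135)]:
    ppd = pixels_per_degree(ppi, 20)
    nyquist = ppd / 2
    marker = "above" if nyquist >= ACUITY_CPD else "below"
    print(name, "->", round(ppd), "px/deg, Nyquist limit",
          round(nyquist), "cycles/deg,", marker, "20/20 acuity")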
Except....
Your eye is a high-pass filter, i.e. the extra effort spent shrinking the pixels comes to nothing the moment the pixels move around or contain diagonal lines and curves.
Then you have to anti-alias them. So at some point there is a trade-off between physical display density, how your eye works and computation, with a point of diminishing returns on making the pixels smaller.
"Morons like streaky are why screen resolutions haven't increased in 10 years"
Actually I believe you misread what I actually said. I want an ~8K 16:10/30" monitor like yesterday. Apparently I belong to a small group that knows it's already possible and the panel cartels are the ones stopping it.
I was arguing against a tiny screen being useful for everything ever. When Torvalds has RSI and is part-blind in 10 years and is bitching about it on the kernel lists we'll be having words. I remember these sorts of things.
Tiny screens aren't really useful for much, unless you really, really need them. Anybody who writes lots of code and thinks they're super-useful patently needs their head testing.
You don't need to view everything 1:1 at 72dpi..
Games.. I don't see how this causes a problem, just use a lower resolution that's the same AR and have it scale up.
That said, I can't imagine many, if any, games have this AR as a selectable option, but I'm sure you can edit a config file or two to get it right. If not, as long as the system can letterbox it, a 1.6 AR option will suffice, losing you only 50 pixels top and bottom.
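For what it's worth, the arithmetic checks out; a quick sketch using the Pixel's published 2560x1700 panel:

def letterbox_bar(panel_w, panel_h, content_aspect):
    # Height of each black bar (top and bottom) when wider-aspect content
    # is scaled to fill the full panel width.
    return (panel_h - panel_w / content_aspect) / 2

print(letterbox_bar(2560, 1700, 1.6))     # 50.0 px per bar for 16:10
print(letterbox_bar(2560, 1700, 16 / 9))  # 130.0 px per bar for 16:9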
"Linux Lord Linus Torvalds is thinking about making Google's Chromebook Pixel his main computer – once he installs a proper Linux distribution on the machine, that is.".
Yet on that page you mentioned (link to Torvalds Google+ post) we read the following: "And it is a beautiful screen, to the point where I suspect I'll make this my primary laptop."
Nowhere in the entire article does he refer to any other computers. So how do you manage to conclude that he'll only use this laptop from now on?
If you read down the conversation thread, you'll find that he compares the Pixel to his Macbook Air (his current primary laptop, as world+dog knows), so even from that limited context, the article draws a reasonable conclusion. Knowing Linus, I can say with some certainty that whatever device he chooses as a laptop is going to become his almost exclusive work-related input device -- for a year or so, until something better comes along.
(Remember that he's working exclusively in Linux, so the idea of switching to another keyboard and monitor just to work on one of his other machines is a completely foreign concept to him. *NIX guys don't think that way. We use our favourite laptop as the input device for *everything*, and let X do what it was designed to do.)
"Remember that he's working exclusively in Linux, so the idea of switching to another keyboard and monitor just to work on one of his other machines is a completely foreign concept to him."
I see the point about X and remote access, but ... really? I don't care how good your laptop is, it will never come close to the quality of the keyboard and screens I have on my desktop. Portability is its only advantage.
For Linus and many other FOSS developers, our "work" isn't something that we leave at the office at 5pm. We carry it around with us, debugging code in the office at home, writing documentation on the train or in the coffee shop, answering email everywhere. Our other machines serve specialised functions (servers, compile farms, test beds), but don't have a keyboard or monitor unless we're testing GUIs on them.
This is precisely why Linus makes such a big deal about his laptop! This is why he complains so vociferously about poor keyboards and low-resolution screens. His laptop *is* his primary computer, and once he gets it set up the way he wants, he intends to use it everywhere instead of wasting time setting up multiple machines.
Sure, since laptop resolution has been crap lately, I've got dual high-resolution monitors in my various offices. But if my laptop didn't have a great keyboard, I wouldn't have bought it in the first place.
I don't think that there's any objective evidence that open-source applications are of any poorer quality than the proprietary ones they replace. In my experience with Windows, for instance, I've been forced to use a lot of really terrible applications that someone actually paid money for. And the Linux kernel itself demonstrates the excellent quality of software that can be produced by non-corporate FOSS developers.
But don't get me started on documentation! I'm not sure why the online help provided with Windows and Mac applications is regarded as a benchmark for quality. It's almost invariably terrible, too shallow to explain anything more than the basic functionality of the menus. You nearly always have to Google to find out how to do something unusual or complicated in Word, for instance. Compare that to the exhaustive documentation, tips, examples, and configuration information provided with a cross-platform FOSS application like Lyx. Or the Linux kernel documentation. Yes, FOSS documentation is very inconsistent (the documentation for LibreOffice is as bad as MS Office), but IMO its average depth and completeness is as good as commodity commercial products.
Just as well someone wrote some FOSS code for IBM to get involved in and develop their OSS code on, wouldn't you say?
Which is surprising really considering that it was all so badly documented that it wasn't usable for proper work... like developing a commercial platform on... oh hang on...
Methinks they translated Linus just fine. "Main" does not equate to "only". As the article uses them, main and primary are basically synonymous. Granted, the article slightly over-reached by widening Linus's statement from laptops to computers in general.
Because Linux does not have a good business model.
In contrast, Apple may stomp on human freedom and Microsoft may produce gawdawful software while dumping all of the harms on end-users who never had a chance to say no, but you have to admit that their business models work quite well.
Time for a better idea: reverse auction charity shares. (I feel like a broken record, but not nearly as broken as the Linux-related so-called business models.) Linus Torvalds has effectively proven by demonstration that just because you build a better mousetrap, it does NOT mean the world will beat a path to your door.
..."business models" aren't what motivate Linus or most of the other brilliant Linux developers.
In 2000, Steve Jobs tried to hire Linus (away from Transmeta, at the time) into a rather senior role. Linus said no. At the time he clearly recognised the business sense in Steve's plans for OSX and was confident that it would be a success -- but it wasn't the sort of thing he wanted to be working on.
Marc Ewing's Linux distribution has turned into a successful business model -- but that's because of non-techies Bob Young and Jim Whitehurst.
It doesn't matter what motivates Linus; what matters as far as this article is concerned is what motivates PC makers. The answer is, of course, money. Linus even answered it himself by saying he's going to have to load a Linux distro to turn it into a 'proper' computer. Like it or not, most of the PC world still uses Windows, and as long as they do, PC makers will do just fine.
This is intended as a consolidated reply to all of the replies related to my original comment about business models. My apologies if you feel that I missed your point and should have included a response. (However, this forum is so poorly structured for conversations that it is unlikely anyone will notice.)
I'm not saying that Linus Torvalds is wrong because money doesn't matter to him. Actually, it would be more accurate to say that he is lucky enough to be in a position where he doesn't have to worry about money. Most people are obliged to care significantly about their sources of money. One of the features of 'reverse auction charity shares' as a business model is that more people could get involved--as long as they could convince other people to help with the money side. (I used to be a professional programmer, but I've sort of risen in the world and I don't even want to program now, but I would be glad to help pay for other people to program things. In particular, there are a couple of smartphone apps I'd like to support...)
The non-monetary "passion of Linus" is a good thing. Recent evidence was his impassioned rant against a certain company's certificates within the kernel. His passion is why he cares and it drives him to make Linux better. However, money still matters.
My own take is that people who want to do charitable work should be encouraged to do so, but not by requiring them to starve to death. I do think they should probably work for lower salaries, but that is partly because they are doing what they want to do. In other words, they should sacrifice the maximum possible salary for their skills in exchange for their increased freedom to do what they want to do, but the pay should not be zero.
I'm speaking as someone who has paid for shareware a number of times over the years. Not sure of the statistics, but I'm pretty sure I've been a more frequent 'donor' than most people. Most of those products disappeared, and mostly without replacements. Some of them were pretty good, but if they were viable for commercial software, then the commercial versions eclipsed the non-commercial versions. If they were not commercially viable, the good-natured programmers eventually lost interest.
In contrast, there are some shareware programmers who are just hoping to strike it rich. That's another terrible business model, more like lottery tickets.
Let me reiterate my main point: Money matters and a bad financial model can negate the best idea.
Most companies use open source software to "outsource" part of their own internal software development. Why bother making your own kernel/ip stack for your router when you can just use Linux.
It just makes sense to contribute to a larger project instead of creating that large project all by yourself and locking it away.
That is by no means something new. When colour TV came to Germany, manufacturers designed a "common chassis" for all colour TV sets which manufacturers would then license and build. The only difference with open source is that the monetary aspect disappears. So instead of charging symbolic fees you charge nothing.
I don't think that participating in a FOSS project can be described as "outsourcing". Rather, it is a cooperative business model for these corporations. They have realized it is better to join forces on a massive engineering project instead of creating several crappy alternatives.
The success of the Linux kernel has already validated that strategy. And, cooperative models of doing business have been around for a long, long time and will continue to be.
Now let the kneejerks roll in their "it's communism" bullshit.
The small percentage of (F)OSS projects(1) that can be considered "high quality"(2) are typically either academic stuff paid for by our taxes, or spin-offs/dual-licensed projects with "the community" serving mostly as QA and minor code monkeys while companies use the results as the core for commercial systems. That is a business model that works - just ask any sweatshop owner.
(1) Most are actually NOT GPL "semi-free" stuff but really free BSD or Apache licences
(2) And this includes the WSB-type "80 percent clone of commercial product x" stuff
"Linus Torvalds has effectively proven by demonstration that just because you build a better mousetrap, it does NOT mean the world will beat a path to your door."
This is true. I love my Linux machine, worked as soon as I downloaded it 2 years ago. But there's no money in giving away good product.
You cannot comfortably use Windows on a high-resolution display. First, if you enlarge your fonts, your windows will look odd (most Windows toolkits don't scale); then you end up either with ridiculous full-screen situations where only a fraction of the window is actually used, or with one of the many non-resizable windows giving you a tiny little view into something larger (e.g. a log file) that you cannot enlarge.
Most Windows applications simply were designed in the 1990s. Back then 640x480 was still a common resolution and more than 1024x768 was virtually unheard of in the PC crowd.
While I am a fairly happy windows user, I do have to agree with that. Legacy applications especially tend to assume screen resolutions. But then again, so do web sites. Which is plain crazy. Especially with widescreen being common now (I'm not going there re: popular or unpopular). Seeing a website taking up a strip 1/2 a screen wide with 2 huge blank vertical white spaces either side looks ludicrous.
I know M$-haters tend not to worry too much about facts, preferring the geek equivalent of 'what my mate in the pub told me', but claiming Windows apps haven't changed for 20 years is bollocks on an Eadonesque scale. It'd be like me saying 'Macs haven't changed for 20 years because they're called Macs HURRR'.
Windows XP and Windows 7 work beautifully on my hi-res displays.
Erm, if you whack up the DPI then it helps. But the problem is people who write bad software, not so much the APIs.
Most people don't want things to scale up, they want more real estate. But there is a limit as to how small things can get.
OSX's display didn't scale well either until Mountain Lion added the doubling mode.
Really, it's true. The "best" of almost all of them (excluding Apple Retina) are 1080 lines; that is poorer than my 17" CRT monitor of 10 years ago! Almost gone are the 1200-line models and the 4:3 aspect ratio, both much better for everything except watching DVDs. But TV panels are cheap, so the computing industry has gone over to using TV screens instead of 'better' ones.
I would be perfectly happy with normal (i.e. non-retina) resolution if I could get more height, but today you pay stupid prices for an "ultrabook" that has less display area than my father's ancient el-cheapo 15" 4:3 laptop.
I for one am not buying!
You're not wrong.
Both you and Linus T. are not wrong.
My work laptop is a pretty high-end HP EliteBook 8760w mobile workstation. I love it as a bit of hardware, even though it does weigh slightly more than the Moon.
My real point is that it is not a cheap bit of kit - and the screen it comes with is an HP DreamColor 1920x1080 panel, which in and of itself isn't bad at all, and is usable at that native res on a 17" panel.
The previous model (the 8740w), however, came with a 1920x1200 17" panel. The newer model is merely a refresh of the older one, less than a year after the original. But they changed the screen.
On such a high-end piece of kit, why on earth would they do it? Other than the obvious reason to bump their profit margins even higher...
The obvious reason is to bump their profit margins even higher.
Apple has panels nobody else uses, almost all are 8:5 / 1.6:1 AR, so clearly Apple thinks it's worth getting panels in the correct resolution, size and ratio for their machines.
Everyone else does not, and uses off the shelf 16:9 panels.
I think it's disgraceful that Apple has exclusive access to certain panels. CrApple is a major contributor to the problem through their obnoxious bully-boy exclusivity tactics. LG (I'm assuming it is LG causing this nonsense?) or whoever should be selling these more widely, as it is hardly crApple that is manufacturing the panel.
I'm desperate for a new laptop, but am still using an old Acer from 2006 with a beautiful 16:10 LG-Philips panel. I refuse to buy crApple and refuse to buy a new laptop with a worse screen than my ancient machine.
All I want is:
16:10 17" or 15"
Colour accuracy covering the full sRGB gamut (IPS etc.)
Decent resolution, but it doesn't have to be "retina" (though it's disgraceful that smartphones and tablets are putting laptops to shame in this regard)
SSD
Good keyboard and track pad
>5 hour battery life
How difficult is this, laptop manufacturers?
Actually the Pixel, at 1.5 kg, is quite a bit HEAVIER than a Surface Pro, will need a fan since it uses an i-series CPU, and does not offer any benefit except the display(1). And "Retina"-style displays are still debated. For the price and weight of that thing I can get a Sony Vaio Duo with more memory, a longer runtime with the sheet battery, and more capabilities.
(1) LTE may or may not be useful. Depends on whether it works worldwide and you have a usable network. I prefer a MiFi or tethering to a mobile.
Managing to get Google to actually deliver any hardware from the play store is virtually impossible - here in the UK anyways!
So, at least Google is delivering to someone!
Backstory is that I tried to order a Nexus 7 on 14th Feb, yet despite the promise of a fondleslab within 5 days,
15 days later - after daily calls met with lies, deceit and general user contempt from Google frontliners (who, by the way, cannot call you back or even be called directly, and can only email other departments and wait 24-48 hrs for a response), and one re-order - the fondleslab had not even been allocated and was in exactly the same status as it was after I pressed the purchase button!
So, good luck to Linus if he can actually get them to deliver.
As for me, I gave up and bought an iPad from a local bricks and mortar store instead.
FUD from the Penguin farm again. There are quite a few netbooks and at least one Ultrabook available with Pingucrap OS preinstalled. Just ask Dell, Acer etc. And companies / non-private users(1) can get them "bare" and install their own.
(1) EU laws say "machines sold to private users must include an OS". Since 92+ per cent use Windows, they mostly come with Windows.
"EU laws say "maschines sold to privat users must include an OS""
Can you be specific about that?
AFAIK this was something MS insisted on to get good DOS/Windows pricing for OEM deals, which of course all the big PC vendors want. You can buy a PC from small custom-build PC shops without an OS, so I smell BS here.
You can buy those machines because you actually buy the parts and pay the shop to assemble them; you do not buy a pre-assembled box. Try the same at one of the chains that sell complete systems. Or look at some netbooks that come with the basically useless FreeDOS to fulfil the law. Linux would work as well.
I love my Asus 1201n netbook. Admittedly with a dual core CPU and NVidia video it is not as lame as the original generation. I will never buy a notebook that weighs more than 4 pounds again.
But there is still a market for small light low-powered units. Just maybe not at the $400 price point.
Mark
Widescreen is nice for watching movies... mostly. Maybe okay for playing certain games. It sucks for normal computing tasks. How this became a fad I'll never know. Must be because it was perceived as "ach-dee!" We're probably lucky that it's hard to manufacture circular or triangular displays, or we'd have televisions and monitors that look like something from Star Trek episodes if the idiots who market these things had their way.
Widescreen is quite useful for learning certain things on - programming, say: you can have your IDE on one side and the instructions on the other and not have to swap between them all the time. Mind you, if the instructions had been written in re-sizeable HTML and not fucking paper-shaped in the first place, you wouldn't need to go through that shit and a normal screen would more than suffice!
There are no magic one-size-fits-all resolutions. 2560-by-1700 may make sense for graphic artists, but I highly doubt it does for compiling, and frankly it's a super dumb idea to think "retina" is always better than FHD.
One can find many real-life scenarios where "retina" is actually worse than FHD: browsing will offer much laggier and jitterier scrolling than on full HD, for example, and it will do so while consuming more power, which isn't ideal for laptops.
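The raw pixel counts make the scrolling/power point concrete; a trivial sketch, with 1920x1080 standing in for a typical full-HD laptop panel:

pixel_panel = 2560 * 1700   # Chromebook Pixel panel
fhd_panel = 1920 * 1080     # a typical "full HD" laptop panel

print(pixel_panel / 1e6, "MP vs", fhd_panel / 1e6, "MP:",
      round(pixel_panel / fhd_panel, 1), "x as many pixels per frame")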
Many apps don't work well at all with very high res (Adobe CS6 before the special patch, for example), although I hope that Microsoft and Apple will work on much better scaling algorithms in their next releases of Windows and OSX, which would help in most situations.
Where I do agree with the Great Swearing Dictator is that PC manufacturers mostly produce crap today.
They should stop whining and start producing good PCs for a change. If they think their me-too, zero-innovation(TM), let's-assemble-crappy-components(R) attitude will suddenly produce miracle profits just because they are now selling tablets, they are in for a rude awakening.
You can boot other Linux distros, even install Windows 7 if you want, or Linux, or put it back to a Chromebook.
If you buy a Windows 8 laptop, that's not an option, as Microsoft are locking down the market with EFI Bios bootloader signing, which ensures a Windows 8 laptop will only boot Windows 8.....
Perhaps that's why the PC market is dead.
FUD and lies again. You must be able to switch Secure Boot off to get the "Win8" sticker. So all Win8 laptops and tablets currently sold - and this includes the Surface Pro! - can run other operating systems.
Most people look at their perfectly working W7 or W8 system, then look at the list of "stuff that won't work under Gnuliban OS" - and keep using Windows.
Unlike Windows and Visual Studio, Linux and the gcc toolchain can work properly on very lean systems. You can run all of them in a meaningful way on less than 10 GB of disk space.
You do NOT need to download gigabytes of .net frameworks, SQL servers and similar crapola just to get gcc going.
There was a time people did software development on machines with a 1200MB hard disk. This is impossible in the commercial World Of Bloat, but certainly not on Linux if you rip out all the useless stuff of Shuttleworth and DeIcaza.
Back in the old days we did software development without a hard disk! Or with 9MB hard disks for complete systems! (Heck, I even used a hardware front panel on a Siemens R30 once or twice, and real TTY printer/keyboards a few times.)
And we happily left that stuff behind for IDEs and source-code debuggers with breakpoints and other productivity-enhancing environments. Maven/ant/make are fine for the CI box. But not for development.
"And we happily left that stuff behind for IDEs and source-code debuggers with breakpoints and other productivity-enhancing environments."
Some of you have. I still hate IDEs, and I've used dozens of them, on everything from MS-DOS to zOS.
The day I see an IDE that has the power and convenience of a bunch of windows running a decent shell (eg ksh or bash), vim, a good standalone debugger, and a proper build system ... I'd be looking at something identical to my current arrangement, so there wouldn't be any reason to switch. My IDE is the whole damn OS and tool set. Bundling feeble substitutes into one sluggish monolith is not an improvement.
"Maven/ant/make are fine for the CI box. But not for development."
Perhaps for you. I find it much easier to coax my makefiles into doing what I want than convincing some IDE black-box build process to stop doing something stupid. (Visual Studio is particularly strong in this regard - its stupidity is positively brilliant. Ah, the number of times I've had to turn off that idiotic "local copy" option on a project, or hack a project file to force VS to stop assuming there should be a "designer view" on a source file simply because its name contains a certain word.)
Linus Torvalds does not understand that most of the PC manufacturers have very small profit margins on the "crappily made" hardware, since they must sink significant dollars and other resources into supporting all the calamities of the Windows operating system (OS) software that purchasers expect of them, not of Microsoft.
There is also no incentive to make a great laptop like the Google Pixel only to run a second-rate OS that does not justify the investment. Sort of like building a luxurious, beautiful home, only to furnish it with less than really good appliances, furnishings and fittings.
This idea that the desktop is dead seems to be passed around by hacks (like it's some kind of contagion) who appear to repeat this blather without thinking it through.
I can appreciate if you are a casual user (the odd email, social networking, y'tube, etc, the odd little game) then tablets et al are absolutely ideal, certainly better suited than a comparatively cumbersome desktop.
But if you're in front of a screen most of the day then it should be legible and large enough not to cause eye strain, and of course the device needs to be powerful enough to do the work required, which is why a tablet/notepad is not a workstation.
For gamers, music producers, video producers, graphics artists, 3D artists, and more, a large screen (or several), powerful CPU(s), large fast storage, and powerful graphics are usually viewed as 'the more the merrier'. No one wants to spend hours squinting at a small screen and waiting ages for things to render - which is why people who need desktops use desktops.
You can understand why this may not occur to casual users, but it's akin to someone who only uses their little Ford Fiesta to pootle down to the local shop and back and occasionally trundle into town stating that F1 or WRC is dead simply because it doesn't figure in their experience. Go ask people who need desktops if the desktop is dead and you'll get a 'NO' every time.
PS - 16:10 wins over 16:9 (or even 4:3) every time, for obvious professional reasons
Not sure why he'd get the Chromebook Pixel rather than a MBPro 13" Retina, especially if he's going to strip the OS and put Linux on it? MUCH better hardware and battery life, more storage, faster and more modern I/O (USB 3, BT 4, SD/XC, Thunderbolt), more OS options, more RAM. The only thing the CBP has over the MBPro is the touch screen, and I don't see that being too useful. I guess if you're Linus, adapting a touchscreen driver may be easy, but I have yet to be convinced of the usefulness of a touch screen on a non-tablet device.
The MBPro might have better hardware and battery life, but it will be of no use if Apple does not publish hardware specs so Linux people can write decent drivers. This is the problem behind what you could call poor hardware support in Linux: the manufacturer releases Windows drivers but does not bother to do the same for Linux, and does not want to publish specs for other people who would be more than happy to write drivers for them.
YES!! A screen in between 4:3 and 16:9!!! YES!
Sorry, I love my old 14" 4:3 SXGA+ (1400x1050) laptop screen and keyboard. I HATE the HD-ready 1366x768 laptop screens. To get similar space you need a much bigger screen in the 16:9 format, which translates into a far heavier laptop.
3:2 is in between the two. Seems like a decent compromise.
Yay! Now I wonder if the keyboard's any good...