I love the 13" form factor, but dislike the terrible resolution.
Fancy a 3840 x 2160 display in your next 13in laptop? Form an orderly queue outside Sharp's offices then, and loudly demand it turns its latest prototype panel into shipping product. The 13.5in screen contains just under 8.3 million white OLED pixels filtered for RGB colour. Its dimensions yield a pixel density of 326 pixels …
Even Sharp's documentation is vague on whether that's really 8.3 million white pixels filtered through a lower-resolution RGB filter of some sort - which would make it, at best, 1280x2160 in reality with an RGB stripe, or roughly 1920x1080 with PenTile RGBG - or something different.
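A quick back-of-the-envelope check of the quoted figures - a sketch only, assuming the 13.5in measurement is the panel diagonal and that an RGB-stripe arrangement uses three subpixels per addressable pixel:

```python
import math

# Figures from the article: 3840 x 2160 on a 13.5in diagonal (assumed).
w_px, h_px, diag_in = 3840, 2160, 13.5

total_px = w_px * h_px              # total addressable pixels
diag_px = math.hypot(w_px, h_px)    # screen diagonal measured in pixels
ppi = diag_px / diag_in             # pixel density

print(f"{total_px / 1e6:.2f} Mpix at {ppi:.0f} ppi")  # ~8.29 Mpix, ~326 ppi

# If the 8.3 million figure instead counts white *sub*pixels behind an
# RGB-stripe filter, the effective full-colour horizontal resolution drops:
print(f"RGB-stripe effective resolution: {w_px // 3} x {h_px}")  # 1280 x 2160
```

The numbers line up with both readings, which is presumably why the spec sheet leaves room for doubt.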
The web has always been designed to scale from tiny resolutions to large ones. That's the whole point of HTML; otherwise you could have gone for a simpler standard.
So web designers will have to learn not to be idiots and specify sizes in something other than pixels, and images will simply be scaled. Once your output medium has a good enough resolution you won't care. (Just look at printers: you hardly ever print a picture at the printer's native resolution.)
The end result will be a more usable web, one that can be used on both your mobile phone and your retina display wall-sized desktop :)
OK, it would be nice to have a screen that could actually display a 1:1 rendition of the multi-megapixel snaps that our hyper-giga-sooper-megabyte cameras and phones (complete with their mass-produced, fixed-focus little plastic lenses) can take. But that's about it. All that will happen then is people will begin to see the Emperor's New Clothes of a 14Mpix camera that is bugger all use if the shot isn't perfectly focused, and taken with a decent lens (read: costs more than the camera) with a noise-free image sensor, and no camera shake.
So far as looking at internet porn pictures goes, unless they get re-scaled to a suitable DPI, which obviates these extreme resolutions, they'll be about the size of a postage stamp. Text, likewise.
As for movies, even 1920x1080 formats will need to lose the benefits of all those millions of pixels just to fit properly on the screen - unless you're planning on watching 4 movies simultaneously.
Finally, who actually has eyes that could discern such high resolution? Sure, if you have eyes like an eagle and are viewing in a well-lit (but reflection-free) environment then you might possibly get some benefit from an 8MPix screen on a 13-inch display. But for ordinary people - with or without fully corrected vision, viewing from a sensible distance - this seems like a "we'll do it because we can" strategy, just like the megapixel marketing campaigns are with digital cameras.
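Whether anyone can see the difference comes down to viewing distance. A rough sketch, assuming the commonly cited ~1 arcminute of angular resolution for 20/20 vision and the panel's quoted 326 ppi (both figures are assumptions, and real-world acuity varies):

```python
import math

ppi = 326              # pixel density quoted for the Sharp panel
acuity_arcmin = 1.0    # assumed angular resolution of 20/20 vision

pitch_in = 1 / ppi                          # pixel pitch in inches
theta = math.radians(acuity_arcmin / 60)    # acuity as an angle in radians
distance_in = pitch_in / math.tan(theta)    # distance where one pixel spans 1 arcmin

print(f"Individual pixels blend together beyond ~{distance_in:.1f} inches")
```

That works out to roughly 10.5 inches, i.e. closer than most people hold a laptop, which cuts both ways in this argument: the pixels are invisible at normal distances, but so is any further resolution increase.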
1920x1080 is quite simply too small.
My previous laptop had 1200 vertical pixels, and I really notice the missing screen space in many applications.
Higher screen resolution also lets you make writing easier to read - compare this text drawn with 5x7 pixels to the text on your current monitor.
Now make the dots smaller while keeping the text the same physical size (using more dots) - again, it becomes nicer to read.
When we look at a 1080p movie, the upscaling from movie pixels to screen pixels again affords the possibility of nicer pixels - when the image is in motion the scaler can estimate what the 'missing' pixels would have been, increasing the effective resolution and making for a nicer movie experience.
Secondly, if the film is instead digitised at a higher resolution we'll get the real detail in the film - maybe even as high as the digital projectors used in the cinema.
Movie houses aren't going to sell those discs/licences until enough people have these 'above-HD' resolution screens for it to be worthwhile.
I'm sure there was a study done by IBM years ago that showed that most people need a resolution of about 300dpi to be able to comfortably read text for an extended period. Most printers, even silly cheap ones, do more than that, yet screens have been stuck at a miserable 96dpi forever.
While there will be some changes required, I for one welcome our high-resolution overlords.
"Retina" displays have 4-6x the pixels of a normal display, which is fine and dandy, but it means you need a GPU with 4-6x the horsepower and 4-6x the video RAM just to drive them at their native resolution.
And for that you get a desktop with teeny tiny windows, because the dimensions are effectively halved in each direction. So you're forced to use bigger fonts to compensate, and then you get screwed-up layouts and other glitches. And forget running games at this resolution, since most games would chug at far lower resolutions than that.
So I'm not altogether sure what the point is, unless Sharp intends to stick some kind of aperture grille over the top and use it to show 3D. I suppose the extra resolution would compensate for the need to send half the pixels to each eye.
Don't worry about the GPUs in desktop or mobile PCs; they can deal with high resolutions quite happily in 2D and have plenty of RAM. Even an entry-level integrated card can handle a 4000 x 4000 framebuffer - a single 32-bit buffer at that size is only about 61 MB.
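The framebuffer arithmetic is easy to sketch, assuming the usual 4 bytes per pixel for 32-bit colour (compressed or tiled framebuffer formats on real GPUs will differ):

```python
def framebuffer_mib(width, height, bytes_per_px=4, buffers=1):
    """Raw framebuffer memory in MiB, assuming 32-bit colour per pixel."""
    return width * height * bytes_per_px * buffers / 2**20

print(f"{framebuffer_mib(3840, 2160):.1f} MiB")             # one 4K buffer, ~31.6 MiB
print(f"{framebuffer_mib(4000, 4000, buffers=3):.1f} MiB")  # triple-buffered, ~183 MiB
```

So a single 4K buffer is trivial even for old integrated graphics, though triple-buffering a 4000 x 4000 surface would overflow a 64 MB card several times over.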
3D games will scale - that is, render at a quarter of the pixels and use 4 screen pixels for each game pixel. Not a problem.
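That 4-pixels-per-pixel scheme is just nearest-neighbour 2x upscaling, which works out exactly because 3840x2160 is precisely double 1920x1080 in each direction. A toy sketch on a list-of-lists "frame" (real GPUs do this in the scan-out hardware, not in Python):

```python
def upscale_2x(frame):
    """Nearest-neighbour 2x upscale: each source pixel becomes a 2x2 block."""
    out = []
    for row in frame:
        doubled = [px for px in row for _ in range(2)]  # double horizontally
        out.append(doubled)
        out.append(list(doubled))                       # repeat row: double vertically
    return out

frame = [[1, 2],
         [3, 4]]
print(upscale_2x(frame))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Because the ratio is an exact integer, no pixel is ever interpolated, so a 1080p game stays as sharp on the 4K panel as it would on a native 1080p screen.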
These high-res screens mean progress. You may not see the point, but I am literally seeing it on today's low-res screens.
Maybe the problem is your printer? Maybe its physical resolution is 300dpi and it interpolates anything higher.
Or maybe the problem is your images' resolution. If you print small (size-wise) images they are going to look like cr*p even on the nicest dye-sub printer.
Er, the difference between 150dpi and 600dpi is really obvious. Even 300dpi looks jagged and crappy from a normal reading distance. 600dpi is basically the minimum; more is better if you're printing color, so you can get a good lpi. Sounds like you're in need of glasses.
Biting the hand that feeds IT © 1998–2022