And they will come and they will buy.....
Because it is written in the Holy Book of Job
In addition to updating its existing MacBook Air and MacBook Pro lines, Apple has introduced what it's calling the "next generation" MacBook Pro, complete with a 15.4-inch, 2880-by-1800 pixel, 220ppi "retina display". "It's the most beautiful computer we have ever made," said Apple marketing honcho Phil Schiller during the …
Internal display. T60 with 1440x1050 display can be upgraded to 2048x1536 using the IAQX10N TFT panel. Pretty easy upgrade, too. Just make sure you get a T60 with 1440x1050 screen to start with - otherwise you'll also have to change the backlight inverter. You may also need to re-flash EDID on the new TFT panel to get it exactly right, otherwise some modes might not work.
If it was 1/2 the price and came with XP I'd be sorely tempted, or maybe Linux. Or was simply 1/2 the price and let me triple boot or VM those.
8:5 aspect screen if square pixels (16:10). A bit better than 16:9 for Internet, email, PDFs, CAD and schematics. (1.6, vs 1.78 for 16:9 and 1.33 for 4:3.)
I am just trading in my crappy 5 yr old work HP with 1650 px for a spiffy new work Dell (if there is such a thing) with... 1366 px. & I had an Acer Ferrari 5 yrs ago with 1650 too.
This resolution thing is crap in 2012. Cheap kit with cheap screens. Not everyone wants to hook up to an external monitor.
Yeah, but unfortunately it'll look like jagged shite unless Apple's willing to license the patent for their super-obvious idea of handing out the appropriate one of multiple fixed resolutions for UI widgets to applications.
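For what it's worth, the "super-obvious idea" amounts to reporting a halved logical resolution to applications and scaling their drawing up by an integer factor. A minimal sketch, where all names are illustrative and not any real windowing API:

```python
# Sketch of the "backing scale factor" approach to HiDPI rendering.
# Function and variable names here are illustrative assumptions,
# not any real windowing API.

PHYSICAL = (2880, 1800)   # actual panel pixels
SCALE = 2                 # integer backing scale factor

def logical_resolution(physical, scale):
    """The resolution reported to a legacy application."""
    w, h = physical
    return (w // scale, h // scale)

def to_device_pixels(point, scale):
    """Map an app-space coordinate onto panel pixels."""
    x, y = point
    return (x * scale, y * scale)

print(logical_resolution(PHYSICAL, SCALE))  # (1440, 900)
print(to_device_pixels((100, 50), SCALE))   # (200, 100)
```

Because the factor is an integer, UI widgets land exactly on pixel boundaries instead of being resampled, which is what avoids the "jagged shite" look.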
This post has been deleted by its author
It gets worse. Try configuring a Dell XPS 15 or XPS 15z and basically you just cannot get what you want. One version has the processor but doesn't let you upgrade the screen resolution past the now ridiculous base level. Another lets you upgrade the screen but not the processor past an i5. It's just pointless.
People will buy them 'cause the equivalents (Sony Vaio for example) are both more expensive and marginally lesser specced (max 12GB of ram, max 512 gig SSD if you buy from Sony, lower screen res etc etc etc).
It's also heavier, thicker and won't run OS X.
Well done Apple, I'm going to be £2.5k worse off by the end of the day. : )
This "retina display" stuff is pure hype when applied to laptops because of the typical eye-to-screen distance. With an iThing, it kinda made some sort of sense, because of the "squint at the screen from a distance of 6 inches" use case, but with a laptop that's not ever going to be relevant.
Consider: my current relatively-bog-standard laptop offers a 1920x1080 screen, and the individual pixels are thoroughly invisible as it is.
Beyond the nonsense of the "retina" tag, note that Apple's 2880x1800 display only really makes sense EITHER as a doubling of 1440x900, which makes for nice clean graphics at a less-than-stellar effective resolution, OR with customised applications that use oversized fonts and icons so you can actually read the blasted things at the native resolution (hence the otherwise odd remark that Adobe and Apple were working on new software "to take advantage of" the new display, which should be read as "to be usable on" it).
Fun point to ponder: what will that display do to 1080-line video? It'll have to scale by 166%, which basically means taking a group of three source pixels (A-B-C) and creating 5 display ones (A-A-B-B-C), which will probably look less-than-totally wonderful! (Sure, if you have the GPU horsepower, you can produce intermediate pixels A-AB-B-BC-C, but that will create some subtle banding too.) So it looks like MBP folks are doomed to HD video in windows, not full screen...
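The 5:3 mapping described above (three source pixels becoming five display pixels) is easy to check. A quick sketch, with an interpolating variant for comparison:

```python
# Nearest-neighbour upscale of source lines to display lines at a 5:3
# ratio (1080 -> 1800), illustrating the A-A-B-B-C repetition pattern.

def nearest_neighbour(src, dst_len):
    n = len(src)
    return [src[i * n // dst_len] for i in range(dst_len)]

print(nearest_neighbour(list("ABC"), 5))  # ['A', 'A', 'B', 'B', 'C']

# A linear filter blends neighbouring source pixels instead, trading
# the blocky repeats for softening and the subtle banding noted above.
def linear(src, dst_len):
    n = len(src)
    out = []
    for i in range(dst_len):
        pos = i * (n - 1) / (dst_len - 1)  # map endpoints to endpoints
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        out.append(src[lo] * (1 - frac) + src[hi] * frac)
    return out

print(linear([0.0, 100.0, 200.0], 5))  # [0.0, 50.0, 100.0, 150.0, 200.0]
```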
Nope: we know what "HD Mk II" looks like, because it's alive and well in the content production world (i.e. "Hollywood"). "2K" is either 2048x1152 for 16:9 HDTV aspect ratios, 2048x1536 for 4:3 aspects, or 2048x856 for Panavision ratios. "4K" is double those.
Due to the pain suffered getting from 4:3 to 16:9 in broadcast, I wouldn't bet on 16:9 going away anytime soon!
The vast majority of laptops come with a 1366x768 display - and most companies don't let you configure your own laptop anymore either, at least with meaningful options.
I think only HP, Dell and Sony do, out of the big-name ones. And HP has a miserable reliability record, Dell's consumer-targeted systems aren't much better (in reliability, per Consumer Reports), and Sony is more expensive than the competition for a comparable product....
We wouldn't be pissing and moaning in the comment section for every article about a laptop release, whining about the crap resolution if they came with 1920x1080!
Dude: if "only" HP, Dell, and Sony offer 1080p laptops, who is left? Remember, we're talking about competitors to an Apple product, so any second or third tier vendor isn't germane to this discussion.
Beyond that, why do you think I used the word "relatively"? Bottom line is that you can get a 1080p laptop if you want one.
(And incidentally, I've not had reliability issues with HP's higher-end laptops, although I have heard of issues with the more consumer-oriented ones!)
My laptop is only 1280*800, but at 12" that's fine for me.
The funny thing is, back in 2003 I got a laptop with a 15" 1400*1050 screen, and that wasn't all that unusual at the time for mid-to-high-end laptops. The equivalent now really would be unusual, though - that's effectively more than the resolution these "retina" displays are doubling.
And in 2008 I had a smartphone with what would now be classed as "retina" DPI - no one made that much of a fuss about it then.
It's a 15" laptop - at that screen size you may as well play SD video. I don't get the fuss about video - watching on a 15" screen is hardly critical viewing. If you're editing then presumably the video is in a window as you've got app chrome.
If you're watching are black bars really an issue? If you really care about getting the most from a HD video surely you'd not be watching it on a 15" laptop. You can pick up a 40" full HD screen from a decent manufacturer from Richer Sounds for sub £500.
Ouch, the SSD will push the cost up especially since 256gb is very tight for us video editors. So it would have to be a 512gb which takes it to the level of my 2008 Macbook Pro. So much for progress!
Looks like Apple are also going to be fitting proprietary SSDs. Presumably they've patented these to the hilt so that Crucial and the like can't produce cheaper clones.
All very expensive but then again my 2008 Macbook Pro has lasted me twice as long as any Wintel laptop I've ever owned despite getting the same constant usage. So Apple must be doing something right. The only reason I want to upgrade is for a speed and memory boost in CS6. 4GB only goes so far these days.
I know it's not ideal, but I think the idea is more that you use very fast Thunderbolt-attached storage (as shown on their site) to edit your video on, leaving the OS + software (+ caching?) to run on the SSD. Otherwise you'll always be short of either space or cash. From their bumf, the SSD is there to help eliminate bottlenecks and allow for a faster pipeline. I doubt they'd intend you to edit HD video on it, for space reasons alone. Pricing will be interesting though. Memory is where I think they'll jam it in you, as it looks like it's fixed in place, so it's a one-shot opportunity at purchase.
Doesn't quite work like that. I wouldn't edit from the internal HD, but you do have to have all your apps and stuff installed. A lot of it is quite heavyweight, and when you have plugins etc installed it gets tighter.
On a 512gb system I generally run with 100gb free. The old 256gb setup was tight.
If they offer a 320gb option, then that might be possible. However you need to leave yourself enough leeway for whatever Adobe etc have plans for in the future.
They're not exactly bespoke SSDs; on previous 2010/2011 MacBooks they've traditionally been Intel SSD units with a few tweaks, like the MacBook's thermals being read from the thermal sensor on the HDD/SSD drive.
You can fit a third-party SSD/Hybrid/HDD into any MacBook Pro, but this usually results in the MacBook's cooling fan running flat out all the time. Some people can live with the [barely audible] extra fan noise; others are mortified by it.
Cooling fan? Not really. I've got an Intel 600GB SSD in my MacBook Pro; the fan hasn't come on since I put it in, and I haven't seen a beachball either. Best purchase I've ever made, bar none - if you work in IT and earn more than minimum hourly rate, it'll pay for itself in a matter of months.
Because of course, every editor is chained to their desk 100% of the time and is never required to, say, edit on location.
Of course you have the meaty setup back at the office. But you still need to be able to work on the move or on location. A quick turnaround video at a conference to be shown at the end of the day for example. And you still want the same tools you use back at the office.
With SSD it just means you'll have to fork out more for the kind of storage you like. It's not the end of the world.
Ouch! You *seriously* don't want to do video editing on SSD. Well, you can, if you don't mind premature bricking of your system drive. You're not even supposed to defrag an SSD (unless you're a moron). Imagine what gigs of edits are going to do.
This raises a question: Macs being, in general, the favoured platform for AV editing, how many peeps in that industry are being mis-sold these things, or are simply unaware of their unsuitability for same compared with good ol' spinning iron?
Yes, Pros actually utilise the Mac Pro - regrettably the words 'PROFESSIONAL' and 'APPLE' are not synonymous with each other anymore, epitomised by the 'piss poor' Mac Pro upgrade that is not an upgrade.
I'll not start on the iMac or Mac Mini, but it's clear Apple no longer give a toss about the desktop/professional user segment of the industry - they are more concerned with iOS toys that users upgrade on an annual basis to show off to each other. Not a great deal of creativity or design there.
Hackintosh looks more and more the way to go, unless Apple spins off the actual Macintosh part of the business - the part that is now ignored in favour of fashionable toys with little real-world use. Well, unless you are blind and happen to use Siri!
Yawn, more SSD FUD. You do understand that to calculate the MTBF rates they just batter the SSDs with data until the wear-levelling routines can't cope and all the cells' write capability expires.
Actual type of data is irrelevant.
It's no different to any other storage medium be it tape or disc - they all have levels of wear.
Regardless of technology you should have a robust backup strategy. Also, most of the Joe Schmoes I provide PC support for wouldn't know an SSD if it bit them on the ass, and most of them have had an unbacked-up HDD crash due to viruses and general abuse.
In short, bashing the SSD is getting old.
(happily running a pair of x18m intels in raid 0)
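To put rough numbers on the wear argument, here's a back-of-envelope sketch. Every figure (P/E cycles, write amplification, daily write volume) is an illustrative assumption, not a spec for any real drive:

```python
# Back-of-envelope SSD endurance estimate. All figures below are
# illustrative assumptions, not specifications for any actual drive.

def endurance_tb(capacity_gb, pe_cycles, write_amplification):
    """Approximate total host writes (TB) before NAND wear-out."""
    return capacity_gb * pe_cycles / write_amplification / 1000

def years_of_use(total_tb, gb_written_per_day):
    """How long that endurance lasts at a given daily write rate."""
    return total_tb * 1000 / gb_written_per_day / 365

tbw = endurance_tb(256, 3000, 2)   # assumed 3000 P/E cycles, WA of 2
print(tbw)                         # 384.0 TB of host writes
print(years_of_use(tbw, 50))       # ~21 years at 50 GB of edits/day
```

Even with heavy daily editing, the assumed drive wears out on a timescale far beyond its useful life, which is the point the wear-levelling testing above makes.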
Sorry. Total fail. My system drive *is* an SSD. I love my SSD. I'm not 'bashing' SSDs. Try to read the words that appear before your eyes. When someone is concerned about a technology being inadvertently mis-applied, it ain't 'bashing'.
Incidentally, from your pseudo-scientific prose, and your apparent belief that I was referring to data type rather than density over time of write-operations, I infer you don't really understand the tech as well as you should.
Oh but you DO - performance from recent SSDs is absolutely epic when editing/grading etc. I'm using some Intel Cherryvilles at the moment and RAIDed bandwidth is over 1GB/sec. You NEVER wait for the disc to catch up. TimeMachine your boot drive onto a nice RAID1 pair of spinners and you're safe enough.
What the hell were you doing to them?
I've been a laptop user since 1998, had around a dozen in that time (I have several on the go) and never ever had one fail on me. A few of them are still working with friends and family.
Look after your kit and it will look after you.
Installed Lion on my desktop at Christmas. Made a mental note to check everything was OK before installing it on the laptop.
6 months later I'm suddenly reminded I forgot to upgrade the laptop to Lion. I hadn't noticed. Kind of demonstrates the point that Apple have found a way to charge for service packs. I hadn't noticed enough difference to trigger the memory that I needed to update the laptop!
So I wonder what Mountain Lion will look like on the new Macbooks? More of the same?
"Except its not a service pack, those you get for free through software update."
It pretty much is a service pack. I've looked down the list of features for moving to Snow Leopard and they're almost all things that would be included in a service pack from MS for Windows (e.g. Windows includes support for newer processor features in a service pack; with OSX they were listed as a feature of a new OS) - things that have been available for a long time elsewhere. The only exception I can see is the genuinely useful upgrade to support MS Exchange.
Basically what makes it a service pack rather than a new OS is that the new "features" aren't new features, but small upgrades to performance. You get the new version of QuickTime. Seriously? That makes it a new OS? Ditto the new version of Safari. What else? Quicker wake from sleep? These are service pack things.
And you say that most people wouldn't notice the jump from Windows 2000 to XP? Or from there to Windows 7? Millions of ranting posters on the Internet disagree that there aren't major differences between these OSes. Windows 7 is massively different from XP. There are vastly more differences in an OS upgrade on Windows than there are between, in the case the user was talking about, Leopard and Snow Leopard.
Apple have found a way of charging for Service Packs. That's obvious to anyone who reads down the very padded feature list of upgrades for them here:
"Well with Snow Leopard they completely redid the Kernel Removing a lot of the legacy 32 bit stuff and replacing it with 64 bit stuff."
Yes, that's what I was actually referring to when I wrote about "things that have been available for a long time elsewhere". Also, it's neither a full shift to 64-bit (which would constitute a significant differentiator in an O/S change), nor no change. It's basically them saying: that "64-bit OS" that we gave you last time, was really largely 32-bit. I.e. they're not suddenly giving you a new version of an OS that is 64-bit. They're selling you the second part of an implementation of something you've already bought. Which to me says: "service pack", not "new OS".
Your comment about how "a new kernel makes it more of a new OS than rebadging" different versions of Windows shows a significant ignorance of how the OSes are put together. 2K to XP was a "new kernel" in your terms above, as they introduced 64-bit at that time (and they didn't subsequently charge extra for the second half of the job) - and a lot of other changes too. Your description of Win98 to 2000 as a re-badging, not worthy of the "new kernel" difference that you get from Leopard to Snow Leopard, is even more extraordinary as an argument, because 2000 derived from NT, not from Win98 (98 grew into Windows ME). It was significantly different underneath as well as on top.
Really, you can make positive comments about screens on Apple devices, or whatever you want to argue. But stay away from arguing that the upgrade from Leopard to Snow Leopard is anything other than a service pack that you get free elsewhere. You ain't going to win that one.
"Your description of Win98 to 2000 being a re-badging not worthy of the "new kernel" difference that you get from Leopard to Snow Leopard is even more extraordinary as an argument because 2000 derived from NT, not from Win98 (98 grew into Windows ME). It was significantly different underneath as well as different on the top."
I said no such thing. Please do not misconstrue my statements.
I said 98 was a rebadged 95, XP was a rebadged 2000, and 7 is a rebadged Vista.
I never said that 98 linked to 2000, or 2000 linked to vista.
It's one thing to disagree; it's another to lie about what I've written, especially when it's right there in front of you.
"its one thing to disagree, its another to lie about what I've written, especially when its right there in front of you."
Sorry - you reeled off a list of Windows OSes and it appeared to me that you were saying they were all just re-brandings; I didn't get that you were saying each pair in the list was a re-branding. But it still doesn't stand up. You say explicitly that XP was a "rebadged 2000", and yet you list "a new kernel" (by which you mean Apple's partial 64-bit upgrade in their last OS) as a more significant feature than such re-brandings. Are you actually unaware that XP was the first fully 64-bit-capable OS that MS produced? How is it that when Apple sell you a partial 64-bit implementation, that constitutes your far more significant "new kernel", but 2000 to XP is just "a re-badging"? XP was sold in both 32-bit and 64-bit versions. You could get either - in fact, you would typically be given two discs and install whichever you preferred. The entire area you're focusing on is an odd one as well, because the 64-bit changes were what I was referring to when I mentioned things that had existed elsewhere for ages.
Frankly, it's downright weird, your attempting to equate the Leopard-to-Snow Leopard upgrade with the Win2K-to-XP upgrade. The former boasts such "features" on its site as a new version of QuickTime and a new version of Safari, and (aside from the 64-bit stuff, which you've actually already been sold in Leopard - it's just that the job was incomplete) it's all at this level of difference, i.e. stuff that would just fall into an MS service pack or be handled as a new (free) software release. The Win2K-to-XP upgrade which you term a re-branding had (from Wikipedia's handy list):
* 64-bit capable OS (the item you use to show how the OSX upgrade is actually more than just a re-branding, only it's done in one new OS, not spread over two OS upgrades)
* Anti-aliased graphics & Clear Type
* A tonne of significant GUI changes (modern Start Menu came in then, massive re-design of Explorer)
* File metadata
* File system search overhaul (had a bunch of new parameters)
* Simultaneous Multithreading
* Major re-design of memory management (too complex to summarize here)
* New roaming user profile system
* Lots of stability improvements (not bug-fix style improvements, conceptual ones like device driver roll-back)
* Several new hardware support types (Firewire, USB 2.0, Bluetooth)
* Dual monitor support.
* Remote Desktop
* Remote Assistance (where support can log into someone's machine and take it over remotely)
* Simultaneous user log ins (the way you can switch between them without the first logging off)
* Data Execution Protection (if you don't know how significant that is, you're not a systems programmer)
* Built in Firewall.
* Wireless roaming (basically automatically pick up a WiFi network when it reappears)
* IPv6 support.
* Support for tablet and pen-touch devices.
* .NET framework
All these features either came with the OS at the time (almost all of them) or were added in service packs (e.g. Bluetooth support, which would no doubt be something you bought as part of an OS upgrade with Apple). And the above is *very* far from a complete list - there is about five times that list on the Wikipedia page I mentioned.
Now will you accept that you are wrong to say that 2000 to XP was just "a re-badging", and that the Leopard to Snow Leopard upgrade is basically equivalent to an MS service pack? Even the key thing that would be new-OS-worthy (64-bit support) is actually something that was introduced in Leopard. It's just that they left a bunch of 32-bit stuff in there that they've only just got round to re-writing, and you're now being charged for this feature twice - they've made the addition of 64-bit support a selling point in two of their OSes now. Whereas in XP, which you call just "a rebadging", you got 64-bit support complete; and if anything was re-written for this later that I'm not aware of, it came as a service pack, which is my point.
Apple charge for service packs. You can say you're okay with that like the other poster, but there's no way you can compare Apple's OS upgrades with more than a service pack from MS.
XP 64-bit was just one option - one that had to be requested specifically, not the default. The default option was 32-bit - the rebadged Win2K - and IIRC, as it was a change from 32 to 64 bit, it was not available as an upgrade, only as a new version.
Ultimately the features that you've highlighted seem to be more or less comparable to the Leopard to Snow Leopard change - almost all under the hood, or completely irrelevant.
"XP 64 bit was just one option - one that had to be requested specifically, not the default"
Now you're reaching further than Mr. Tickle would dare. So if XP were only available in 64-bit you would accept my point, but because it was also available in 32-bit (increasing your choice), you dismiss the point? And for your reference, when I bought XP, I was given two discs, one for each, though that's hardly relevant - you just asked for whichever you wanted. You made a big deal about the partial 64-bit support in Snow Leopard meaning it was a "new kernel", but somehow in XP, it's irrelevant because you have a choice not to use it? Do you have any idea how biased you sound?
"and IIRC as it was a change from 32 to 64 bit not available as an upgrade, only a new version."
Well that's rather the point, isn't it? It's a major new O/S. There's no relevance to the above.
"Ultimately the features that you've highlighted seem to be more or less comparable to the Leopard to snow leopard change - almost all under the hood, or completely irrelevant."
Rubbish. Anyone can go down the list I gave (which is approximately a fifth of the new features in XP - I just picked ones I found interesting or that wouldn't require me to type a long explanation) and judge for themselves. You are hopelessly, irrevocably biased to dismiss that massive list as "almost all under the hood or completely irrelevant". Try applying your standards to the feature list for Leopard to Snow Leopard and see if anything survives. Only two things: further 64-bit support (which, now that you find XP had it, you're trying to dismiss as a distinguishing feature of a new OS) and *cough* support for MS Exchange *cough*.
I've heard of Apple Fanpersons, but it's rare I actually meet someone who lives up to the stereotype like you do.
Yes and no. They are *kind of* charging for service packs, but there are usually more UI tweaks than you'd see in a Windows service pack.
Moreover, I kind of don't mind paying for service packs at sub-£20 per year for both my Macs against the £80 I just paid for a copy of Windows 7. Total cost of ownership is the same. What's really impressive is that Apple have found a way to get home users into the habit of paying for software: (duh) making it a cost that people don't mind paying. Just how long did it take Microsoft to realise that trying to charge ma and pa £400 for a copy of Word led to massive levels of piracy?
"Moreover, I kind of don't mind paying for service packs at sub £20 per year for both my macs against the £80 I just paid for a copy of Windows 7"
See, that's a fair argument. Acknowledges that it's pretty much equivalent to a service pack for Windows, but says they're happy with it as a different business model. I don't have the same preference but they're arguing from reasonable facts. Depending on the life-span of your computer, this may be more economical than buying everything up front for slightly more.
Really, one can't compare the two directly, because Apple normally sells the software bundled with their own hardware. How the profit breaks down internally, and how much one drives sales of the other, is something for Apple's financial people to comment on. MS don't sell the hardware; they sell the software. So it's a case of comparing a business that sells software and gives you regular free updates for a very long time afterwards with a business that sells hardware+software combos and charges you for periodic service packs. Which is best? Depends what you want. I prefer Windows as an OS to OSX, so that's my choice right there. If you're doing it purely on cost, it would be a more complicated analysis (but Apple aren't known for being cheap, so I think MS would come out of any comparison well). But in any case, I stand by my comment that Apple charge you for service packs disguised as a new OS. Whether that is better or worse than buying Windows up front (and whether you like Windows or not) is up to the individual.
Leopard to Snow Leopard was not a service pack, it was a do-over stripping out PowerPC code and re-writing the kernel and a host of other components. I seem to remember it cost about £25. SL to Lion cost about the same but was nothing like a service pack either, loads of additional features and (questionable) changes. You don't know what you're on about.
"What other company could sell a base model $2,199 laptop?"
I have to keep clarifying that I'm not a marketing bot, and I don't represent Sony in any way, but I'm always puzzled at how they seem to be entirely ignored in Apple articles, when they're clearly the PC manufacturer in most direct competition with Apple, selling high-end custom designs at equally high prices.
The Vaio Z has always had a high base price and has always had a solid market, which is why they keep making it. The last couple of generations had a base price of $1,999. The new one has a somewhat lower base price, admittedly, but many Z years were right around $2k.
I was holding out for the new MBP to see if it'd be a good successor to my last Z, but sadly it still weighs too much. I don't know why Apple seem to design the MBP range to a 4.5lb weight. It's just too damn heavy. I guess it's mostly battery, but I don't really _need_ a seven hour battery life all the time (or even very often). I much prefer something in the 3lb weight class with a 3-4 hour life and an optional extended battery...i.e. the Z.
I love the screen on the new MBP but it's just too big and too damn heavy. Now I need Apple to make a 13" MBA with a retina display, or Sony to make a Z with a retina screen. Sigh.
You can hack the Boot Camp MSI installer file to install the trackpad multitouch drivers. I've done it on Win 8. Use something like Orca, and dump all the Install Condition strings, as I recall.
It's still not very good though. The swipey-swipey helps a bit, but the fundamental jarring between desktop/Metro, and the fact you have to go to the desktop to do anything more complex than open a file, mean that it's still a pretty poor user experience.
On a proper tablet it's better, but as soon as something goes wrong (need to check your IP address etc) the touchscreen interface for Win 8 Desktop is awful - not accurate enough to click on systray items etc.
Anyway - yes, good for Apple. As noted elsewhere, hopefully this'll trickle down a little and improve display quality somewhat across the board, even if it is only resolution that increases, rather than colour accuracy.
@Chad H. High-DPI support is improved in Windows 8 over 7 - the desktop too, not only the WinRT/Metro API, which has design elements specifically aimed at higher-DPI scenarios. XP is somewhat less effective than 7 in this scenario, BTW. These aspects of Windows 8 aren't all that well known right now, but I expect we'll be reading much more about this topic in the not-too-distant future as HD MacBooks get into the hands of Windows developers (not holding my breath while Windows OEMs get their act together).
The 'Retina' (sic) display MacBook Pro is listed on the Apple Store from £1799. At £300 over the base MacBook Pro model (still with a quaint 1,440x900), and considering other facets like more memory and storage, that's not too bad a premium for a state-of-the-art laptop display. Shame about the unimaginative PC vendors - Dell, HP, Acer, Sony etc. - always leaving it to Apple to grab headlines, but hope this is the first gust of a wind of change.
The 'Retina' 15" has over 5M pixels compared with 1.3 M pixels on the basic MacBook Pro 15.
Pixel counting seems to have worked well for camera marketing; perhaps it's about time MP was headlined on notebooks, tablets and monitors, if many people (as it seems) fail to grasp that 768p is a 20th-century resolution.
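For what it's worth, the pixel counts being compared in this thread work out as follows (a quick sketch):

```python
# Megapixel counts for the display resolutions discussed above.
panels = {
    "Base MacBook Pro 15 (1440x900)": (1440, 900),
    "Typical laptop (1366x768)":      (1366, 768),
    "Full HD (1920x1080)":            (1920, 1080),
    "Retina MBP (2880x1800)":         (2880, 1800),
}

for name, (w, h) in panels.items():
    print(f"{name}: {w * h / 1e6:.2f} MP")
```

This confirms the figures quoted above: the Retina panel is just over 5 MP, the base 15" panel about 1.3 MP, and the ubiquitous 1366x768 barely 1 MP.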
Pixel count is for morons. It has been used to sell crappy cameras to morons in the past decade, making photo buffs look like alien to the rest of the world when not giving a shit about the gazillion pixels on the new cameraphone-of-the-week. Please don't let that happen with monitors too.
I trust that the display on this particular model is good, but if MegaPickles begin to be a sales argument the market will soon be flooded by horrendous displays with very poor refresh, contrast, brightness and colour rendering but 16.5 GigaPickles. History shows that using pickle count as a sales argument leads to worse, not better, products.
You're kind of missing the point. It's not about the number of pixels but pixel density, and the point at which individual pixels become difficult to discern from one another at a given viewing distance. Hence the reason why Apple most frequently use the term "Retina display".
The pixel count is really only interesting from a graphics and battery performance point of view, where it has a detrimental effect. And I'm not convinced I'd like to play Diablo 3 at the full 2880x1800 resolution if it means rendering 4 times as many pixels, as they will probably have to compromise on other rendering effects.
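The density argument comes down to simple geometry. A quick sketch, using the panel sizes discussed in this thread:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch, from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f"{ppi(2880, 1800, 15.4):.0f}")  # roughly Apple's quoted 220
print(f"{ppi(1920, 1080, 15.4):.0f}")  # a 15.4in 1080p panel: ~143
print(f"{ppi(960, 640, 3.5):.0f}")     # an iPhone 4-class screen: ~330
```

Note that the phone screen is far denser than either laptop figure, which is why the same "retina" label covering both viewing distances draws complaints elsewhere in the thread.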
> You're kind of missing the point. It's not about the number of pixels but pixel density
I'm not missing the point; you kind of are. It's always about pixel density. On a 15" display 1080p is already overkill unless you're looking at it from 5 cm (in which case you have other problems). "Retina display" is purely a buzzword with no grounding in reality. Proof: they use the same term for the iPhone and for a laptop-sized machine. Ever wondered what a "retina display" would look like on a theatre screen? Exactly the same, but you would need a 30-ton truck to carry the Blu-ray discs.
On the other hand, IF it becomes a selling point, the cheap-and-dirty manufacturers WILL build extremely crappy displays with gazillions of pixels per square cm, exactly as they built extremely crappy camera sensors with gazillions of pixels per square millimetre.
The Canon 5D Mk I is dated but still a decent camera. Guess how many megapickles the sensor has. You don't know? Look it up while the rest of us watch your chin hit the floor (note: I am not in any way a Canon fan).
In most cases the number or the density of pixels is rather unimportant. The *quality* of those pixels, on the other hand...
On a standard-sized laptop display and unless you have very specific uses like high-res movie editing, 1080p is more than you will ever need. And if you need more than that, chances are that you should be using a multi-monitor workstation with some _real_ grunt, storage and graphics anyway.
Some of us Reg readers have worked in the computer graphics industry, GPU development, vision science etc. I do wish people with limited technical knowledge would stop making misleading statements like "1080p on 15" is overkill" simply on the basis of personal opinion. There's a lot of nonsense written on the net to confuse non-specialists. The human eye is a complex sensor with a lot of variance from individual to individual. Sure, viewing distance, pixel density and colour gamut are all factors, and the moniker 'retina display' is marketing speak.
There is no question that the 16:9 aspect ratio is a poor match for the human field of view in a notebook's near viewing space, or that in portrait orientation the shape is perceived as too thin. Apple's choice of 16:10 yields better usability in most application contexts.
Text readability is a key feature for notebooks, and at typical viewing distances a 16:9 1080p LCD is less than ideal for text, even for many over-60s (visual acuity decreases with age, although there is a lot of variance among individuals). Anti-aliasing goes some way towards compensating, but it's better to use higher pixel densities as the tech becomes available. There's more wiggle room with photographs. Do the sums and 2000p is overkill for most purposes, so we aren't far from reaching spatial-resolution nirvana for notebooks.
This MacBook's choice of 1800p is more of a jump than is strictly ideal at this stage (larger display buffers need more GPU work to sustain frame rates in games etc., and mobile GPUs aren't quite there yet, so apps may choose to render at lower resolutions and upsample internally). The 1800 figure comes from quirks in the OS X and iOS app models, which mean doubling from 900p is the simplest way to run existing apps acceptably. That's less of an issue with Windows apps built in the last few years, so for a premium Windows display I'd personally compromise on something like 1400p at 15" as state of the art in 2012 (not that anyone ships notebooks with such panels yet).
Incidentally having worked with far east and US display manufacturers some years ago, it was interesting to find quite a few people in the OEM and display business don't get resolutions either.
In a few years time everyone will be using higher dpi screens and we'll look back at the 768p notebook era as we now look back at 1990s CRT displays.
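"Do the sums" above can be made concrete. A sketch of one such sum (the 1-arcminute acuity default and the ~50 cm notebook viewing distance are assumptions; as the comment notes, individual eyes vary a lot):

```python
import math

def required_ppi(distance_cm: float, acuity_arcmin: float = 1.0) -> float:
    """Pixel density at which one pixel subtends `acuity_arcmin` of visual
    angle at `distance_cm` -- roughly where pixels stop being resolvable."""
    distance_in = distance_cm / 2.54
    pixel_in = distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / pixel_in

# At a ~50 cm notebook viewing distance this comes out around 175 ppi,
# which the new panel's ~220 ppi clears with room to spare.
print(required_ppi(50.0))
```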
Oh come on, computer displays have been pants for decades. Laser printers have been cranking out 300, 600 and higher DPIs for decades. Why have computer displays been stuck at 75-100 DPI for so long?
CRTs held back resolution for quite some time, now that we're using LCD and OLED we can easily improve. It is just that nobody has been interested in doing so. Now that Apple have given the industry a kick up the arse maybe we will see things improve.
If you watched the WWDC presentation you would see that they showed a 1080p video fit perfectly in the edit panel in Final Cut without needing to be scaled down. Quite handy to have 1080p easily fit on the screen alongside all of the edit palettes.
But retina's useable space is only 1440x900 right - so for productivity it's still not as good as 1920x1200?
Since Apple doubled the resolution (1440x900 -> 2880x1800), they can double the DPI and all the UI elements will be the same size, but be twice as sharp and detailed.
I don't think so, but will OS X give the user the option of NOT doubling the DPI, so that UI elements are a quarter of their typical size but you can fit four times as much stuff on the screen?
I know little about OS X, but I'd imagine there's also a middle road of choosing some non-integer multiplier (i.e. not 1 or 2) that would yield a "best-of-both-worlds" effect: everything is a little sharper, you can fit a little more on the screen, and it's all only a little smaller. The wild card is how OS X deals with non-integer DPI multiples.
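The trade-off in the two comments above can be sketched numerically. A minimal illustration (the 1.5x factor is hypothetical; whether OS X actually exposes such a mode is exactly the open question):

```python
# Apparent desktop space on the 2880x1800 panel at different UI scale factors.
PANEL = (2880, 1800)

def effective_size(scale: float, panel: tuple = PANEL) -> tuple:
    """Apparent desktop resolution when every UI element is drawn at `scale`x."""
    return (round(panel[0] / scale), round(panel[1] / scale))

print(effective_size(2.0))  # (1440, 900): same layout as before, twice as sharp
print(effective_size(1.5))  # (1920, 1200): more room, everything a bit smaller
print(effective_size(1.0))  # (2880, 1800): four times the stuff, quarter-size UI
```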
I have a 15-inch MacBook Pro with 1440x900 and the resolution is OK, but I have to wonder about the 11-inch and 13-inch 1080p ASUS ultrabooks...
I would prefer two turds (2/3rds x 1440x900)
Visual measurements show that good human eyes have ~4K pixels horizontally. So as numbers go, the new Macbook is not there at 2880 pixels. But it's close.
And now they've (falsely IMO) claimed a retina display how will they pitch the next resolution upgrade? "The all new retina display plus"??? Oh well, marketing will figure that one out with even higher resolution BS.
Over your entire field of view (assuming a fixed eye).
But after writing my post I recognised that a viewer would never view the screen so close that it occupied the entire retina (though I *have* seen some folks do that). But for most users for whom the screen is only impacting part of the retina, the pixels are indeed small enough to be classed as retina resolution - or close enough as makes little difference.
@Jim. Only the central part of the human eye is high-definition; we make up for it with eye movements etc. The horizontal field of view is not far off 180 degrees, but definition decreases as you move out to peripheral angles. For displays we need to arrange to deliver detail to that central region wherever the eye is pointing, and the required ppi depends on distance. Those 4K-type figures have little value stated out of context.
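The "~4K pixels horizontally" figure can be reverse-engineered, with the caveats above. A sketch (the 1-arcminute acuity and the ~65-degree useful field are assumed round numbers, not measurements):

```python
def pixels_across_fov(fov_deg: float, acuity_arcmin: float = 1.0) -> int:
    """Pixels needed across a field of view so each pixel subtends
    `acuity_arcmin` of visual angle (flat small-angle approximation)."""
    return round(fov_deg * 60.0 / acuity_arcmin)

# ~65 degrees of useful field at 1-arcmin acuity needs ~3900 pixels across,
# which is roughly where "good eyes have ~4K pixels horizontally" comes from.
print(pixels_across_fov(65.0))
```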
After years looking for a proper laptop/screen combination available somewhere near me, I finally bought an HP dv6-6b19 with a proper full-HD no-gloss display with tolerable viewing angles (no IPS, though). I was considering a MacBook Pro, but I didn't like their displays, even the (almost) full-HD 1050p matte option.
And a week later they release this retina thing. Successful trolls are successful.
I recently bought a monitor that supports the same high resolution as the Apple Cinema Display (2560x1440), but after using it for a few minutes I realised that nothing is readable at that size! Load Windows and it's fine, but on the Mac, no deal!
I searched around various Apple forums and it seems that everyone is saying the same thing. There are some hacks you can use to try to make it a bit better, but what's the point in running at that high a resolution if you have to hack things together to get readable text?
So what more on these new 'retina' resolutions?
Even Paul Thurrott (news editor for Windows IT Pro magazine) is of the opinion that Mac OS scales better than Windows. See the section "Better Resolution Agnosticism" in this article:
eh? I've been using 30" and 27" Apple Cinema displays for years and they're great. Don't push the screen back to get the same angle of view as before; keep it at the same distance as your prior monitor and enjoy the larger desktop size, that's the whole point. The original 30" was, essentially, two monitors in one.
Hmm... In 1979 we were shipping a full PC-based 16-million-point colour display desktop 24-inch-diagonal graphics system with CAD, PC layout and mechanical design, and selling a 36-inch roll plotter along with it. The display/computer/storage/keyboard package sold for less than three times the cost of Apple's latest offering 33 years later. Of course, Jobs had not yet, ahem, licensed(?) Xerox PARC or even really learned what the hell a GUI was about in 1979.
Apple is way ahead of the pack.. you bet!
Just understand that the lone wolves are at least a decade ahead of Apple. Now the general population just has to learn that wolves need to feed on pieces of the fat boys' carcass, or the everyday user will never see real change and the U.S. can go back to farming.
What this means is that inventors need to be able to get paid by the big companies that innovate/steal their inventions or they will dry up and disappear.
All those who claim that invention stultifies innovation have never actually created anything that is new or been involved in the process in which a Jobs or his ilk conveniently innovate someone else's ideas to slake his/her ego and thirst for power.
Things do change and it may well be that Apple has now made a place for real creators to invent and a new era of a creative Apple contributing as many new ideas as it takes but don't hold your breath.
I hate Apple as much as the next Android/Ubuntu user, but price-wise this seems like a rather good deal. It's pretty difficult to find an ultrabook below the £1200 mark, and you can guarantee it won't come with these features (even if you ignore the retina display), so £1500-ish sounds reasonable.
Oh wait! 1GBP = 1USD in consumer electronics world so this will cost a fortune...
Not taking anything away from Apple's new kit, but cheaper ultrabooks _are_ out there. Google Shopping tells me that Toshiba Satellite Z830 can be had today for under £700 (as long as you're prepared to take your chances with an outfit called "John Lewis"). IIRC, the Reg reviewed the Z830, mentioning that it had an RJ-45 port which is why I kept it in mind.
Of course, if you want an i7 processor like the new MBP, you have to pay for it... that's not an Apple thing!
This post has been deleted by its author
"(as long as you're prepared to take your chances with an outfit called "John Lewis"). "
What's wrong with John Lewis? The John Lewis Partnership is great (they own Waitrose and their own department store chain). They're employee owned and company profits get divvied up amongst employees.
If one wants the 27-inch Apple LED Cinema Display with 2560x1440 resolution but cannot afford the $999 price tag, one just needs to go to eBay and buy the same panel from another manufacturer.
The equivalent of the above Apple display is the YAMAKASI CATLEAP Q270 SE 27", at $349 on eBay this morning.
Granted, one needs a Mini DisplayPort to Dual-Link DVI adapter at $99 from Apple, but at $448 all-in that's a steal.
Give it two-three months and the same thing will be possible for laptops.
What would be the point unless you run an operating system that is truly display resolution independent? Font rendering normally is but GUI controls and icons are not often designed to be, so actually being able to click on buttons accurately might prove extremely difficult.
I use a Dell 27" 2560x1440 monitor at home with 108.8 dpi and I find that UI controls are getting a bit small compared to my old Dell 24" 1920x1200 monitor which had a 94.3 dpi. This new 15.4" 2880x1800 display has a dpi of 220.5. You're going to need software that can scale to that pixel density and still be usable!
IIRC, NeXT computers used to use Display PostScript so that it was truly resolution independent.
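The dpi figures quoted above follow directly from the pixel counts and the diagonal size; a quick sketch that reproduces them:

```python
import math

def dpi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Dots per inch: diagonal pixel count divided by diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(dpi(2560, 1440, 27.0), 1))   # 108.8 -- Dell 27" monitor
print(round(dpi(1920, 1200, 24.0), 1))   # 94.3  -- Dell 24" monitor
print(round(dpi(2880, 1800, 15.4), 1))   # 220.5 -- new 15.4" MacBook Pro panel
```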
The Apple 27" panel is an LG unit and the same as used in HP's ZR 27 model. Apple's does, of course, have a few niceties included like USB audio, iSight cam/mic and Thunderbolt, but the HP has more inputs, side USBs and costs around half as much. I bought the HP for home and use the Apple at work, don't know which I prefer.
I've had a few heated debates in the past with my flatmate, who used to work for Apple, mostly over how they can release a top-spec, horrifically overpriced laptop and not include simple connection ports that are a modern-day standard, like HDMI. His usual reply is "Well, you just have to buy an adapter." I then point out that if I've just spent £2500 on a laptop, why should I then have to spend more to get functionality that is present on my £400 13" netbook? Sure, someone who spends £2500 on a laptop probably isn't going to blink at another £40, but for not including something as obvious as an HDMI port in past models, Apple infuriates me.
Where am I going with this? THEY NOW ADD A HDMI PORT AND REMOVE THE NETWORK PORT?!?
If you can afford a £2500 laptop, then a £25 adapter isn't going to kill you. Anyway, isn't wifi common as muck nowadays? Why worry about a dedicated Ethernet port? There'll be cheap Thunderbolt breakout boxes if you want some extras when you dock the laptop at home as a desktop replacement.
Sheeeesh, it's difficult to please people today. Apple offer an amazingly specced laptop that will last you a lifetime and still people want more.
I agree that in most cases wifi is available, but sometimes it just isn't the best option compared to an Ethernet connection. And while, as I mentioned, I'm sure forking over yet more money isn't a problem for most, it still irritates me that it has to be done for a top-spec £2500 machine.
It's a very nice machine, but overshadowed for me by sheer stupidity. What more do people want, you ask? Only a little bit more, I'd expect; one Ethernet port more, to be precise.
Finally I have an upgrade path from my old Dell D820 15" 1920x1200 (WUXGA) 16:10 display.
It seems all laptop manufacturers have recently switched to the cheaper mass-produced 16:9 TV 'HD' panels. For developers, artists and musicians this loss of vertical resolution is a huge blow. Those saying ">1080 is pointless" have clearly been won over by the 'HD' marketing droids.
I've held off up(down?)grading for this very reason but my 2Ghz Core Duo is struggling under load. Dual booting this macbook pro for desktop apps (Linux) and Cubase (OSX) seems like the perfect solution.
Yeah, it's pretty pricey, and you can try to compare it to an ultrabook. Thing is, ultrabooks still run that piece of shit Windows. Whatever version you talk about, Windows is shit.
Especially if you are trying to do anything creative, which is what Macs are aimed at, not the Facebook douchebags who buy one to brag about browsing Facebook from a shiny new HD Mac, despite spending 2k for an internet-browsing machine.
"Yeah, it's pretty pricey, and you can try to compare it to an ultrabook. Thing is, ultrabooks still run that piece of shit Windows. Whatever version you talk about, Windows is shit."
Well that's a convincing and well supported argument!
"Especially if you are trying to do anything creative, which is what Macs are aimed at"
Last time I checked, Windows ran Photoshop perfectly well and Wacom tablets worked fine. Or are you talking about sound? Because Cubase, Sibelius and I'm sure others too all run fine on Windows. Maybe you mean 3D graphics and animation? Well, I have VUE installed on Windows 7, and though I don't have a render farm on Windows, I know I could, and I bet it would be price-comparable to the same on Macs (better, probably). Same with Maya and basically a long list of creative software. Not to neglect a certain word processor for the writers amongst us.
So really, I have no idea what you're talking about, unless you mean this false image that some try to present of Macs being the tool of creative people and Windows being the domain of "I'm a PC, I do spreadsheets" trite little advertising stereotypes.
Seriously - Macs, Windows, both are fine for creative work. I don't know what you're talking about.
Or is there some magic creativity-enhancing property of Macs?
Troll alert. You really never passed lemming Appleville in college. Photoshop? Runs faster on a generic PC with Win7 x64 and GPU acceleration. Premiere? Same thing. Vegas vs Final Cut Pro? Again. And FCP X sucks so badly that there's a backlash and most people can get refunds. Wow. Refunds for an Apple product.
Corel Painter? PC. Wacom interfacing? Easily PC (quad-monitor support is a breeze and under $400, with 2-4GB GPU support).
Audio bobs? PC. USB3? PC. Vast amounts of cheap generic 7200rpm drives, even RAID 5? PC. eSATA? PC.
So troll on about how "art" is "easier" on a Mac. #derpathon
Ever since HD became the new consumer buzzword, screen development has gone back two steps. I have been using 1920x1200 for years, in fact from the moment this resolution became available.
Now every vendor+dog pushes 1920x1080 because of HD video. Well I'm sorry but HD is just not HD enough.
I personally am happy that crapple are pushing a higher resolution screen, maybe the rest of these twats will get on and start shipping 2k and 4k screens....
I can't think of anything better than having 3840x2160 or higher on a nice 27" or 30" monitor. It would make my life much easier in Photoshop; gaming would actually become "extreme" (my card has 3GB of vRAM, so I can at least run 2560x1600 in games); films can be scaled up with no problem, since any half-baked £100+ PCIe card with 1GB of VRAM can upscale a simple video stream without even breaking a sweat; and I could display three or four things on-screen that I'm working on.
Bring it on I say......
As for laptops... they suffered the same shit. For some reason this abundance of shite 1366x768 screens seems to be everywhere... someone somewhere obviously has a few warehouses full of these cruddy things and is somehow STILL persuading vendors to use them...
12" screens should have 1440x900 minimum.
14": 1600 x X (X being whatever suits the ratio, 4:3 or 16:9).
15": 1920x1080 (BO(RING)G-STANDARD HD).
17": at least HD (had a Dell with a nice 1080p about six years ago... niiiiiicccceeeeee!)
As for HP, Dell, Lenovo (and IBM when it was IBM)... well, having enterprise experience with all three vendors, I can tell you that their business models are exceptionally well made and very reliable, if a bit dull... you get what you pay for. Buy a shite consumer all-plastic (I'm talking about the chassis) POS with a crappy glossy screen... well, more fool you for not investing in something a bit better...
Another act of stupidity I see all too often: vendor wants X for the SSD and RAM on a laptop, where X is about 50% more than the parts cost. Twat buys the laptop and complains it is expensive. Why not buy the machine with the smallest, cheapest RAM and HDD setup you can get, then install your own? It's not farking rocket science... you can clone a drive in minutes, and installing RAM is as easy as putting toast in a toaster, ffs.
Impressive specification in the new MacBook Pro.
Apart from the performance, the build quality of Apple notebooks is very good: clean, light and strong. I will remain content with my 2010 MacBook Pro 17" for another two years at least. I'm no fanboi; I also use a Windows 7 / Ubuntu dual-boot desktop and a Windows XP netbook.
By then, if I were considering a replacement, hexa-core or octo-core notebooks with multi-core graphics chips, 100Gbit optical Thunderbolt, and even faster and bigger flash might be the new state of the art, making those who buy today's latest release as envious then as I would now be of them.
One can't win at the game of having the latest, as technology marches on. The key is to accept this and be content with what one has for the useful life of the item and realise that most tasks can still be done on that, even if it involves a little more waiting.