Shiny, shiny
A manager at the last company I worked at insisted on an ultra-thin, ultra-expensive, top-of-the-line MacBook Pro. He carried/flaunted it around for a month before he realised he wasn't connected to the corporate VPN.
Amid the hoo-ha surrounding Apple's WWDC announcements were some nuggets aimed at encouraging enterprises to get snuggly with the fruity firm's devices. Sure, every other hipster likes to carry a MacBook around (the other half can be found complaining about their Surfaces), but Apple has yet to make much of an impact in the …
"Although I hate everything Apple"
If you hate everything Apple, you must give away any device you have and not use technology at all. You are the beneficiary of many things that Apple has done that shape today's computing. You will continue to benefit from what was announced at WWDC in the future.
About the only thing I can think of that Apple actually invented from scratch that I use is CUPS on *NIX systems. Which works well and was light years beyond its predecessor. Many other things they really didn't innovate, but instead built on others' inventions or outright ripped off. (not that Microsoft or Android can't be said to have done the same) I am truly glad that Apple has survived and thrived and offers something different, even if it's usually not my thing.
I suppose you could argue that Apple made the tablet and the current format of the smartphone ubiquitous, but they were by no means the first to invent them. And they brought the personal MP3 player to the masses, but again, weren't the first. But to be fair, I'm not a fanboi and I may be speaking with some ignorance, despite having been a technophile for decades.
Homage should be paid to Apple, but perhaps more so to the likes of Xerox and Commodore, as well as Nokia, Palm, and Blackberry. (and Microsoft) And let's not forget IBM. (although on some level I think the world would be a better place if we were all using Amiga-format instead of PCs) And Microsoft's Kinect is also a real innovation. You could argue that MS's Active Directory is innovative, but you'd be forgetting Novell Netware in doing so. And the ARM processor has some spiritual heritage to MOStek as well as Digital's PDP series. And ARPANET deserves more than a little mention. I could think of probably 50 more without trying too hard.
All of these shaped the face of modern computing, for better and worse. Unfortunately it might be said that aggressive and cutthroat marketing tactics and competition shaped what we're doing now as much as true innovation ever did, and the same could probably be said of life on this planet as we know it.
I am an Apple user but by no means a fanboi. I have plenty of gripes about Apple's OSes; I won't list them all here as it would be a bore, but a few crucial points are worth making.
Starting around OS X, I have sometimes had to do 4 or so updates a *WEEK*. None of the updates were especially troublesome, I should add, but they were point releases, and having that many in a week was a PITA. The last point upgrade (Mojave) caused me significant issues, and I had to re-install the OS twice if memory serves. Frankly, I was so scared of the upgrade to Mojave that I decided to take my system to the Apple Store and let them do it. There were several screwups, and I lost data as a result. I am still suffering six months later.
Apple used to be miles and miles ahead of MS, but they are falling behind, as most (if not all) of the items they add are mostly worthless to me. Apple support seems uneven: as they switch you around to different people, you get increasingly different answers to the problem. Once in a great while you find a reasonably knowledgeable Apple support person; otherwise you spin the wheel on the quality of support.
Have you heard of it?
I have had my ass pinched by Apple glitches more times than I care to admit, but Time Machine backups will almost always rescue things after the (crash) fact.
I agree with most of your post. Apple has a long history of asking the corporate world to commit to tech that they abandon on a whim.
Company dead since SJ died and they turned it over to that Compaquer.
I bailed on iOS development back in 2013 when I realized they were going to make their developers do an eternal monkey dance to no good end. Web apps are the way forward.
Unicornpiss "About the only thing I can think of that Apple actually invented from scratch that I use is CUPS on *NIX systems..."
Very little has been invented by companies from scratch.
"Many other things they really didn't innovate, but instead built on others' inventions or outright ripped off"
Wrong, actually. You are thinking of Xerox PARC. Yes, they did interesting things, but at the same time as (or later than) Jef Raskin (father of the Macintosh), who was also working in that space. Raskin primed Jobs to go and see what Xerox PARC were doing to get the point of what Raskin was also trying to do. Raskin did his Ph.D. on Quickdraw in the 1960s. Xerox had told PARC they weren't interested, so PARC invited in IBM, Tektronix, and Apple to make a product out of what they had. Jobs was the only one of the four (including Xerox) that got it.
Apple innovated and took the risks. The success of the Mac was hardly assured, and it was difficult. Like Xerox, IBM, and Tektronix, most people (particularly computing people) would not get it.
"And let's not forget IBM" what the PC – nothing that two kids in a garage couldn't invent and really IBM's strategy to put Apple out of business. The less said about that the better.
"All of these shaped the face of modern computing, for better and worse."
Yes, it is a wide mix. Apple's innovation was to take disparate technologies and put them together into something people would want to use. That should not be dismissed as you have done.
"Unfortunately it might be said that aggressive and cutthroat marketing tactics" By whom? Are you suggesting Apple? No, Apple more than others has got products right – they spend years in development, the others months in ersatz copying. That is their marketing. Others aggressively market, particularly you're IBM. See Richard DeLamater's "Big Blue: IBM's Use and Abuse of Power".
<sigh> I am not dismissing Apple and their influence. But I dispute them being such a great innovator. Apple may have marketed the mouse-driven GUI, but Xerox invented it, along with Ethernet and many other things. You may note from my tirade that I don't really love IBM-format, but you can't deny that we're rather stuck with it and it has become the major standard.
The point I was trying to make with "aggressive marketing" and competition was that most companies have indulged in this, including Apple, and for better or worse, that has shaped what we're using now as much as invention. There are few companies more sue-happy than Apple. Everyone should defend their IP, but a lot of the real innovators get crushed in these types of lawsuits, which are often just so companies can save on having to pay well-deserved licensing fees or stave off legitimate competition. (and I'm not singling out Apple here) Glad SCO got their asses handed to them in that regard.
Apple was invited in to Xerox despite the vehement objections of the PARC researchers. Alan Kay had the idea of a portable "notetaker" with a touch-screen keyboard, bit-mapped graphics etc. – just about like a tablet 40 years later.
There was so much good stuff at PARC that Xerox ignored.
"There was so much good stuff at PARC that Xerox ignored."
I believe that Steve Jobs's main talent was recognizing really useful new tech and the potential products that would appeal the most. Cook doesn't seem to be all that good at it, and Apple has lagged where it used to lead, relying on just one product line (iPhone) to carry it along.
@Unicornpiss
"...they really didn't innovate, but instead built on others' inventions..."
That is pretty much the definition of innovation: improve an existing product, or combine existing products in new ways. I didn't expect to find myself ever in the position of defending Apple, but even I have to admit that they had been a very innovative company. They didn't invent graphical user interfaces, touch screens or, contrary to popular belief, even squares with rounded corners, MP3 players or the lowercase letter i. Apple, though, took those things and a few others and innovated them into a new and in many ways better experience than what the individual inventions had been before.
This is innovation. Or, in the case of Apple, was.
This comment is utter nonsense. Nobody is obligated to give away anything or stop using anything simply because they are unhappy with one specific vendor. The notion is ludicrous. Furthermore, the idea that Apple's impact on technology is so widespread that it's impossible to find any device or software they were not involved in is equally ludicrous. Frankly, this kind of over-the-top nonsensical response feeds the stereotype of the Apple fanboi. Apple's impact on computing is indisputable, but the Apple of today is completely divorced from the early tech innovator.
You can say "It needs this and that" all you want, but if they pay the wage and are high up enough you have to warn them of the risk in writing (to cover yourself) and then let them have it anyway. Then hand in your notice. Because if they aren't going to listen then no point being there.
I thought the handsets basically become useless if the currently assigned Apple ID user doesn't wipe the device, which requires signing into their Apple ID (no good if a member of staff suddenly leaves and you get the handset sent back in a hopefully padded envelope)?
Although I admit, I stay on the other side of the room when anyone comes near our department with a corporate-assigned iDevice.
The article is talking about new features that let employees use their own personal iPhones:
"The key thing here is that the Managed Apple ID co-exists with the user's own personal Apple ID – the two don't interact, and the user can get to personal and work data without worrying that their own data might get wiped.
Under the hood, an entirely separate APFS volume is created for managed accounts, apps and data on the iThing, cryptographically separated from the user's own business."
In addition, a corporate-provided device can be managed by the IT department, which has a higher level of control over the device than the individual user. Because the user can't easily subvert these things, some IT departments prefer iOS to Android for corporate-supplied mobile devices. This becomes very different when BYOD is considered, but I shudder at the various implications of that, so I'm going to avoid considering it.
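For anyone curious what "cryptographically separated" can mean in practice, here's a minimal sketch using Apple's CryptoKit: two independent keys, one per persona, so anything sealed under the work key is unreadable with the personal one. The key names and sample string are made up, and this only illustrates the principle, not Apple's actual APFS implementation.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only: two independent 256-bit keys, one per "persona".
// Data sealed under the work key cannot be opened with the personal key.
func seal(_ message: String, with key: SymmetricKey) throws -> Data {
    // AES-GCM packs nonce + ciphertext + auth tag into one blob
    return try AES.GCM.seal(Data(message.utf8), using: key).combined!
}

func open(_ blob: Data, with key: SymmetricKey) throws -> String {
    let box = try AES.GCM.SealedBox(combined: blob)
    return String(decoding: try AES.GCM.open(box, using: key), as: UTF8.self)
}

do {
    let workKey = SymmetricKey(size: .bits256)      // managed / corporate key
    let personalKey = SymmetricKey(size: .bits256)  // user's own key

    let workBlob = try seal("Q3 sales forecast", with: workKey)
    print(try open(workBlob, with: workKey))        // succeeds with the right key

    _ = try open(workBlob, with: personalKey)       // fails authentication
} catch {
    print("Cross-persona read rejected: \(error)")
}
```

The useful property is that revoking or wiping just the work key renders only the work data unreadable, which is essentially what the per-volume separation quoted above is buying you.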
"with some help from best buddy Microsoft, Apple has put forward its strongest case yet that its machines are no longer so special that administrators need recoil in horror at managing them."
70% of our service tickets are for the 10% of the Macs marketing and sales insist on having. Their wireless tech is garbage, support is, at best, hit or miss, and spare parts cost a packet – if they even deign to sell you spares without your delivering a broken part first, then forcing us to wait three weeks until they themselves can bring one in. The costs to buy and maintain are eye-watering.
$6K for a new iMac you can build (identically spec'd) for $1,200 yourself? Keep it. They are not "special" in the slightest. They are the Jordache jeans of laptops; you're just paying a premium for the logo.
"$6K for a new iMac you can build (identically spec'd) for $1,200 yourself?"
I thought this had been disproven? Most tech sources I read tend to believe that building like-for-like systems tends to be fairly similar in terms of outlay and, in many cases, more expensive.
https://gizmodo.com/is-the-new-mac-pro-worth-the-apple-tax-1835216170
"Consider [the $1200 CPU, four Thunderbolt 3 ports, 8 PCIe slots, 12 ECC RAM slots, quiet cooling etc], and what it would cost to build up a similar machine, and the Mac Pro starts to seem not just logical, but practically affordable. A device similar to the base model $6,000 version of the Mac Pro would cost over $9,000 from HP and close to $6,300 from Dell. And that doesn’t include the extra Thunderbolt 3 ports, or capability to support 1.5TB of RAM."
Still - it'll be a long time until I have six grand burning a hole in my pocket...
I think Apple do hit it right for their one specific SKU/use case. For that, they are reasonable bang per buck (and service on top).
The difference is that, in the past at least, other companies had cheaper SKUs or better-specced ones (though more expensive). So if your use case varied, or the hardware was more complex (a Windows XP system running a CNC machine :P), then yes, Apple starts to look expensive.
A bit like RED cameras. For those who need them, the cost is cheap. For those who don't, they needn't even ask what it costs.
What you are describing is an engineering workstation, not a personal computer. Or what single-U servers might look like.
Thunderbolt ports are super if you are using a Mac. Useless for anything else, so I don't see that as a compelling sales point. If you are moving large amounts of data over one of these ports you are doing it wrong. Otherwise, USB 3 is fine.
1.5 TB of RAM. Yeah, sure. I have hypervisors that are running 512 GB of RAM and they have a ton of headroom. They could take the 1.5 (it's the processor architecture, not Apple), but it is cheaper just to buy another server and have the extra CPU headroom.
Just more idiot tax.
"1.5 TB of RAM. Yeah, sure. I have hypervisors that are running 512 GB of RAM and they have a ton of headroom."
If you are rendering 8k video for a feature film, things get different. Apple products get used quite frequently in that application and for visualizing large data sets at universities and even NASA. The application and available software drive the hardware. In this case, the availability of off the shelf, warrantied hardware may spawn some new applications. BYO is certainly a way to go, but it's not supported by one vendor as a system. That can be a problem for some.
"$6K for a new iMac you can build (identically spec'd) for $1,200 yourself?"
I thought this had been disproven? Most tech sources I read tend to believe that building like-for-like systems tends to be fairly similar in terms of outlay and, in many cases, more expensive.
I would agree with you. The 6K vs 1.2K is a bit of an exaggeration.
To your point... if you match the specs on the motherboard, memory, CPU, and PSU wattage, then depending on vendor/quality you can build a machine that would be cheaper than the Apple product. The difference is styling and branding.
In general, if you are going to build a custom rig that matches the Macs, you will end up spending more because you will tweak the system with different and potentially better components: faster memory, more storage, adding NVMe, a more efficient power supply, better cooling like an AiO liquid cooler for the CPU. (Or if you want to go overboard, a custom liquid-cooling loop for CPU, memory, and graphics card.)
So you will end up spending more money.
Note that you will differ on the motherboard. You can buy Xeon boards that support 1.5TB of RAM – not sure about Thunderbolt 3 ports – but it will not be an Apple board.
I wouldn't go to DELL or HPE for this but would probably build it special using SuperMicro or a different supplier. And yes, you'll end up spending more than what you would for a Mac.
I got my open-box iMac Pro on ebay for $3500 when it was still selling for $5400, could not be happier.
One scratch on the (not $999) monitor stand, otherwise flawless. Maybe it was fenced?
It's the best development hardware I've ever owned. Only a fool buys new Apple hardware at retail.
MacOS - meh... but I can work around the limitations.
"MacOS - meh... but I can work around the limitations."
The Cheese Grater I'm on now is also running Win7 and Ubuntu at the same time as I'm using MacOS to play the part of a commentard. I got used to having all three OS's when I was working in aerospace. I prefer a Mac for day to day stuff but I needed Windows for CAD work and Linux for the flight software. I really couldn't fit 3 boxes in my cubicle with any room left and I'd be able to degauss a submarine with the amount of wiring that it would have needed.
The price tag is eye-watering but not necessarily too much. It comes down to how much time you can lop off of jobs you are doing. If you are just fiddling about on the internet and mis-typing stuff into spreadsheets, it's overkill. If you are editing video or rendering 3D graphics or creating ads in Adobe products, it could mean weeks of saved time over the course of a year. That could be very significant. It could also mean getting projects out of the door and billed on time when you stupidly let the client dictate a schedule you knew was too tight. The Mac Pro will be a top-end machine even in its most basic configuration. I would like to see them do another model with the specs of an iMac in the same package as the Mac Pro. I don't like having the monitor and computer gubbins all in the same box that isn't all that upgradable.
"70% of our service tickets are for the 10% of the Macs marketing and sales insist on having"
To make claims like this, you need to say where you work, what your mix is.
My experience is the opposite. I'm much more often having to fix Windows computers for those in trouble. Windows seems made for support – that is users need to call in support people to fix their problems, whereas Macs just work.
"you're just paying a premium for the logo"
Rubbish.
Apple devices aren't compatible with mega-corporation standards. These standards were written by lawyers and old IT contractors to block every theoretical mishap. It's a huge pile of monitoring software, periodic scans, remote control, and draconian user privileges. Microsoft takes some care to stay compatible with those standards, but Apple will break them on every update. There are ways to run Windows and Linux without administrative privileges, but it's harder in macOS.
Mega-corporations probably aren't Apple's target but it's hard to serve only small businesses that aren't terrified of risks.
And yes, the WiFi sucks on all Apple devices. They seem to be eternally stuck in low power mode because the latency jitters as packets go in and out of alignment with the DTIM interval.
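That power-save jitter is easy enough to see for yourself. Here's a rough sketch (the gateway address is a placeholder and the parsing assumes macOS/BSD ping output) that runs ping and reports mean latency and its standard deviation; a Wi-Fi link stuck in power-save mode will typically show a much bigger spread than a wired one.

```swift
import Foundation

// Rough sketch, not a diagnostic tool: ping a placeholder gateway address
// and report mean latency and standard deviation (jitter).
// Assumes macOS/BSD ping output lines containing "time=12.345 ms".
let host = "192.168.1.1"   // placeholder: substitute your own gateway

let ping = Process()
ping.executableURL = URL(fileURLWithPath: "/sbin/ping")
ping.arguments = ["-c", "20", host]
let pipe = Pipe()
ping.standardOutput = pipe

do {
    try ping.run()
    ping.waitUntilExit()

    let output = String(decoding: pipe.fileHandleForReading.readDataToEndOfFile(),
                        as: UTF8.self)

    // Extract the "time=" value from each echo-reply line.
    let times: [Double] = output.split(separator: "\n").compactMap { line in
        guard let range = line.range(of: "time=") else { return nil }
        return Double(line[range.upperBound...].split(separator: " ").first ?? "")
    }

    guard times.count > 1 else { print("no replies parsed"); exit(1) }

    let mean = times.reduce(0, +) / Double(times.count)
    let variance = times.map { ($0 - mean) * ($0 - mean) }.reduce(0, +) / Double(times.count)
    print(String(format: "mean %.1f ms, jitter (std dev) %.1f ms", mean, variance.squareRoot()))
} catch {
    print("ping failed: \(error)")
}
```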
I have coped well with searching the internet for problems I’ve had on the Mac without the need to call technical support. The Windows 10 laptop I’ve been using results in a mixed bag when looking for internet solutions. That being said I’ve never used Windows without being accompanied by a barrage of issues. Some solutions solvable but a few not so. Some ‘solutions’ require delving deep into the PC and deleting potentially vital files that might screw up the system - and those are simply guesses at fixing the problem. I’ve never had to re-install macOS but have re-installed Windows countless times when something has screwed up (and not always through me being a clueless idiot).
"70% of our service tickets are for the 10% of the Macs marketing and sales insist on having. Their wireless tech is garbage, support is at best, hit or miss and spare parts cost a packet; if they even deign to sell you spares without you're delivering a broken part first, then forcing us to wait three weeks until they themselves can bring one in. The costs to buy and maintain are eye-watering."
Well, it sounds like it's being managed appallingly. But what country you are in can make a LOT of difference to the experience. And whether IT is supporting user decisions, or trying to passively aggressively force them to change their mind.
I've been the Mac guy at various mixed-platform shops, and in general they are much less work. If they are being garbage, then they are for the same reasons as PCs are – lack of IT management, lack of consistency etc. They also often seem to arise from shadow IT as much as any sort of plan, so it's less a case of "Macs are hard to support" and more "IT won't support the Macs". Once you get everyone on the same page – users pissed they can't have free rein, but at least stuff works; IT pissed they've got to fix "gay" computers, but at least the staff have to follow some bloody rules – things tend to settle down.
Making WiFi work is an engineering issue. You test the kit, you work out the issues and you fix it. And yes, I fucking hate the reliance on it, but unless your department of pretty suits is next door to a smelter I'd expect it to be solvable.
If you've got more than 50 (IIRC) Macs on site, you should be able to be your own authorised repair shop, for your own devices (ie you can't fix other people's). Got to get the usual Apple certs, but if you're pulling parts then you need that for the "where does the sticky tape go?" pictures :) That skips a bunch of the garbage around parts, replacements etc. But needs buy in from IT.
While I'd like to pick and choose what systems and OS I support, as a computer mechanic I accept I'll be fixing whatever comes my way.
Last time they could not fit any other small GPUs in that tin... this time they've left a LOT of extra space. Might be the other way around. They'd not find one big enough to swap out. XD
(Custom designs always hit the problem of not being in high enough demand to justify putting out lots of development and versions. I mean, how many GPU versions were made to fit in the trashcan anyway? IIRC just the one!)
"how many GPU versions were made to fit in the trashcan anyway? IIRC just the 1!)"
That's why I skipped the trash can. I've got the last version of the Cheese Grater and moving would have meant adding external boxes for drives and not being able to fiddle with graphics cards and other hardware based add-ins.
This idea of having two IDs on a device, with strong separation between the data they control, is right out of the BlackBerry / RIM How To Do BYOD 101. They did it yonks ago with BlackBerry Balance. Even the means, i.e. a separately encrypted file system, is exactly the same.
I hope Apple have checked they're not stamping over some patents. BlackBerry have a pretty successful record thus far of suing the rip-off merchants.
For those that don't know, BlackBerry Balance was an excellent solution to BYOD. Things like the calendar worked really well – you could see into both work and personal calendars, but nothing could move between them. Made booking a meeting as easy as it ever could be.
Trouble was, or so it seems, that BlackBerry Balance was way too sophisticated for the dullards brought up on iPhone and Android to understand, so no one got it. Now it seems Apple think they've invented something new. They haven't.
There are a lot of ways to implement this. You can only patent a particular method or methods, you can't patent the concept of "separating work and personal usage" or "creating a separate encrypted volume".
Well you probably can, since the patent office will let you patent almost anything, but it won't hold up in court if you patent something so general. I mean, they could try to patent "encrypted messaging on mobile" since they were the first to do it, but I can't imagine that would hold up in court if they'd tried to sue Apple over iMessage.
It's also possible Apple and BlackBerry already have some sort of patent agreement, if Apple had been sued in the past.
There are a lot of ways to implement this. You can only patent a particular method or methods, you can't patent the concept of "separating work and personal usage" or "creating a separate encrypted volume".
Sure, but it looks like Apple have duplicated BlackBerry's implementation, using two separately encrypted file systems.
Well you probably can, since the patent office will let you patent almost anything, but it won't hold up in court if you patent something so general.
Here in Europe you can patent software that's related to a business process. Now, the idea of separated data is old – multi-level security systems like these (because that's what they are) are a well-established if rarely implemented concept. However, the processes for managing an MLS in a mobile context are something that BlackBerry did all by themselves a long time ago, and they've likely got that all sewn up in patents. And that's possibly the case outside of the USA too, as it's a business process. I've not done a patent trawl, but BlackBerry do seem to have been careful to patent their IPR.
Apple has been producing gear used by professional graphics and video people for a long time. So has everyone else. I think they beat Microsoft to the punch on things like color profiles for monitors, but there just isn't a lot of difference between one Intel desktop computer with a 6K monitor and any other.
Moreover, people who do that kind of work for a living day in and day out (I know several) don't bother much with monitors which swivel from portrait to landscape, but instead just have multiple monitors: one for HQ video or imagery, another to control the software, a third for bits and pieces being edited in.
Not real sure what's going on here, but it seems the usual sort of Jobsian smoke and lasers, the modern version of smoke and mirrors.
But I gotta say, while I wouldn't buy one from Apple I've seen 6K monitors and they're quite nice....
It’s standard to work on a lower res monitor when editing video as it’s common practice to edit with low-res files and then output to the final hi-res format (and hand over to the colour grader) when you’re done. There are a number of widescreen monitors that would also be very useful for this.