TLDR
Much as the fruity one never responds to ElReg requests for comments, I never read Apple press releases disguised as conferences.
Apple opened its 33rd annual Worldwide Developer Conference on Monday with a preview of upcoming hardware and planned changes in its mobile, desktop, and wrist accessory operating systems. The confab consists primarily of streamed video, as it did in 2020 and 2021, though there is a limited in-person component for the favored …
-> Apple's Photos app, which can be found in all of Apple's operating systems, will soon be able to share photos with family via an iCloud Shared Library.
Apple's Photos is a sort of joke app. I'm not interested in sharing photos via iCloud. I want better photo management full stop. I don't know why Apple even bothers with it any more.
The M2 Macbook Air is not available immediately, because there's only one supplier of the notches, and they are affected by the lockdowns in China. Supply chain experts are concerned that, as other manufacturers adopt notches, world production capacity of notches is insufficient. Sources claim Apple failed to stockpile notches after the looming notch crisis became apparent.
And slightly more expensive.
While Apple will probably refresh the new 14" with the M2 shortly, I was depressed (but not surprised) by the lack of ports on the new 13".
Apple seems to have noticed too, as the images used were careful to avoid showing the sides during the event. That was instantly suspicious, and the specs confirm only two Thunderbolt ports on one side and a headphone jack. No SD card support and no HDMI.
So it's a shiny toy for those that spend all day plugged into a dock or who like fiddling with dongles. Maybe not a good fit for road warriors doing presentations, or anyone who winds up standing on a ladder with three cables plugged into their laptop. I'd rather not have to carry around an AppleTV, a port replicator, a card reader, and 2 power cords just to run a slide deck for a meeting.
Though HDMI is slowly sailing off into the sunset, I am happy to see it on the real MBPs; a current DisplayPort would also be fine.
When I saw them continuing the 13" MBP, I had a chuckle. It's now the only device left with a touch bar, which is going to become less and less relevant (I have to say, I did have a soft spot for it, but have not missed it since switching to an M1 Pro 14, which I guess shows what a novelty it really was).
It's a little surprising to me that the MBP 13 even has a place, but the answer I suppose was in the keynote - it is the world's second most popular laptop. Keep that cash cow running until the touch bar becomes simply a glowing bar of function keys...
Can't we turn back the clock to the 2013, 13", 8GB, i5, 512GB MacBook Air, wot offered a user-upgradable/replaceable SSD and battery, and came with both a DisplayPort and an SD card reader, not to mention the M2's big selling point, a MagSafe connector. Oh, and there's a native version of R-Studio for the geriatric beast, which is Bootcamp'able all the way into full-fat Microsoft land.
Do you use it to read floppy discs as well or load data off of cassette? Not to denigrate any storage medium particularly, but a MacBook Air is just as capable of reading those as it is of reading optical media.
Which is to say that it can’t. At least, no more than any modern Mac can. Not without an external device. /pedant mode
Me to salesman: "I'm interested in a new phone. But I don't see any prices"
Genius salesman: "That's because you need to specify which subscription you'd like to buy the phone with"
Me "Oh, I'm reasonably happy with the carrier I'm currently with"
Genius salesman "Isn't it time you stepped up to Apple's level service?"
Me "Not for a monthly fee, no.
(I walked out and bought a different phone brand that actually had prices next to each phone.)
Really? Say what you like about Apple, but arguably they were one of the driving forces behind opening up the phone market to being able to buy contractless phones. Prior to that you were buying mobile operator branded Nokias or Ericssons. Sure you could buy them off the shelf, but it was very rare that anyone did, and I can't recall seeing any for sale on the high street as a standalone device 15 years ago.
Can't have looked hard then... What *was* the norm was that non-subsidised phones cost a bloody arm and a leg (so they were parked in the 'expensive phones' row). What Apple did was make paying an arm and a leg (and a kidney) fashionable and acceptable, so much so that people (as any fule no) camped out for absolute days and forewent their rent and lived off crackers and water to be able to be the first to walk out of the cube of consumerism with the first new edition.
You can still get your iPhone 13 Pro Super X Max Mega Edition for £49/month (and a tenner up front) in a similar fashion as you got your super-cool Nokia 8110 Matrix banana slider phone back in the nineties (for similar prices)... That's if you're not willing/able to shell out £1300 for the top spec all-singing-all-dancing-cooking-your-tea-and-best-thing-since-sliced-bread edition of the phone of your choice. Of course, mobile phone providers are now getting (back) into the habit of the whole 'lease your phone and upgrade anytime you like' game they were in in the nineties and noughties.
Yet again I read reports on some supposedly new and improved personal computing stuff and think "Really? Is that it?"
Yet again I read that something which five years ago worked fine is being "improved" to resolve issues that were introduced the last time that it was "improved."
Yet again, basic common-sense usability practice is trumped by some guy's novelty idea; the next of which will surely be log-ins via nose-print.
Yet again I'm astounded how we're expected to allow everything we do on-line, on our computers, on our phones, to be interwoven and interlinked into some Gordian Knot of personal data that likely can never be cut.
And yet again I'm left thinking "Computer use used to be so much easier, so much simpler, and so much more efficient. When did it all turn into a battle to try and force your way past the technological barriers in order to just do a job?"
Really, general computer and software design peaked about Windows 95 and WordPerfect, or maybe Word. And at whatever the equivalent was for the Apple systems. Nothing has become easier since then, and for 95% of people no really useful new features have emerged. Since then it's been lots of novelty, lots of extra steps to do simple tasks, and lots of endless promises that Real Soon Now we'll do it all while wearing VR goggles.
"Snow Leopard 10.6.8."
This, a thousand times!
I remember being really impressed when Leopard came out. Genuinely useful additions to the OS that made day to day usage easier. And then a couple of years later, Snow Leopard was a performance, memory and stability release. Not worrying about big new features, just making what was already there more efficient. Essentially the same feature set, just faster and more stable. Probably Apple's last, great release. Ever since it's been a race to see how much tat they can staple on with each release.
Highly useful AirDrop?
You mean like dropping a bomb/weapon image on every iDevice on a crowded airplane?
What I would _love_ to see from Apple is that all announcements come with a
"what we are taking away this time"
as well as the
"This is how you are meant to use our miracle device. Resistance is Futile"
AirDrop works for me about half the time. Laptop can always see the phone, but phone is often unable to see the laptop. Worst Apple product/feature IMO.
Now Monterey file sharing crashes constantly. It’s been 15+ years since I’ve had to turn it off and back on again, and now it happens daily. Standards are seriously slipping.
AirDrop is useful when it works, I'll grant you, although that qualifier is doing a lot of work, because it's often not working. When a device isn't updated, things can stop working (e.g. an up-to-date iOS device with a not-up-to-date Mac, because the Mac is old enough that it no longer gets feature updates). But sure, I'll put that in the good features column, along with ... er ... well they've had over a decade ... there must be something ... sorry, I'm drawing a blank. Well, maybe the recovery environment, that got a bit better, if that counts.
This doesn't have to be a problem. Just keeping the OS solid, making good hardware, and adding the odd feature that is useful to someone is fine when the OS is already good, which it was. The problem is that I can think of things to put in the "ways it is worse now" column. Some of these are just cosmetic changes, but some are larger technical problems: hurting stability, removing features so I have to find third-party programs or write my own for things the OS used to do, or requiring reinstallation. People can make arguments like this for any OS and someone will, so my opinion isn't universal.
Bloody kids. Young people today, don’t know they’re born etc etc.
I’m a geek. I love this computing nonsense. I have a Linux computer for coding on, Windows (ditto), an M1 Mac for daily work (and coding on), and various older machines (including ones with Snow Leopard, Leopard, Tiger, A/UX, MacOS 9 (8, 7 & 6), and more besides) that I like to use and relax with.
In my spare moments, I have coding projects in flight with ThinkC 5.5 and Xcode 3 - as well as some properly up to date stuff. So, as you can see, I'm pretty immersed in this old stuff. Not just nostalgia and rose-tinted glasses - I can lay my hands on it and use it. And so I feel confident calling bullshit on your assertion. The latest MacOS is demonstrably better than its predecessors, ditto Windows and Linux. Sure, there's occasionally a bad release that bucks the trend, but the overall trend is upward and betterward.
I can definitely relate.
The most efficient writing experience I had was in the 1980s and early '90s with XYWrite. The thing was blazingly fast (written entirely in assembly language). Cold-booting the luggage-sized "portable" into the program was just as fast as bringing a modern computer back from hibernation and opening writing software (these days I use Byword and Hemingway).
I had macros set up for all usual tasks (like publishing, printing, etc.) and none of them took more than a few seconds. Trying to do those things on a modern computer is rarely at a similar speed, and usually slower.
So in effect there has been no practical efficiency improvement in the core work task, and in some ways a slow down, compared to what I was working with 30+ years ago.
Now *obviously* there's a lot more capabilities in modern computers, and they are small, and the battery lasts all day, instead of 0.5 seconds for the capacitors to drain. But the fluff is making many types of work more difficult.
That said, I wouldn't go back in time for anything, except perhaps to snatch up more Apple and Microsnoft stock.
Lots of people would love an easy-to-use and well maintained personal computing device the size of a phone. It's so close - there are common open source operating systems under the hood of Apple and Android devices. The chipsets support tons of powerful features. Instead we get faster, dumber, less storage, more cameras. I was literally more excited about a new coffee grinder than any recent phone.
Because that's what people want... super-cool cameras that can now take the pics that only the paparazzi with their 1300mm super-telephoto lenses used to be able to get.
That the device is meant to be able to make phone calls nowadays is an afterthought. "Oh, it also makes calls... right!" when you order everything via an app, chat to your mates via WhatsApp or Snapchat or TikTok or Insta, email (occasionally), and FaceTime to be an utter annoyance to everyone else on the bus (and say "I'm on the bus mate, NO. ON THE BUS").
I know, like you I'm starting to sound like a grumpy 'Gerroff my lawn!' type pre-OAP.
And you can't beat a good coffee made with a grinder that gets the grind just so and releases the flavours perfectly, especially given Starbucks (ordered by app) or Costa (ordered by app) wouldn't know good coffee if it hit their nose and their taste buds.
AR wasn't mentioned explicitly, but the building blocks of Apple's AR plans were shown at the event.
- multiple workspaces, via Continuity between devices, or external monitor support on iPads
- courting AAA video game developers to Apple Silicon and Metal
- more collaborative tools
- softening the line between computer tasks and real tasks
Bearing in mind Apple aren't aiming at the traditional VR gaming market.
Apple's AR looks to be set on collaboration. A room of giant whiteboards, if you will. Or a virtual monitor for each part of your workflow, so you move your body instead of alt-tabbing. The research is highly suggestive of the cognitive boosts achievable through movement and spatial cues.
"Movement and Spatial cues"?
From the company that is hell-bent on eliminating any way of finding your data other than "knowing" what conglomeration of their apps "owns" a particular morsel, or relying on one of the worst search interfaces I have ever used (False positives, random refusal to "reveal in Finder", etc.)
About makes me pine for the days when my body just _knew_ where to grab a particular listing or card deck.
Having only barely "tasted" (another sense missing) AR, I find it hard to believe a "thought -> AR -> muscle memory" loop will work any time soon.
> About makes me pine for the days when my body just _knew_ where to grab a particular listing or card deck.
I'm old enough that I've done in the old ways what is now done at a computer. My body used to know that I was sat at an A1-sized draughting board. It felt different to using the computer I now use instead.
In between was a stage when I could only use CAD on UNIX workstations, not the under-powered Windows 2K machines in the main studio. So, I would be in different physical rooms for different computer tasks. The rooms were lit differently, noise and activity levels were lower in the CAD suite - which had air conditioning set too cold. Lots of physical cues.
I would type at a desk. I would proof-read sat on an armchair using hard copy.
I used to select a movie by standing up (walking to the video rental shop) and perusing the shelf. Now I can just peruse a menu.
Apple's standard multiple-monitor support on M1/M2 machines is worse than on previous models like the 15-inch 2016 MacBook Pro I have.
With my MacBook Pro, I can connect up to 4 displays via an adapter in each of its USB-C ports. In my case I have 3 external monitors attached.
The new M2 MacBook Air has 2 USB-C ports but only supports one external monitor as standard. The new M2 MacBook Pro supports 2.
However, I am looking to move with the times and get an Apple Silicon Mac, M2 etc.
I'm hoping that DisplayLink USB-to-DVI or USB-to-HDMI adapters can help extend the number of external monitors possible. I also have a 2015 11-inch MacBook Air. With 2 USB-to-HDMI adapters and that Air's own Mini DisplayPort, I can connect those 3 external monitors to it.
Fortunately, DisplayLink does support Apple Silicon, which is encouraging, though without testing I can't know whether it supports 3 monitors; I would expect it to.
I agree with another poster about the closeness between MacBook Air and lower-end MacBook Pro specs.
https://forums.theregister.com/forum/all/2022/06/06/apple_wwdc_2022_the_m2/#c_4471965
On that basis, I myself would get the MacBook Air, provided multiple monitors are supported.
I would be happy with one external monitor; I just need to see if Parallels will let me run my Linux dev environment and a few Windows apps, which would allow me to bin about four laptops I currently use.
This won't run on any pre-2017 Mac, an indication of Apple's ongoing push to move away from x86 computers.
More like - if you haven't upgraded your Apple hardware in the last 2 or 3 (at a push) years, Apple don't consider you to be a customer. They are, after all, a hardware company.
Yep, five years has been the standard "level of support" for a while. While I can understand this from Apple's perspective as a manufacturer, I wish they'd be more honest with this upfront and not impose synthetic restrictions.
My 2016 MBP has a new battery and a new daughterboard due to problems with the USB connectors. It's currently my reserve and is just as capable as my 2020 MBP, both of which are still on 10.15 and will stay so for the foreseeable future.
Hey Article Writer Person,
You had the luxury of time to wait and provide more insight beyond the WWDC superlative laced presentation.
For instance, you didn't mention that Stage Manager is coming to iPadOS too. Instead you focus on somebody's snarky attack on the feature even though it's not been released yet.
Or that the Fitness app will be available on iPhones without the Apple Watch.
And then there were the new features that didn't get mentioned at WWDC - being able to see your Wi-Fi password, third-party two-factor support directly within the browser/settings, etc.
Come on - do better please and thank you
I don't think the author is the only one who can do better. Admittedly, they didn't offer up a new colour of iPhone as a significant innovation, but this was a lionizing spectacle of emptiness.
Wi-Fi password in the browser -- maybe they intend a separate event for that groundbreaker.
If the author of the article included every announcement from WWDC, their article would resemble the inclusive live blogs from Anandtech or Arstechnica.
I don't think most readers mistook this summary for a blow by blow recap.
I get that, but then it raises the question - did this article provide value? Was it necessary?
I pointed out some things that to me were worthy of mention, two of them coming after the actual presentation. Is it too much to ask for the author to have provided just a tad more insight?
"[D]id this article provide value? Was it necessary?"
For me, YES, because I won't get my tech news anywhere else. Especially when it's Apple-related, because their lack of direct access to Fruit HQ means whatever El Reg publishes has already been vetted, and I won't be inundated with too much OMG'ing or adverts. I passed up reading about WWDC everywhere else for 12+ hours just to catch it here.
A cold* one for our Vultures --> (* actual temp varies by brew, natch)
Apple Pay Later, to spread out a purchase via four zero-interest, zero-fee payments over six weeks.
It's probably worth noting that all these schemes increase credit risk in the economy, and the hidden costs are borne by merchants, who will inevitably pass them on in the form of higher prices. This is why, if you can pay up front, you will often get a discount worth significantly more than a six-week "interest-free" loan.
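To put rough numbers on that, here's a minimal Swift back-of-envelope sketch. Only the four-payments-over-six-weeks structure comes from the announcement; the purchase price, interest rate, and pay-now discount are my own assumptions.

```swift
// Back-of-envelope sketch of what "four zero-interest payments over six
// weeks" is actually worth. The 4-payments/6-weeks structure is from the
// announcement; every other figure here is an assumption.
let purchase = 1000.0    // assumed purchase price
let installments = 4.0   // four zero-interest, zero-fee payments
let perPayment = purchase / installments

// Payments land at weeks 0, 2, 4 and 6, which works out the same as
// deferring the whole price for about 3 weeks on average.
let avgDeferralWeeks = (0.0 + 2.0 + 4.0 + 6.0) / installments

// Value of that free credit to the buyer at an assumed 5% annual rate.
let annualRate = 0.05
let creditValue = purchase * annualRate * (avgDeferralWeeks / 52.0)

// Versus a hypothetical 3% discount for paying up front.
let payNowDiscount = purchase * 0.03

print("Per installment: \(perPayment)")        // 250.0
print("Free credit is worth: ~\(creditValue)") // ~2.88
print("Pay-now discount: \(payNowDiscount)")   // 30.0
```

On those assumed figures, even a modest 3% pay-now discount is worth roughly ten times the "free" credit.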
The OS also has a feature dubbed Rapid Security Response that can install vulnerability patches as normal software updates without needing a full iOS update process.
This sounds like an attack vector that would be irresistible to black hats and 3 letter agencies alike.
> The OS also has a feature dubbed Rapid Security Response that can install vulnerability patches as normal software updates without needing a full iOS update process.
> This sounds like an attack vector that would be irresistible to black hats and 3 letter agencies alike.
And yet, it's been in place and used quasi-regularly for updating XProtect, Gatekeeper, MRT, and so on for years. Have you heard of any such attacks against systems where SIP hasn't been disabled?
It might, if your goal was to distract from the point. If you don't want to sideload, then don't, and you've done what you wanted. This means that adding the feature would benefit those who want it and do nothing to hurt those who like sticking with Apple's catalog. There are a lot of features I don't want, but as long as I'm not forced to enable them, there's no harm when they're added for those who appreciate them.
> This means that adding the feature would benefit those who want it and do nothing to hurt those who like sticking with Apple's catalog.
You'd think, wouldn't you? However, there is an art to getting users to do something unwise - Social Engineering. It's dangerous.
If there is an option to allow side-loading, bad guys will trick users into enabling it.
And those bad guys can also social engineer users into loading a website and entering all their personal information, but we don't use that as a reason to ban the internet. I'm sure someone would like to make that argument, but if they did, you would think it was a pathetic reason (I hope). This one is similar.
Apple getting into the instant credit business is the most significant announcement from the WWDC.
BNPL is hugely popular, and provides an entirely new revenue stream for Apple.
The ability to use an iPhone as a card payment terminal is also pretty big news, and is bound to be very popular.