"We can affirm without reservation that 'California streaming' was the best Apple event on Tuesday, September 14, 2021, of all time"
It probably wasn't, but it's safe to assume that El Reg is not on the guest list for the afterparty.
Apple on Tuesday announced iPhone 13, a "new iPad" and iPad mini, and Watch Series 7 at its "California streaming" event, held virtually again this year to avoid going viral in the wrong way. The iBiz's latest shiny things arrived in the wake of a patch released Monday for its mobile and desktop operating systems to block a …
A mate of mine who is wedded to the Apple ecosystem said in a message to me earlier
"Have you seen the new Apple line up? I think we've reached or passed 'peak Apple' and they've run out of new product ideas."
I wonder if he's right given they lost their Car head honcho Doug Field to Ford recently.
In my humble opinion, current Apple executives and decision makers are just there to "keep the lights on". They have no idea what new product to introduce since the death of Steve Jobs.
When Steve Jobs was still at the helm, Apple led the way and left competitors eating its dust.
Nowadays, the only thing Apple knows how to do with the iPhone/iPad is to add "make-up" to an old product and to jack up the price.
Getting to number one is not as hard as staying number one.
To be fair, there are only so many useful form factors for a computer. Once a Newton has picked the low hanging fruit, it may take a lot of work by a lot of people before an Einstein can come along.
And it did take many years of work for Jobs' hand-sized and paper-sheet-sized computer concepts to become possible. The Apple Newton was relatively easy to conceive of, but took a lot of work to become the far more useful iPad. There were lots of Palm models in the interim.
This said, Sony invented the Walkman, but not the iPod.
The Google Glass was a novel device, but it remains to be seen who will develop the version that becomes socially acceptable and widespread. The prize for the first popular VR headset also remains up for grabs. No matter how much work Apple has put into the AR/VR space, it will take time before it can release the product it expects to be a category killer.
What's interesting about the Newton is that it was a design theme that was clearly too difficult to implement on the available tech. The much more successful Psion 3 and 5 were significantly more usable devices (and in many ways have never been matched). Having a keyboard made up for a lot of having a crummy resistive touch LCD screen.
Sony also did CD and MiniDisc. The latter is widely underappreciated in the West. In Japan it was very successful, and persisted way longer than you'd think into the era of HDD-based music players. It fitted the lots-of-music, mobile use case well enough that there was no pressing need to move on to something else.
Psion came up with an HDD-based music player long before anyone else. They didn't follow through, because the only HDDs available were 3.5 inch. The guy who did the work ended up at Apple. Psion also did a prototype satnav, and again it was not realistic to productionise it because of component sizes. The guy who did that one ended up at TomTom.
I guess the difference between Psion and Apple was that the former was too small / financially timid to drive component suppliers to do the expensive R&D to miniaturise parts. Whereas Apple had the financial clout / bravery to go for it. Psion were successful where they could take extant components and do something unexpected.
I don't think that Google Glass type things will ever become socially acceptable.
Well, you're not wrong.
From a privacy standpoint anyway, they're dead in the water, despite what Zuck & Rayban want us to think.
If there was some provable way to show that images of people were being discarded before the AR stuff was being acted on (like directions, news info, etc) and displayed, then they'd have a use.
And if they were being manufactured by anyone not Google, FB, Amazon, MS or Apple, people might even believe them on the privacy thing for a while, too!
Because let me tell you, if these AR goggles (not VR, actual AR) with front facing cams become popular, hoods and some sort of targeted infrasonic to mess with image pick up will suddenly become a very popular field of research...
I still use my minidisc occasionally at gigs. Sits neatly in a pocket along with a lapel mike. Beats taking in a full recording system and the quality is perfectly acceptable.
Sony crippled it for more professional use by preventing digital output of recorded material. Only the standard audio line-out was available. It was all part of their DRM strategy, which took them away from the pro audio arena.
>I guess the difference between Psion and Apple was that the former was too small / financially timid to drive component suppliers to do the expensive R&D to miniaturise parts. Whereas Apple had the financial clout / bravery to go for it.
The iPod was inspired by 1.8" HDDs being promoted by their manufacturers to anyone who had a use for them. The first gen iPod was made more user friendly than a competing device might be because it used FireWire for data sync and charging at a time that the latest USB version 1.x was not fast enough for the purpose. It is true that Apple's financial strength played a role: development of the 1st gen iPod cost $200 million.
Note: all of the above - the money, access to components, use of FireWire, and experience with jog wheels - also applied to Sony in 2000.
> This said, Sony invented the Walkman, but not the iPod.
tl;dr: Sony did invent the iPod, they just screwed it up.
Back in 1992, Sony released a new technology in the shape of their MiniDisc portable player. These stored digitally encoded/compressed data on erasable magneto-optical discs, and offered far more capacity than any equivalent solid-state (or even HDD-based) system could offer in 1992: they gave you 140MB of storage/70 minutes of music, at a time when you'd have to spend several thousand pounds to get a 486 with 4MB of RAM and a 200MB HDD.
(Admittedly, the players cost around $350 at launch, while blank discs cost around $16, but with the technology having failed to catch on, by 1994 Sony was starting to engage in some aggressive price slashing. Either way: a 20MB laptop SSD would have set you back about $1,000 back in 1991, while a 200MB HDD would have been around $400!)
Alas, a combination of internal politics within Sony and a music industry wary of this newfangled way of producing "lossless" copies meant that the Minidisc was deliberately hamstrung from launch.
Then in 1993, Fraunhofer released their newfangled MP3 technology with a deliberately cheap "decompression" pricing model. And then had to drop the price of their encoding package when someone leaked their code on the internet.
And then it turned out that high-end 486s and Pentiums could rip CDs to MP3 in near-realtime.
And at 128kbps, the resulting MP3 files were just about small enough to be transferred in near-realtime via 56k narrowband internet connections.
And then some bright spark released a program called Napster, which made it easy to find other people's ripped files.
The moral of the story is that as with most devices, it's not just about the hardware, but the surrounding ecosystem. And the ability to rip your own music and copy other people's music without any artificial limitations was a major factor in why MiniDisc failed and MP3 succeeded. Even Microsoft and other companies (e.g. Real) tried to get in on the act, though they suffered from the same issues as Sony, in that they had to at least pay lip service to the music industry and "protect" your music files from copying.
I can remember a Unix sysadmin at work in 1997, who was gutted when he reinstalled his Windows box, only to discover that all the music he had on the HDD was now useless, since he'd lost the decryption key.
Then in 2001, Apple released the iPod and bundled it with iTunes. And whatever people may think about iTunes, it did a good job of Just Working when it came to putting songs onto your iPod; by November 2001, it even let you burn DRM-free MP3 CDs!
And it's perhaps not a coincidence that Sony's NetMD hardware and Sonic Stage software were released in mid-2001, after the iPod had launched. But by then, it was too little, too late.
Oh, and Sonic Stage absolutely sucked, to boot ;)
So, yeah. Arguably, Sony did invent the MP3 player, but it was crippled from the start by commercial considerations. And so cheap/unlicensed MP3 hardware dominated the market, until Apple came along with a much more pragmatic approach, offering a streamlined user experience without any of the heavy DRM constraints which had crippled Sony, Microsoft, etc.
In fairness, this can be seen as a heretical thought by some and gospel truth by others still inside the ecosystem.
In practice? Yeah, they need someone with a strong enough force of personality to take charge, grab the company by the (metaphorical) stones and haul them along. For all the occasionally well-placed criticism of Jobs for being a bastard to work for, it can't be denied he set direction, and if you didn't like it, the door was over there.
That's not there on Tim's watch. Sure, the lawyers and the suits and the fanboys are barking at anyone that dares poke the walls, but that's their reason for being.
The Distortion Field is gone. We're left watching guys on a stage tell us this is the latest and greatest since the last time, but we've all gotten a little bit more... "well... it's not really changed much now, has it?".
We're more doubtful.
Apple really does need someone with the "vision" and leadership to grab the company by the dangly bits and drag it kicking and screaming in one direction again.
Though it looks like the 13 pro still has Lidar/Time of Flight camera capability.
Looks like Apple is quietly backing away from the great hype of yesterday. Unless they are saving up a "magical" release event for Christmas for fruity specs. Not holding my breath, as I expect Facebook to have its face burned off by public resentment over their perv glasses.
The pity is that Apple's phones were actually a handy AR device; you just couldn't strap them to your face without looking profoundly silly. But let's face it, they were more usable than a 5x-as-expensive headset with a 12-degree FOV, no matter how cool it was.
Looks like they have once again succumbed to the "Field of Dreams" model of product development, where after spending a ton of R&D money they dump a tool kit on the market and then wait impatiently for someone else to produce tons of high quality content for them.
See Kguttag.com for an expert's in-depth reviews of pre-production AR display technologies... None of them seem ready for a prime time product yet.
Tim Cook is on record as saying that the technology just isn't there yet for a polished consumer-grade AR device.
It isn't in Apple's nature to give rolling updates on its progress with upcoming products.
If I were Apple and wanted to release an AR display device in X years' time, I would start as they have done, by introducing lidar / laser ToF sensors and accompanying silicon into the higher-end models of existing devices such as iPads and iPhones.
"overall [Apple Watch] battery life remains unchanged, at about 18 hours"
A watch that doesn't even last a full 24 hours is a bit of a joke. At the very least, that's the minimum baseline target they should be aiming for. Accidentally forget to charge it overnight and it becomes useless, possibly requiring a spare charger (and extra expense) at work, to make your expensive gadget become useful again: just think of the hundreds of steps and hours of panicked heartbeats you won't have logged in the intervening period!
What exactly does a smartwatch do that sucks battery life quite so insanely hard? Does it really need quite as much bling as it presumably must have? A plain old digital watch (not really a fair comparison, I know) usually has battery life of at least a year, if not two or three. A smartwatch obviously does a lot more computing than that, but there must surely be a sweet spot somewhere where you can have greater power efficiency and battery life lasting for at least a long weekend, no?
Or take it off and charge it while you sleep? Unless you only sleep 8 hours out of every 40 I guess?
I put mine on around 6:30AM each day and take it off around 10:30PM, and it's still got plenty of charge left, but it might as well go on its charger while I'm asleep, as does my phone; I'd imagine a large percentage of users do something very similar.
It's a conscious decision to have a smartwatch as opposed to a digital or analogue watch - you know before you buy it that it is going to do more than your watch, but you also know it's going to need more battery, and there's only room for so much battery inside a case you can still fit on your wrist. Whether the trade-off between features and battery life is worth it is down to the individual.
You could do that, but that makes the whole sleep tracking feature kind of pointless. Certainly people have reasons to use the more power-hungry aspects of the Apple watch, but focusing on making it last if it can't be charged one night would help more than speeding it up even more.
"there must surely be a sweet spot somewhere where you can have greater power efficiency and battery life lasting for at least a long weekend, no?"
There are a lot of those, and people who want step counting and heart rate tracking often use watches with batteries lasting from a week to a month.
"What exactly does a smartwatch do that sucks battery life quite so insanely hard?"
It's the much faster processor in there working on smartphone-style tasks. The watches that last longer are usually a microcontroller that logs some fitness data and sends it to a phone via Bluetooth, maybe getting information about notifications, but that's it. The Apple Watch has a two-core CPU running a limited version of iOS in order to present the user with many tiny apps, some of which use GPS monitoring (from the watch itself, if you want to kill the battery even faster). They've also got WiFi and 4G radios in there, the latter for when you are out and didn't bring your phone, which I'm sure for many iPhone users is never. All that takes a lot more power for those who want such things. I have only seen a couple of watch-based apps which strike me as a little useful, but there are many things I don't do, so there are likely better reasons that I don't know about.
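A rough back-of-envelope sketch makes the gap concrete. The battery capacities below are assumed, illustrative figures (roughly 1.1 Wh for a recent Apple Watch, ~0.675 Wh for a CR2032 coin cell), not official specs:

```python
# Back-of-envelope power budget: why a smartwatch lasts hours, not years.
# Battery capacities are rough assumed figures, not official specs.

def avg_draw_mw(capacity_wh: float, runtime_hours: float) -> float:
    """Average power draw in milliwatts for a battery drained over a runtime."""
    return capacity_wh / runtime_hours * 1000

# Smartwatch: ~1.1 Wh battery, ~18 hours of life
smartwatch = avg_draw_mw(1.1, 18)            # ~61 mW average draw

# Basic digital watch: CR2032 (~225 mAh at 3 V ~= 0.675 Wh), ~2 years of life
digital = avg_draw_mw(0.675, 2 * 365 * 24)   # ~0.04 mW (~40 microwatts)

print(f"smartwatch ~ {smartwatch:.0f} mW, digital ~ {digital * 1000:.0f} uW")
print(f"ratio ~ {smartwatch / digital:.0f}x")
```

On these assumed numbers the smartwatch is drawing on the order of 1,500x the power of a plain digital watch, which is why the "sweet spot" watches that last a week to a month ship a microcontroller and radio rather than an application processor and display.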
People with Androids won't be converted by these. I've been given iPhones at work in my last 2 places, a 7(?) and an SE. I honestly couldn't tell them apart.
I like their size in my hand, like my old Mi Play, but not the screen size, like my older Redmi 4X.
They patched that. Everyone with an iPhone released in 2015 or later can get that patch. When you think about the ones that Android has and how many can patch them, there's reason to wonder which is better.
The only thing that Apple has that weakens them in the comparison is their reporting thing, and I hate that just as a lot of people do. If they're convinced to never launch it, that will go a long way to helping their case, though some trust has been lost irretrievably.
...I was feeling a bit throw-uppy when I saw Tim Cook spouting all those superlatives all over the place, like a tired old salesman who really wasn't all that keen on his wares in the first place.
I like my 'stuff' to be professional and pointful. I feel it cheapens the brand when they have to big it up quite that much. Makes me wonder what the point of last year's tech was.
I got my last iPhone about three years ago and I'm still not 100% decided on replacing it with an Apple phone. Am I allowed to go Samsung or Sony or something?
But Apple have always been like this. Under Steve Jobs, everything they did was always "revolutionary" and "Amazing" and the best ever. Blah, blah, blah. They've always been a bit cringe-inducing, even at the times when their kit has been genuinely revolutionary and amazing.
Perhaps it's annoying you more now because you're getting older and more cynical? Or just because the stuff is further away from the days when it was all shiny-new - and not just design iteration. I'm a fan of my iPad, I refuse to pay the prices for iPhones though. They're not sufficiently better than a £200 'Droid - the annoying thing is that capabilities stopped really improving 5 years ago - but top-end prices have actually gone up.
I'm a bit puzzled by Apple's claim that they need to adjust their active calorie calculation if you use an e-bike. The full press release statement says that:
"Apple Watch can more accurately measure active calories when riding an e-bike, with an updated cycling workout algorithm that evaluates GPS and heart rate to better determine when users are riding with pedal-assist versus leg power alone".
If you don't have a power meter, most bike computer/app calculations just use your heart rate, correlated to power used and hence energy expended. I am somewhat unconvinced by the experiments these are based on: they make, in my opinion, poorly calibrated adjustments for age and gender, and my Wahoo Element estimates less than half the calories that Ride with GPS does from the same information.
However, I find I don't work any less hard, and raise my heart rate by a similar amount, when I use my e-bike, as I average about 1/3 faster at a higher cadence than riding my unassisted one. Not sure how Apple can claim to improve on that by using GPS and presumably AI to infer I'm using electrical assist.
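For context on why these estimates diverge so much between devices: without a power meter they rest on population regressions from heart rate, not measured work. The sketch below is not Apple's or Wahoo's actual algorithm; the coefficients are from one widely cited study (Keytel et al., 2005), and the rider figures are made-up examples:

```python
# HR-only calorie estimate, using the Keytel et al. (2005) regression.
# Not any particular device's algorithm - just the general shape of the
# calculation, which is why two devices can disagree by 2x on the same ride.

def kcal_per_min(hr_bpm: float, weight_kg: float, age: float, male: bool) -> float:
    """Estimated energy expenditure (kcal/min) from heart rate alone."""
    if male:
        kj_per_min = -55.0969 + 0.6309 * hr_bpm + 0.1988 * weight_kg + 0.2017 * age
    else:
        kj_per_min = -20.4022 + 0.4472 * hr_bpm - 0.1263 * weight_kg + 0.0740 * age
    return kj_per_min / 4.184  # convert kJ/min to kcal/min

# Hypothetical rider: 2-hour ride at an average HR of 140 bpm
rider = dict(hr_bpm=140, weight_kg=75, age=45, male=True)
print(f"{kcal_per_min(**rider) * 120:.0f} kcal")  # -> 1641 kcal
```

Note that heart rate is the dominant term and there is no input for pedal-assist at all, which is presumably what Apple's GPS-plus-HR adjustment is trying to compensate for: the regression can't tell whether a given heart rate came from your legs or the motor.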
Isn't chasing units kind of a fool's errand? It is relevant to me that my heart rate is over 140 for 2 hours a day. That my watch converts this into 0.3 Rhinos weekly doesn't matter, and is probably not right anyway. What matters is that I can track my activity trends, not the number of Rhinos I can eat.
Agree that the superlatives are somewhat excessive - as is the walk to the right, clasp hands, walk to the left, unclasp hands routine.
What I do find interesting is the way they use video effects in the intros and transitions between subjects.
Yesterday wasn't spectacular but TC's walk from the desert onto the stage at the beginning was particularly neat.
The WWDC presentation had plenty of tricks - as did the earlier events.
Big kudos to the guys & gals who put them together.
Biting the hand that feeds IT © 1998–2021