i5 and ARM are similar in performance?
If so, I'm impressed with ARM! How truly awful is an i3, or are they dead? Has Intel been deliberately hobbling the i5 to push people to "premium" i7s?
Love rat Apple two-times its long-suffering squeeze Qualcomm with dishy Intel – and it's going to keep the baby but only let some of us see it. Over a cheap bottle of chardonnay one dark night in Cupertino, Intel wooed Apple with flimsy promises. The pair felt a connection (around the 1.9GHz mark) after the iPhone maker opened …
the i3 is not horrible at all.
I recently bought a laptop with an i3 processor. Being budget conscious, I decided to spend more on features that actually make a difference to me: a decent-resolution display (1080p) and a decent keyboard for a lot of scripting. A lot of browsing (Firefox with ~40 tabs open), reading, and watching a couple of videos on the weekend. I also connect an additional monitor with a 1920x1600 display resolution.
I run 64-bit Linux as my OS of choice, and virtual machines with Win7/BSD/Linux as guests, although I usually have only one guest OS running at a given time. Performance is fast and smooth: no lags, no freezes.
But how much is due to the different performance of macOS and iOS? macOS is heavier than iOS and has to cope with much more complex features, e.g. just the need to handle overlapping windows makes the GUI code more complex than one that doesn't deal with them. Same for handling background tasks. Would people accept iOS limitations on a laptop?
If so, I'm impressed with ARM! [...]
It's possibly not as impressive as you think. Remember, when they debuted, ARM chips were much faster than the Intel chips of the time; it's 30 years of focus on performance vs. 30 years of focus on low power that caused them to diverge.
Apple has long been rumoured to be looking to switch to ARM across the board, which does make sense given their penchant for vertical integration and their long-term involvement with ARM, so it's not too surprising that 'desktop performance' is a long-term design goal of their in-house chip team.
The A series of CPUs have for a while been class leaders in single threaded ARM performance; I suspect this is no coincidence. ;)
BS, that "i7" in the 13" MBA is just an i5 with marketing pixie dust sprinkled on top. And even that mobile i3 is not a whole lot worse - just half the cache and some less essential features fused off. A true i7 can only be had in the MBP. Any of these processors will smoke the A10 in just about any task. Power usage (and device size) and possibly price will be the only downsides. Even a Core M will be faster than the A10.
No replacement for displacement.
> True i7 can only be had in MBP. Any of these processors will smoke A10 in just about any task.
For sure, but let's remember that a lot of the heavy lifting in traditional Mac productivity apps is done by GPUs (or more specialised silicon via Thunderbolt in the case of raw 4K streams, for example). If ARM is just dandy for web and office, and Adobe Creative Suite leans on the GPU, there's no insurmountable hurdle.
We've seen this in PC gaming too - Tom's Hardware has for ages suggested that most games don't benefit from anything faster than an i5 CPU.
Considering the auction was only a year ago, it will be a long time before they are put into use. Why add extra hardware to support something that will not be put into use for years? That would be like building 5G capability into a phone sold next year, in case 5G actually appears in 2020.
As we know from Rosetta, the first step in a transition between architectures (assuming you don't want to throw away all existing third party software) is an emulation layer to allow time for the transition to occur.
For that to happen, the ARM chip would need *far more* performance than the i5, otherwise the architecture emulation will result in a horrendous slowdown of existing software.
There isn't that much need for any translation or emulation. x86 and ARM are functionally very, very similar. As an example, every iOS developer routinely runs their iOS apps on a Mac with an Intel processor for testing purposes (makes it possible to test an app running on every single iPhone or iPad model without bothering to buy them all).
If Apple released macOS for ARM, most developers would just turn a switch in the compiler and build an ARM version.
As others already said, Apple could quite trivially have the Mac compiler build fat binaries - x86 and ARM64 - just like they did with the PPC transition. You'd only need to emulate old code, anyone producing Mac applications that were performance sensitive like Photoshop would be first on the block with those fat binaries.
The problem isn't Mac software, that's easy. The problem is Windows. A lot of Mac buyers run Windows at least occasionally, and the A10 is simply not an option there because when they are running Windows they're running x86 Windows and old school x86 Windows apps, not the ARM version (is that even supported on Windows 10 or did Microsoft give that up as the abject failure it was?)
It does seem like Apple has something in mind for their SoCs beyond phones/tablets, because they are really pushing the performance. The A9 is still faster than any SoC you can get in the Android world, and the A11 will probably be around the corner by the time they finally exceed A9 performance let alone A10. I'm leaning towards Apple having some performance target in mind for VR that they want to hit before releasing something. Sort of like how they started development on the iPad in 2002. They could have released it years earlier - in fact they did release a small version of it called the iPhone in 2007 - but waited until 2010 when they reached the performance and battery life targets they had set in an acceptable size/weight.
Based on TSMC's roadmap they should be able to get another 30-40% with the A11 next year, and probably an overall doubling (i.e. Geekbench 4 single threaded score of ~7000) by the A13 in 2019. That's sure not something anyone needs for their smartphone.
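As a back-of-the-envelope sketch of the compounding above (the ~3500 Geekbench 4 single-core baseline for the A10 is my own assumption for illustration, not a figure from the article):

```python
# Compounding per-generation gains from an assumed A10 baseline.
a10_score = 3500  # assumed GB4 single-core score, for illustration only

# ~35% gain for the A11 (the 30-40% figure quoted above)
a11_score = a10_score * 1.35

# A doubling by the A13 (2019) over three generations works out to
# roughly 26% per generation: 2^(1/3) ~= 1.26
per_gen = 2 ** (1 / 3)
a13_score = a10_score * per_gen ** 3  # ~7000, i.e. double the A10

print(round(a11_score), round(a13_score))
```

So a steady ~26-35% per generation is all it takes to hit the ~7000 figure by 2019.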
I suspect that Apple will continue to benefit from the use of a larger die than competitors; the A10 from TSMC is about 25% larger than the A9 from TSMC / Samsung. How big Apple is willing to go with the A11 at the next node is really just a question of cost/benefit, but I expect that Apple will exceed the transistor count of the current A10 by a fair amount.
That said, there is still the A10X that is expected for the iPad refresh late winter or early spring, and that could be another decent jump.
The A10 is the largest one they've done. What I'm curious about is what all that extra area is for. They've grown by 30%, and the small cores could only account for a tiny fraction of that. I'm sure the image processing for the dual camera is part of it, but that can't begin to account for it all.
A6 - 96 sq mm
A7 - 102 sq mm
A8 - 89 sq mm
A9 - 96 sq mm / 104 sq mm (TSMC / Samsung)
A10 - 125 sq mm
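The ~30% growth quoted above falls straight out of the listed areas (using the TSMC figure for the A9):

```python
# Die areas (sq mm) from the list above; A9 uses the TSMC figure.
areas = {"A6": 96, "A7": 102, "A8": 89, "A9": 96, "A10": 125}

# A9 (TSMC) -> A10 growth, matching the ~30% quoted above
growth = (areas["A10"] - areas["A9"]) / areas["A9"]
print(f"A9 -> A10: {growth:.0%}")
```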
The PPC transition was not smooth at all and took years and all the loyalty of Apple's customers.
I highly doubt switching from Intel to ARM is as easy as changing a compiler flag or offering a fat binary in the App Store, even if opting for very limited legacy support.
Everything's pretty much covered with the App Store and universal binaries for out-of-Store builds. Not as much software comes on DVD now, so Rosetta isn't as necessary as last time; they might even be able to get away without it.
They would need a good OS group though, as good as last time. Something I'm not convinced they've got, as lately they seem to be focusing just on pretty effects and iOS-ification.
CDMA isn't really used at all in Europe other than a small network in the Czech Republic. CDMA2000 on 450MHz was used for data only in a few countries where analogue NMT-450 had been used, but it's largely been replaced by much faster LTE.
Also, a few places use it for machine-to-machine stuff; mostly closed networks owned by utilities to read meters.
But in general, European-spec handsets don't have any reason to support CDMA, so given it's a huge market, I'm guessing we'll be seeing more Intel radio chipsets here than you'll see in the USA.
You are correct, but the percentage of CDMA phones in use is dropping simply because CDMA is really only used in North America.
It really is time for CDMA (the radio bits) to die a death. AFAIK, a bit of CDMA technology is still in use in LTE.
As for 5G... It was on the news recently that the EU (sans the UK thanks to Brexit) wants 5G networks by 2020.
I guess we will be stuck on 3G for the next 30 years.
What about 4G, you might ask? It is patchy at best and some networks are totally shite with it. There is no financial incentive to build out 4G, let alone add 5G. Just my impression.
Do you really think €120m will be enough to roll out free WiFi across all EU towns and cities?
Do you really think it will take 30 years for UK telecos to roll out 4G, let alone 5G?
Perhaps there's some succinct hashtag for "Brexit is being handled stupidly", but stop being ridiculous...
The carriers should be keen to ditch CDMA/CDMA2000/UMTS as quickly as possible. The cell breathing characteristic inherent in any system using a CDMA-style signal has made network deployment planning a nightmare, and is the main reason behind poor coverage and poor network performance. This was especially a problem in Europe, where the carriers who'd previously rolled out 2G (which is piss easy in comparison) failed to understand just how hard it is to get a 3G network nicely planned to meet demand.
Things were different in Japan. They were amongst the first to roll out 3G (UMTS), and the cell density they put down overcame the inherent difficulties of 3G network planning. This was, I'm sure, a hangover from their previous indigenous network standard, which had a far higher cell density than Western standards. I suspect they simply re-used their existing sites for 3G. Anyway, the NTT-Docomo 3G network is a masterful piece of network deployment, and really showed what you could do if you really went for it. The consequence: it was (and still is) extremely expensive. It's very cool though to be on a Bullet train, going through a tunnel at 186mph, getting 20Mbit/s on a 3G downlink whilst watching the signal strength bars bounce up and down as you go past the micro-cells that line the tunnel.
For 4G the standards engineers finally remembered that network planning matters, and built on aspects of GSM to relieve the problem. It should be far easier to roll out, and should result in 4G coverage being far better than 3G ever attained. No doubt they'll screw it up again with 5G.
It is dying because the telcos can't get access to the same range of network equipment that their competitors using open GSM / 3GPP standards can and they've all sorts of hassle getting first shout at handsets and other gear.
Also, CDMA One (IS-95) / CDMA 2000 (IS-2000) should not be confused with the generic concept of CDMA (Code Division Multiple Access), which was around as a concept before either Qualcomm system; for example, it is used by GPS satellites and is also used in 3GPP standards like UMTS (also known as W-CDMA).
Where the GSM family of standards absolutely won out is that they are an open, totally modular system where you can easily implement multivendor platforms using all sorts of equipment from loads of different companies without much difficulty.
The other system is far more proprietary at least at the radio interface level. (Switching and data centre stuff is more or less the same for any telco network).
ALL of the CDMA One / CDMA 2000 systems will cut over to pure LTE eventually, with VoLTE for voice. That ends their dependency on proprietary gear.
CDMA/CDMA2000 were indeed always niche - only a few hundred million users vs a few billion GSM/UMTS users across the rest of the world. Back in the good old days of Nokia their best phones would always be GSM/UMTS, and maybe they'd do a CDMA version eventually. Market forces at work.
I'm sure it's the reason why Apple's earlier iPhones were GSM-only: simply because they couldn't afford (at that time; different now of course) to do a CDMA version for a comparatively small market. It was the right decision by a long way.
GSM did indeed win because of the availability of the standards. More than that, they were totally complete. They tell you absolutely everything you need to know to build handsets, base stations, NOCs, connection to the rest of the world. It was all there.
In contrast, even if you'd paid for the CDMA standards, you'd actually only bought part of the story, and you still had to go back to Qualcomm to get the rest of it. The standards (especially those for CDMA One back in the day when that was brand new) were woefully incomplete. I think they're much more complete nowadays, but it's too late. It was too late all the way back in 1993.
Voice over 4G has been a complete cockup. Most networks in the UK are falling back to 3G when you make a voice call. Leaving circuit-switched voice out of 4G was a massive mistake, and is going to make it unnecessarily hard to ditch legacy 2G and 3G networks.
"Voice over 4G has been a complete cockup. Most networks in the UK are falling back to 3G when you make a voice call."
Allegedly in my bit of suburban south Birmingham, Orange/EE switched off the 3G when (or shortly after) they switched on the 4G. User experience of local 3G users certainly matches that theory.
So what happens when a 4G user wants to make a voice call and there's no 3G? Is GSM/PCN still widely available and with sufficient capacity?
That is about as relevant to most of us as Hot Pockets. The USA mobile market can't even swap out SIM cards... forget those mobile neanderthals... there is a very good reason Apple products say "designed in California"... "made in China"... because you merkins let monopoly business keep you in the dark ages.
To be honest, the last 4 phones I have had (here in Asia) have had dual-SIM capability... the West is getting more and more irrelevant by the day.
I can take my unlocked iPhone 6s, which has no CDMA radio, and use it with AT&T, T-Mobile, and even Verizon (as they are moving to GSM) with no problem, not to mention the tons of MVNOs.
As for dual-SIM phones, of course you can buy them in the US, but no one really needs them, as people tend to choose a carrier and stick with it, and most carriers are now offering free roaming options in Canada and Mexico.
So unless you are a road warrior outside of those three nations, you really do not need a dual-SIM rig.
Dual SIM phones tend to be found, and are most useful, in countries with poorly integrated mobile networks with patchy coverage. If you can't phone a friend because they're on a different network, you carry a phone for that network too. Dual SIM simply means not having to carry too many phones.
AFAIK they never used to sit on both networks simultaneously (which would require two of every radio), but simply allowed you to swap between them on demand. Perhaps that's changed nowadays - a second radio chip is probably a trivial thing to integrate these days.
I can. After a brilliant introduction, you started waffling on about technology, instead of feeding your readers the full and juicy salacious details hinted at in the intro.
We don't watch porn for the plot you know.
Dual-SIM phones are fairly unusual in Europe too but I think it's largely because (with a couple of exceptions) in the vast majority of countries you'd typically buy a network subsidised phone. So, naturally enough they're usually SIM locked to their original network until you pay down a minimum term contract.
EU voice and text roaming is increasingly just included in your typical "bill pay" plan, but data is still a bit stingy and expensive.
For example I'm using Meteor (Eir) in Ireland:
€35 / month : 30GB 4G data (as well as absolutely unlimited data to Facebook, Instagram and Pokémon Go... Which is a bit pointless), unlimited calls, unlimited texts, 50 minutes of international calls to most destinations (including mobile) and unlimited EU voice/ text roaming but only a stingy 1GB of EU data... So, when I roam I usually bring an unlocked android phone to use as a mobile WiFi hotspot with a local SIM.
That garbage tech should've died at least a decade ago. It's only alive in the US and some parts of China. The rest of the world chose GSM and uses SIM cards, which means carriers can't apply vendor lock-in to their cellphones.
It seems that even Intel realizes this, hopefully they'll speed up CDMA's demise. In my own country, the few CDMA carriers built up GSM/UMTS networks and are eventually killing their prehistoric CDMA towers.
Biting the hand that feeds IT © 1998–2020