Re: "Email only recently (2008-ish?) came to mobile telephony"
Even before that there were WAP front-ends to email which somebody, somewhere, probably used once, at a rate of 38p/megabyte.
I had a first-generation MacBook Pro with its Core Duo and used it until about 2010; it's unfortunate that Apple had to jump ship during Intel's temporary regression to a 32-bit ISA, but at least no Apple laptop ever shipped with a single-core x86 part. The original low-end Mac Mini was a Core Solo.
The PowerPC was so far behind at that point that I doubt Apple could have afforded to wait much longer, but at least SSE3 was present right out of the gate, providing a smooth transition for things like the Accelerate framework.
Then you clearly didn't buy the original iPad, which is the semi-recent Apple device with the shortest useful lifetime.
Released in March 2010, it was dropped from support with iOS 6 in September 2012, and running 2011's iOS 5 on it makes for an incredibly slow device.
The iPad 2 of 2011 improves on the original with twice the RAM and twice the memory bandwidth, two processor cores instead of one with a greater-than-33% improvement per core, and what Apple claimed was eight times the GPU power, though I still can't find a benchmark for that.
So I guess Apple just figured they'd brush the original under the carpet?
(Though I'm only now about to retire my 2015 iPhone 6s and Retina MacBook; the first-generation iPad is the aberration)
> In one final gasp, the Apple II supporters at Apple designed the Apple IIGS Plus, code named "Mark Twain". It had an 8Mhz 65C816, a built in SuperDrive, 2MB on the motherboard, and a hard drive. Prototypes leaked out and a user group that has one and wrote a series of articles about it. Apple management vetoed this unit.
The Mark Twain was designed in 1991, a full five years after the release and therefore likely six or more after the design of the original IIgs.
Nevertheless, per the only owner of one:
> Despite rumors that the Mark Twain is a speed demon, a standard 65C816-4 CPU running at 2.8 Mhz is found in the same physical location as it is on the other IIGS models.
> Selective for good reason - those things you mention weren't common AT ALL back in those days on competitor systems, regardless of how good or bad the other capabilities of said systems were. So criticising the IIgs for daring to suggest that it had graphics capabilities, just because it didn't include a bunch of barely seen elsewhere hardware to help mitigate its relatively slow memory bandwidth/CPU, felt like a bit of an unfair dig at the system.
On the contrary: the issue is that the extreme, unusually severe bottleneck on writes to video memory isn't otherwise alleviated. Trying to talk about half of that equation while ignoring the other is absurd.
Yeah, you've misunderstood the source.
The mechanism described uses banks $00 and $01 as a write-through cache for banks $e0 and $e1. Writes still occur at 1MHz.
The statement "there is a solution which allows the screen to be written to at near full speed." is false.
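To make that concrete, here's a toy Python model of the cost asymmetry; the cycle times are my own back-of-envelope figures and the structure is purely illustrative, not a cycle-accurate emulation:

```python
# Toy model of IIgs shadowing. Illustrative only, not cycle-accurate:
# a write to a shadowed page in fast bank $00 must also land in slow
# bank $E0, so the cycle is stretched to 1MHz; reads stay fast.

FAST_NS = 1000 / 2.8   # one 2.8MHz fast-side cycle, ~357ns
SLOW_NS = 1000 / 1.0   # one 1MHz Mega II cycle, 1000ns

def write_cost_ns(shadowed: bool) -> float:
    """Cost of one CPU write to fast RAM, with or without shadowing."""
    # Shadowed writes synchronise with the slow side; nothing is gained.
    return SLOW_NS if shadowed else FAST_NS

def read_cost_ns(shadowed: bool) -> float:
    """Reads are satisfied from the fast side either way."""
    return FAST_NS

print(f"shadowed write: {write_cost_ns(True):.0f}ns (i.e. still 1MHz)")
print(f"read:           {read_cost_ns(True):.0f}ns (full speed)")
```

So the screen contents can be read back quickly, but near-full-speed writes don't follow.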
Not a surprise you've taken the positions you have given your lack of familiarity with the machine.
> Writing odd values to odd addresses to page memory is par-for-the-course on 8-bit machines, the Commodore 64's memory map was probably just as crazy.
As per above, this isn't about writing odd values to odd addresses. It's about triggering a bunch of soft switches — which are flipped one way or the other by access to a single address — with each individual chunk of memory having two or three different reasons it might be pointing at one thing rather than another.
If you had programmed any other 8-bit machine you'd know this is not the norm. Every other machine uses: (i) regularly-sized memory areas; (ii) with a single point of control.
If you're a fan of everything being possible in two or three ways, each method being a particular combination of two or three other discrete inputs then, yeah, there's a lot to like in the world of Woz.
> The main thing that killed the IIgs was Apple deciding to artificially limit the speed of it to 2.8Mhz so it wouldn't compete with the Mac or any other 16-bit computer of the time.
The speed-limiting factor in an Apple IIgs is the available speed of RAM.
The 65816, like the 6502 before it, requires RAM accesses be completed in a single cycle.
The 68000 provides at least four cycles for any RAM access to be complete.
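To put rough numbers on the difference (my own back-of-envelope arithmetic, not datasheet figures):

```python
# Access-time budget implied by each bus protocol: how long the memory
# system has to complete one access. Back-of-envelope figures only.

def access_window_ns(clock_mhz: float, cycles_per_access: int) -> float:
    """Nanoseconds available to complete a single memory access."""
    return cycles_per_access * 1000.0 / clock_mhz

# 65816 in the IIgs: every access must finish within one cycle.
iigs_ns = access_window_ns(2.8, 1)      # ~357ns

# 68000 in the original Mac (7.8336MHz): a bus cycle is at least 4 clocks.
mac_ns = access_window_ns(7.8336, 4)    # ~511ns

print(f"IIgs 65816 @ 2.8MHz: {iigs_ns:.0f}ns per access")
print(f"Mac 68000 @ 7.8MHz:  {mac_ns:.0f}ns per access")
```

So even clocked nearly three times slower, the 65816 demands faster RAM than the 68000 does.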
There is a conspiracy theory that the clock was deliberately limited, one which often nonsensically fingers Steve Jobs (by then long gone from Apple), but the basic facts disagree.
See also the Apple IIc+, which Apple launched two years after the IIgs and which ran at a nominal 4MHz. It achieved that by using a cache between processor and bus. And Apple achieved that by licensing the cache from a third party.
Even if magical super-fast consumer RAM had existed, it seems improbable to me that Apple knew how to produce a faster Apple II in 1986 but had forgotten by 1988 and had to pay somebody else for help.
> Woz I doubt was responsible for making the IIgs artificially slow
Right, because nobody was.
Woz is massively overrated; have you ever tried programming for a system he designed?
The rule for whether base or auxiliary RAM is visible in the region $2000–$3FFF on an Apple IIe is that it'll be the latter if: (i) the 80STORE, PAGE2 and HIRES soft switches are all set; or (ii) the 80STORE switch is reset but the RAMRD switch is set. Otherwise it'll contain base RAM.
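Written out as a predicate (my own transcription of the rule above; switch names as per the technical reference, read side only):

```python
def aux_visible_2000_3fff(store80: bool, page2: bool, hires: bool,
                          ramrd: bool) -> bool:
    """True if auxiliary RAM appears at $2000-$3FFF on an Apple IIe.

    A direct transcription of the rule stated above; writes are
    governed separately (by RAMWRT) with the same shape of rule.
    """
    if store80 and page2 and hires:   # rule (i)
        return True
    if not store80 and ramrd:         # rule (ii)
        return True
    return False                      # otherwise: base RAM

# One of the combinations that catches people out: RAMRD set, yet
# base RAM is still visible because 80STORE wins and HIRES is off.
assert not aux_visible_2000_3fff(store80=True, page2=True,
                                 hires=False, ramrd=True)
```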
There are similar rules pertaining to the other nine irregularly-sized memory regions that fill its 64KB of address space.
Think that's just a corner the team was backed into by a desire for compatibility with the existing ROMs? Then you're wrong as per the IIgs, which amongst other things introduces shadowing (using a fast part of RAM as a write-through cache for a slow part, essentially) and invents another set of arcane rules about what situations lead to shadowing of what.
Probably its best feature is having 'graphics' in the name while offering nothing beyond a plain single-buffered frame buffer (no hardware scrolling, no blitting, nothing), and putting that behind a 1MHz 8-bit bus.
With the IIgs Woz achieved what Apple hadn't otherwise managed: to kill the Apple II.
Most Apple fans will receive completely reliable MacBooks, as the statistical failure rate is a tiny sliver of the whole. But that doesn't mean that Apple isn't culpable for some of the flaws, as here, where the issue could clearly have been avoided.
Me personally? In twenty-ish years I've had a MacBook Pro with a GPU fault that happened late enough not to really matter, and a MacBook with a logic board that has failed twice, the first time after only about three years, so during what should be a machine's normal lifetime.
That's a pretty high subjective failure rate, but I'm damned if I can find any objective statistics so there's no reason to believe me any more than any of those other commenters you refer to.
Probably skip the +3; the CPC can display real 8px/character 80-column text whereas for CP/M the Spectrum uses some weird 5- or 6-pixel-wide font, displaying only a portion of the width of the virtual 80-column display and jumping between the left and right portions of the screen ‘intelligently’.
I think it's the opposite; if he bought Twitter and chose very selectively not to apply its various moderation policies, he could continue his ingratiation with the Republican Party and thereby gain such favours as he desires next time the pendulum of government swings back to them.
See also: the drama of the Texas AG vs Twitter, with Texas 'inexplicably' using government clout to further Musk's objectives.
> But they aren't in a special area. They are just on the desktop, and in those versions, they were the only things allowed on the desktop.
I think that's a slender distinction: they're not in a special area, but they are in an area that only they are allowed to be in.
> It's not a bar.
There it is: I lose. You can't invent the taskbar with something that isn't a bar. End of story.
> They actually shoot the actors on green screen and use a computer to replace the green (color humans don't have)
That's clearly not what the author is referring to; suggest you learn more about modern filming processes such as filling a room with LED screens and rendering the effects to that.
Especially in the MacBook Air, Apple did not have a good history of using Intel’s fastest — even when maximally configured, Apple’s final Intel-based Air declined the top-of-the-line mobile i7 for TDP reasons, using the 9W i7-1060NG7 rather than the 15W i7-1065G7.
So getting this level of performance _even in the base model_ is a huge leap for Apple customers.
Linus himself uses the quotation marks so I appreciate it's not to be read naively, but in what sense is the generic ARM stuff more 'multiplatform' than Linux has been until now? Is it about supporting something that's a little less cohesive than a traditional hardware platform, i.e. various vendors all providing very similar runtime environments but doing whatever they feel like with regards to startup — boot loaders, device trees, etc?
> Apple isn't a monopoly, Raspberry pi exists
As said by absolutely no-one.
However, Apple isn't a monopoly. Android exists. And the EU is drafting new legislation to address the damage to this market precisely because it can't just use its existing anti-competition law: Apple isn't a monopoly. The EU believes Apple is doing harm without being a monopoly, as does the author.
Except it obviously doesn't, since Microsoft had control over 90% of the market, whereas Apple has control over around 20%.
... and that's why the EU is looking at the problem in terms of new legislation, to determine what they think overall market fairness requires, rather than targeting Apple specifically via anticompetition law, which can already be used to attack misuse of a monopoly position.
Microsoft had a monopoly, Apple doesn't. The rest of the world can innovate, whether Apple likes it or not.
> I would be more interested if they allowed >1 monitor to connect (where the monitors use DP).
The base M1 supports two monitors. The Pro and Max support four. The Ultra supports five.
No version of the M1 is limited to a single monitor.
However, this may be the faintest praise that anybody has ever posted, and I'm happy to admit that it took several minutes of searching to navigate Apple's confusing naming. Shouldn't the 'Max' be the best one by definition? And who are the better chips for, if not 'Pro's?
Nevertheless I remain very happy with my M1 Mini.
Yeah, NT4 was the one where they moved the GDI, along with print and video drivers, into kernel mode — buying both a speed boost and a step backwards in stability, especially as NT drivers weren't exactly anyone's priority at the time.
If memory serves, Windows Vista introduced the current model of putting only a tiny shim into kernel mode and doing the overwhelming majority of the driver work in user mode.
... and non-compete agreements are explicitly unenforceable in California, for the general public-policy reason that people shouldn't be able to bargain away their ability to engage in a lawful profession.
On the other hand, he'll be under a pile of NDAs and wouldn't have risen anywhere close to as far as he already has if he'd been the sort of person who obviously doesn't honour them.
Ugh, yeah. I have a Mac, and in the past had an iPad, but my Kindle was always first choice for PDFs, despite the paperback-sized display and the awkward panning and zooming forced by the e-ink refresh rate, precisely because I can just drag and drop files onto it.
Well, that and not wanting to read off LCD when it can be avoided.
I haven’t had an iPad for the better part of a decade because I never really found any other use for it either.
Which work are you having difficulty with? Lack of USB input on the phones is the only thing I can think of that would push you to an Android rather than an iOS device in terms of productivity.
Otherwise, Office and Photoshop and Exchange and Slack and everything else is no big issue. Apple even finally started offering specs for AirPlay to partners a few years ago, so you can screen cast to your Roku, Samsung TV, etc.
I seriously can't think of a strong objective argument to prefer an iOS device over an Android or vice versa these days. It's just marginal preference amongst a sea of unexciting devices.
I suspect the unwarranted linkage to preemptive multitasking may relate to the reason that Chen got involved at all: per his blog entry he was chasing up crash reports, and SoftRAM not only didn't actually compress but was also largely based on out-of-date Windows 3.1 DDK sample code, which, being for Windows 3.1, made no effort to be thread safe.
So SoftRAM would crash hard on Windows 95 as soon as a lot of processes started hitting memory issues at once.
This is exactly right; IBM threw lawyers at absolutely everyone making a PC clone for a long time. The breakthrough was finally achieving 100% BIOS compatibility without using any of IBM’s original code, and with the necessary legal evidence to prove that the people who wrote the new code had never laid eyes on IBM’s.
IBM’s solution to that was to double down on lawyers and create the MCA bus and the rest of the PS/2 that would be much easier to protect against clones. But the horse had already bolted.
It has fantastic sound hardware — for me that’s the only bright spot. Otherwise it’s a framebuffer-only machine with nothing even close to the grunt necessary to do decent animation and a vertical resolution too low for productivity, and the memory layout is so arcane that even the official documentation names one of the many overlapping registers as the quagmire state.
The 65816 ends up being a net detriment because it has the same inefficient memory access patterns as the 6502 (there’ll always be two reads for single-byte instructions, read-modify-writes always have a spurious access in the middle, etc) but in a machine where large chunks of the address space are behind a 1MHz bus.
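As a simplified illustration (my own annotation; it assumes the code runs from fast RAM while the target byte sits in bank $E0, behind the 1MHz bus for both reads and writes), here's the bus activity for a read-modify-write like `INC $2000`:

```python
# Bus activity for `INC $2000`, a 6-cycle read-modify-write. The timing
# split is my own simplification: instruction fetches from fast
# (2.8MHz) RAM, data accesses to bank $E0 on the 1MHz bus.

FAST_NS = 1000 / 2.8
SLOW_NS = 1000 / 1.0

inc_abs = [
    ("fetch opcode",          FAST_NS),
    ("fetch address low",     FAST_NS),
    ("fetch address high",    FAST_NS),
    ("read old value",        SLOW_NS),
    ("spurious middle cycle", SLOW_NS),  # the wasted access noted above
    ("write new value",       SLOW_NS),
]

total_ns = sum(ns for _, ns in inc_abs)
print(f"INC $2000: {len(inc_abs)} cycles, ~{total_ns:.0f}ns")
# Three of the six cycles land on the slow bus, and one of those
# three does no useful work at all.
```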
I’m a paid-up member of the Apple ecosystem, but I strongly doubt there’s a blame-the-user angle here, whether physical or otherwise.
I give it 99% odds that a bug in the software is to blame, whether the OS itself or one of the firmware updates that Apple bundled with the OS.
If I dare jump in: Microsoft aren't doing that bad a job, in my opinion.
Google also can't make a plan and stick to it, but in that case the users pay as whatever Google is abandoning simply ceases to be. Microsoft's discarded frameworks at least continue to function.
Apple can make a plan and stick to it, but that plan usually involves a large amount of technology churn and the assumption that developers will keep up. As a developer you at least never get stranded by a complete horse change, but as a user you can still expect unmaintained applications to expire.
UWP requires a non-standard compiler only if you want to target it from C++ (via C++/CX); UWP itself never demanded one, because it was always mainly for the C# crowd.
Microsoft not only could be better, but is: C++/WinRT is the standard C++17 way into WinRT, provided as a header-only library for any old compiler.
Alas, I have absolutely no idea how UWP maps to WinRT, how either corresponds to Win32 or .NET, or what WPF has to do with any of it. All I really know is: don't mention Silverlight.
I think Reunion is meant to clarify, even to idiots like me.
Here's what I usually see when a Mac user tries to create a shortcut to a network share:
They drag the network share icon to where they want the shortcut, holding down the option and command keys, and release.
Spoiler: if they're doing something that involves connecting to a network share, in an environment that hasn't already been dummified, they probably know how to use a computer.
Biting the hand that feeds IT © 1998–2022