* Posts by ThomH

2846 posts • joined 18 Jun 2009

Google tells Apple to 'fix text messaging' in bid to promote RCS protocol

ThomH

Re: "Email only recently (2008-ish?) came to mobile telephony"

Even before that there were WAP front-ends to email which somebody, somewhere, probably used once, at a rate of 38p/megabyte.

ThomH

Re: they do not support modern texting features ...

You mean for all those videos that are less than about 10MB in size?

ThomH

Re: they do not support modern texting features ...

I sometimes send video of my little chap to other family members, an ocean away; not having that reduced to MMS quality is a win, without having to find a file or video sharing intermediary.

Parallels increases prices with Desktop version 18

ThomH

VMware Fusion Player is also free for personal use, though you have to register first. It's my preferred choice.

Google's ChromeOS Flex turned my old MacBook into new frustrations

ThomH

I had a first-generation MacBook Pro with its Core Duo, and used it up until probably about 2010; it is unfortunate that Apple needed to jump ship during Intel's temporary regression to a 32-bit ISA but at least no Apple laptop ever saw a single-core x86. The original low-end Mac Mini was a Core Solo.

The PowerPC was so far behind at that point that I doubt Apple could have afforded to wait much longer, but at least SSE3 was present right out of the gate, providing a smooth transition for things like the Accelerate framework.

ThomH

Then you clearly didn't buy the original iPad, which is the semi-recent Apple device with the shortest useful lifetime.

Released in April 2010, it lost OS support with iOS 6 in September 2012, and running 2011's iOS 5 on it leads to an incredibly slow device.

The iPad 2 of 2011 improves on the original with twice the RAM and twice the bandwidth, two processor cores instead of one and a greater-than-33% improvement per core, and what Apple claimed was eight times the GPU power, though I still can't find a benchmark on that.

So I guess Apple just figured they'd brush the original under the carpet?

(Though I'm only now about to retire my 2015 iPhone 6s and Retina MacBook; the first-generation iPad is the aberration)

Apple-1 prototype hand-soldered by Woz up for auction, bids expected to reach $500k

ThomH

Re: Open source?

> In one final gasp, the Apple II supporters at Apple designed the Apple IIGS Plus, code named "Mark Twain". It had an 8Mhz 65C816, a built in SuperDrive, 2MB on the motherboard, and a hard drive. Prototypes leaked out and a user group that has one and wrote a series of articles about it. Apple management vetoed this unit.

The Mark Twain was designed in 1991, a full five years after the release and therefore likely six or more after the design of the original IIgs.

Nevertheless, per the only owner of one:

> Despite rumors that the Mark Twain is a speed demon, a standard 65C816-4 CPU running at 2.8 Mhz is found in the same physical location as it is on the other IIGS models.

ThomH

Re: Open source?

> Selective for good reason - those things you mention weren't common AT ALL back in those days on competitor systems, regardless of how good or bad the other capabilities of said systems were. So criticising the IIgs for daring to suggest that it had graphics capabilities, just because it didn't include a bunch of barely seen elsewhere hardware to help mitigate its relatively slow memory bandwidth/CPU, felt like a bit of an unfair dig at the system.

On the contrary: the issue is that the unusually severe bottleneck on writing to video memory isn't otherwise alleviated. Trying to talk about half of that equation while ignoring the other is absurd.

ThomH

Re: Open source?

Yeah, you've misunderstood the source.

The mechanism described uses banks $00 and $01 as a write-through cache for banks $e0 and $e1. Writes still occur at 1MHz.

The statement "there is a solution which allows the screen to be written to at near full speed." is false.
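To sketch the effect in code (my own illustration of a write-through arrangement, not anything lifted from Apple's documentation):

```cpp
// Illustrative only: shadowing treated as a write-through cache.
// Region size and names are approximations for the sake of the sketch.
#include <cstdint>

struct ShadowedRegion {
    uint8_t fast[0x2000];   // the fast-RAM copy (banks $00/$01 side)
    uint8_t slow[0x2000];   // the 1MHz copy (banks $e0/$e1 side)

    uint8_t read(uint16_t addr) const {
        return fast[addr];  // reads are satisfied from fast RAM at full speed
    }

    void write(uint16_t addr, uint8_t value) {
        fast[addr] = value; // the fast copy is updated...
        slow[addr] = value; // ...but the write must also reach the slow copy,
                            // so the CPU is still throttled to the 1MHz bus
    }
};
```

Reads get faster; writes don't, which is the whole point.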

Not a surprise you've taken the positions you have given your lack of familiarity with the machine.

ThomH

Re: Open source?

That's a selective quote; the point is that the CPU has to do the work, but can't do that efficiently because of the 1MHz bus.

If the question were which other machines don't offer hardware and also offer such speed-limited memory access then the answer would be 'none'.

ThomH

Re: Open source?

> Writing odd values to odd addresses to page memory is par-for-the-course on 8-bit machines, the Commodore 64's memory map was probably just as crazy.

As per above, this isn't about writing odd values to odd addresses. It's about triggering a bunch of soft switches — which are flipped one way or the other by access to a single address — with each individual chunk of memory having two or three different reasons it might be pointing at one thing rather than another.

If you had programmed any other 8-bit machine you'd know this is not the norm. Every other machine uses: (i) regularly-sized memory areas; (ii) with a single point of control.

If you're a fan of everything being possible in two or three ways, each method being a particular combination of two or three other discrete inputs, then, yeah, there's a lot to like in the world of Woz.

> The main thing that killed the IIgs was Apple deciding to artificially limit the speed of it to 2.8Mhz so it wouldn't compete with the Mac or any other 16-bit computer of the time.

The speed-limiting factor in an Apple IIgs is the available speed of RAM.

The 65816, like the 6502 before it, requires RAM accesses be completed in a single cycle.

The 68000 provides at least four cycles for any RAM access to be complete.
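Back-of-the-envelope arithmetic (mine, using nominal clocks, so treat it as illustrative only):

```cpp
// Rough numbers only: how long RAM has to respond under each bus model.
#include <cstdio>

int main() {
    const double iigs_clock_hz = 2.8e6;                  // 65816 in the IIgs
    const double iigs_window_ns = 1e9 / iigs_clock_hz;   // one cycle: ~357ns

    const double m68k_clock_hz = 8.0e6;                  // a contemporary 68000
    const double m68k_window_ns = 4.0e9 / m68k_clock_hz; // four cycles: 500ns

    std::printf("65816 at 2.8MHz: %.0fns per access\n", iigs_window_ns);
    std::printf("68000 at 8MHz: %.0fns per bus cycle\n", m68k_window_ns);
    return 0;
}
```

In other words, despite the lower clock, the 65816 demands faster RAM than an 8MHz 68000 does.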

There is a conspiracy theory that the clock was deliberately limited, one which often nonsensically fingers Steve Jobs himself from beyond the Apple grave, but basic facts seem to disagree.

See also the Apple IIc+, which Apple launched two years after the IIgs and which ran at a nominal 4MHz. It achieved that by using a cache between processor and bus. And Apple achieved that by licensing the cache from a third party.

Even if magical superfast consumer RAM had existed, it seems improbable to me that Apple could have produced a faster Apple II in 1986 but had forgotten how by 1988, to the point of having to pay somebody else to help out.

> Woz I doubt was responsible for making the IIgs artificially slow

Right, because nobody was.

ThomH

Re: Open source?

Woz is massively overrated; have you ever tried programming for a system he designed?

The rule for whether base or auxiliary RAM is visible in the region $2000–$3FFF on an Apple IIe is that it'll be the latter if: (i) the 80STORE, PAGE2 and HIRES soft switches are all set; or (ii) the 80STORE switch is reset but the RAMRD switch is set. Otherwise it'll contain base RAM.
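Or, restating that rule as code (my paraphrase, covering reads only, using the soft-switch names above):

```cpp
// A direct transcription of the rule stated above; illustrative only.
bool aux_ram_visible_2000_3fff(bool store80, bool page2, bool hires, bool ramrd) {
    return (store80 && page2 && hires)   // (i) 80STORE, PAGE2 and HIRES all set
        || (!store80 && ramrd);          // (ii) 80STORE reset but RAMRD set
    // In every other combination, base RAM is visible.
}
```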

There are similar rules pertaining to the other nine irregularly-sized memory regions that fill its 64KB of address space.

Think that's just a corner the team was backed into by a desire for compatibility with the existing ROMs? Then you're wrong as per the IIgs, which amongst other things introduces shadowing (using a fast part of RAM as a write-through cache for a slow part, essentially) and invents another set of arcane rules about what situations lead to shadowing of what.

Probably its best feature is having 'graphics' in the name while offering nothing beyond a plain single-buffered frame buffer (no hardware scrolling, no blitting, nothing), and putting that behind a 1MHz 8-bit bus.

With the IIgs Woz achieved what Apple hadn't otherwise managed: to kill the Apple II.

Apple to pay $50m settlement for rotten butterfly keyboards

ThomH

Re: Staingate et al

Most Apple fans will receive completely reliable MacBooks, as the statistical failure rate is a tiny sliver of the whole. But that doesn't mean that Apple isn't culpable for some of the flaws, as here, where the issue could clearly have been avoided.

Me personally? In twenty-ish years I've had a MacBook Pro with a GPU fault that happened late enough not to really matter, and a MacBook whose logic board has failed twice, the first time after only about three years, well within what should be a machine's normal lifetime.

That's a pretty high subjective failure rate, but I'm damned if I can find any objective statistics so there's no reason to believe me any more than any of those other commenters you refer to.

CP/M's open-source status clarified after 21 years

ThomH

Re: The title is no longer required.

Probably skip the +3; the CPC can display real 8px/character 80-column text whereas for CP/M the Spectrum uses some weird 5- or 6-pixel-wide font, displaying only a portion of the width of the virtual 80-column display and jumping between the left and right portions of the screen ‘intelligently’.

Elon Musk considering 'drastic action' as Twitter takeover in 'jeopardy'

ThomH

Re: Burn

He's estimated to be worth more than $200bn so he can find the $1bn breakup fee if he has to, but I also wonder whether he's antagonising exactly the people who constitute the majority of his customers.

ThomH

Re: Where is the ROI?

I think it's the opposite; if he bought Twitter and chose very selectively not to apply its various moderation policies, he could continue his ingratiation with the Republican Party and thereby gain such favours as he desires next time the pendulum of government swings back to them.

See also: the drama of the Texas AG vs Twitter, with Texas 'inexplicably' using government clout to further Musk's objectives.

Original Acorn Arthur project lead explains RISC OS genesis

ThomH

Re: Doesn't Windows 1, from 1985, have a taskbar?

> But they aren't in a special area. They are just on the desktop, and in those versions, they were the only things allowed on the desktop.

I think that's a slender distinction: they're not in a special area, but they are in an area that only they are allowed to be in.

That said:

> It's not a bar.

There it is: I lose. You can't invent the taskbar with something that isn't a bar. End of story.

ThomH

Doesn't Windows 1, from 1985, have a taskbar?

Like this and as mentioned here?

With all that RISC OS did so far ahead of the rest of the world, it seems like an odd thing to flag up.

Consultant plays Metaverse MythBuster. Here's why they're wrong

ThomH

Re: Holodeck

> They actually shoot the actors on green screen and use a computer to replace the green (color humans don't have)

That's clearly not what the author is referring to; suggest you learn more about modern filming processes such as filling a room with LED screens and rendering the effects to that.

Apple’s M2 chip isn’t a slam dunk, but it does point to the future

ThomH

In Apple-specific terms, still a huge win

Especially in the MacBook Air, Apple did not have a good history of using Intel’s fastest — even when maximally configured, Apple’s final Intel-based Air declined the top-of-the-line mobile i7 for TDP reasons, using the 9W 1060 rather than the 15W 1065.

So getting this level of performance _even in the base model_ is a huge performance leap for Apple customers.

Multiplatform Linux kernel 'pretty much done' says Linus Torvalds

ThomH

Can anyone provide more context on "multiplatform"?

Linus himself uses the quotation marks so I appreciate it's not to be read naively, but in what sense is the generic ARM stuff more 'multiplatform' than Linux has been until now? Is it about supporting something that's a little less cohesive than a traditional hardware platform, i.e. various vendors all providing very similar runtime environments but doing whatever they feel like with regards to startup — boot loaders, device trees, etc?

Safari is crippling the mobile market, and we never even noticed

ThomH

Re: Commentard Bingo

> Apple isn't a monopoly, Raspberry pi exists

As said by absolutely no-one.

However, Apple isn't a monopoly. Android exists. And the EU is drafting new legislation to address the damage to this market specifically because it can't just use its existing anti-competition law, because Apple isn't a monopoly. The EU believes it is doing harm without being a monopoly, as does the author.

ThomH

"it gives Apple the same veto on innovation as Microsoft had, which is where"

Except it obviously doesn't, since Microsoft had control over 90% of the market, whereas Apple has control over around 20%.

... and that's why the EU is looking at the problem in terms of new legislation, to determine what they think overall market fairness requires, rather than targeting Apple specifically via anticompetition law, which can already be used to attack misuse of a monopoly position.

Microsoft had a monopoly, Apple doesn't. The rest of the world can innovate, whether Apple likes it or not.

Apple CEO: Silicon shortages and C-19 lockdowns to hurt sales by up to $8 billion

ThomH

Because silicon from China isn't banned?

Microsoft exposes glue-free guts of the Surface Laptop Studio

ThomH

Re: Thing is the new …errr

The MacBook Pro recently got a bit thicker, and I'll admit that it was sort of weird switching from an older model to the current. But worth it once I started typing.

ZX Spectrum, the 8-bit home computer that turned Europe onto PCs, is 40

ThomH

Re: Where it all began...for some

There was definitely a Hisoft product, and if you had a +3 then there are a bunch of CP/M options — albeit that the options for navigating an 80-column display aren't fantastic.

We take Asahi Linux alpha for a spin on an M1 Mac Mini

ThomH

Re: Deleting macOS

Maybe you're a Linux user who wants a high-performance but svelte laptop that can go 14 hours on a charge? But you're not interested in adapting your workflow to a new OS?

Just two die for: Apple reveals M1 Ultra chip in Mac Studio

ThomH

Re: Threadripper? Deadripper more like.

> I would be more interested if they allowed >1 monitor to connect (where the monitors use DP).

The base M1 supports two monitors. The Pro and Max support four. The Ultra supports five.

No version of the M1 is limited to a single monitor.

However, this may be the faintest praise that anybody has ever posted, and I'm happy to admit that it took several minutes of searching to navigate Apple's confusing naming. Shouldn't the 'Max' be the best one by definition? And who are the better chips for, if not 'Pro's?

Nevertheless I remain very happy with my M1 Mini.

Apple seeks patent for 'innovation' resembling the ZX Spectrum, C64 and rPi 400

ThomH

Re: Cambridge Z88?

Pedantically: it was a CR1620, not a CR20xx. Which means it was only 16mm in diameter (and 2mm tall), and therefore even more compact than a CR20xx.

ThomH

Re: I'll see your Atari ST and raise you a Commodore PET

... it's also half a year younger than the Apple II.

20 years of .NET: Reflecting on Microsoft's not-Java

ThomH

Re: Notably missing in action...

Agreed entirely; I’ve bothered to find out what the current solution is but I haven’t yet chanced my arm on using it.

ThomH

Re: Notably missing in action...

Presumably C++/WinRT — a header-only projection of WinRT onto entirely-standard C++17.

Saved by the Bill: What if... Microsoft had killed Windows 95?

ThomH

Yeah, NT4 was the one where they moved the GDI, along with print and video drivers, into kernel mode — buying both a speed boost and a step backwards in stability, especially as NT drivers weren't exactly anyone's priority at the time.

If memory serves then Windows Vista introduced the current model, of putting only a tiny shim into kernel mode and doing the overwhelming majority of driver work in user mode.

Apple custom chip guru jumps ship to rejoin Intel

ThomH

Re: how long?

68000: 1984–1994, 10 years;

PowerPC: 1994–2006, 12 years;

Intel: 2006–2020, 14 years.

So by the power of numerology, I guess: 2036.

ThomH

Re: Nice One. ..... but it is not Cricket, Old Bean, is it?

... and non-compete agreements are explicitly unenforceable in California, for the general public-policy reason that people shouldn't be able to bargain away their ability to engage in a lawful profession.

On the other hand, he'll be under a pile of NDAs and wouldn't have risen anywhere close to as far as he already has if he'd been the sort of person who obviously doesn't honour them.

Can you get excited about the iPhone 13? We've tried

ThomH

Re: Thanks!

Ugh, yeah. I have a Mac and in the past had an iPad, but my Kindle was always first choice for PDFs, despite the paperback-sized display and the awkward panning and zooming imposed by the e-ink refresh rate, precisely because I can just drag and drop to it.

Well, that and not wanting to read off LCD when it can be avoided.

I haven’t had an iPad for the better part of a decade because I never really found any other use for it either.

ThomH

Re: Thanks!

Which work are you having difficulty with? Lack of USB input on the phones is the only thing I can think of that would push you to an Android rather than an iOS device in terms of productivity.

Otherwise, Office and Photoshop and Exchange and Slack and everything else is no big issue. Apple even finally started offering specs for AirPlay to partners a few years ago, so you can screen cast to your Roku, Samsung TV, etc.

I seriously can't think of a strong objective argument to prefer an iOS device over an Android or vice versa these days. It's just marginal preference amongst a sea of unexciting devices.

ThomH

Re: It's an iPhone

Lockdown is an iOS app that provides a local virtual VPN in order to block adverts, etc, in all apps. It even does that bit for free. A real VPN for secure browsing and changing your apparent region is the upsell, which is fairly easy to ignore.

Remember SoftRAM 95? Compression app claimed to double memory in Windows but actually did nothing at all

ThomH

I suspect the unwarranted linkage to preemptive multitasking may relate to the reason that Chen got involved at all — per his blog entry he was chasing up on crash reports, and SoftRAM not only didn't actually compress but was also largely based on out-of-date Windows 3.1 DDK sample code, which, being for Windows 3.1, made no effort to be thread safe.

So SoftRAM would crash hard on Windows 95 as soon as a lot of processes started hitting memory issues at once.

Apple is beginning to undo decades of Intel, x86 dominance in PC market

ThomH

Re: PC

This is exactly right; IBM threw lawyers at absolutely everyone making a PC clone for a long time. The breakthrough was finally achieving 100% BIOS compatibility without using any of IBM’s original code, and with the necessary legal evidence to prove that the people who wrote the new code had never laid eyes on IBM’s.

IBM’s solution to that was to double down on lawyers and create the MCA bus and the rest of the PS/2 that would be much easier to protect against clones. But the horse had already bolted.

ThomH

Re: I guess the 6502/68000 aren't part of iApples's history?

It has fantastic sound hardware — for me that’s the only bright spot. Otherwise it’s a framebuffer-only machine with nothing even close to the grunt necessary to do decent animation and a vertical resolution too low for productivity, and the memory layout is so arcane that even the official documentation names one of the many overlapping registers as the quagmire state.

The 65816 ends up being a net detriment because it has the same inefficient memory access patterns as the 6502 (there’ll always be two reads for single-byte instructions, read-modify-writes always have a spurious access in the middle, etc) but in a machine where large chunks of the address space are behind a 1MHz bus.
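For anyone who hasn't stared at a 6502-family bus, here's a rough sketch of the cycle-by-cycle traffic I mean for a read-modify-write such as INC absolute; the labels are approximate and vary by reference, but the shape is the point:

```cpp
// Illustrative cycle breakdown; labels are approximate.
static const char *const inc_absolute_cycles[] = {
    "fetch opcode",                // read
    "fetch address, low byte",     // read
    "fetch address, high byte",    // read
    "read operand",                // read
    "spurious access of operand",  // the wasted access in the middle
    "write modified operand",      // write
};
// Single-byte (implied) instructions similarly spend two cycles on the bus:
// one to fetch the opcode plus a discarded read of the following byte.
// Every access has to complete within a single clock, so any that land on
// the IIgs's 1MHz side drag the whole instruction down with them.
```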

ThomH

Re: I guess the 6502/68000 aren't part of iApples's history?

Conversely, everyone wishes they could forget the 65816.

Brit analysts formed pact to crash Autonomy's market valuation, ex-CFO tells US court

ThomH

Also:

100 - 30 + 2 = 72

… because there’s absolutely no ambiguity in the original expression.

Apps made with Google's Flutter may fritter away CPU cycles. Here's what the web giant intends to do about it

ThomH

From the little that I think I know, Flutter is built around a presumption of immutable views and complete subtree recompositions. Like functional programming, but only the bad parts, and presumably implemented by somebody with a background in video games.

Users and efficiency be damned.

Apple's macOS Monterey upgrades some people's laptops to doorstops

ThomH

Re: "$99 and 24h later"

I’m a paid-up member of the Apple ecosystem, but I strongly doubt there’s a blame-the-user angle here, whether physical or otherwise.

I give it 99% odds that a bug in the software is to blame, whether the OS itself or one of the firmware updates that Apple bundled with the OS.

Microsoft's UWP = Unwanted Windows Platform?

ThomH

If I dare jump in: Microsoft aren't doing that bad a job in my opinion.

Google also can't make a plan and stick to it, but in that case the users pay as whatever Google is abandoning simply ceases to be. Microsoft's discarded frameworks at least continue to function.

Apple can make a plan and stick to it, but that plan usually involves a large amount of technology churn and the assumption that developers will keep up. As a developer you at least never get stranded by a complete horse change, but as a user you can still expect unmaintained applications to expire.

Apple's Safari browser runs the risk of becoming the new Internet Explorer – holding the web back for everyone

ThomH

Android 6 was first released in October 2015, a month after the iPhone 6s was launched.

In 2021 the iPhone 6s not only runs the latest version of Safari, but the latest version of the entire operating system.

Are you a 1%er? Windows 11 turns up in the usage figures

ThomH

UWP requires a non-standard compiler if you want to target it from C++; but you don't need C++ at all to use UWP, because it was always mainly for the C# crowd.

Microsoft not only could be better, but is: C++/WinRT is the standard C++17 way into WinRT, provided as a header-only library for any old compiler.
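From memory (so treat the details as approximate rather than gospel), the C++/WinRT hello-world looks something like this, and it builds with any C++17 compiler given the generated headers and WindowsApp.lib:

```cpp
// Roughly the standard C++/WinRT starter example, written from memory.
#include <winrt/Windows.Foundation.h>
#include <cstdio>

using namespace winrt;
using namespace Windows::Foundation;

int main() {
    init_apartment();                        // initialise the WinRT apartment
    Uri uri{ L"https://aka.ms/cppwinrt" };   // a projected WinRT type; no ^ or % in sight
    std::printf("%ls\n", uri.AbsoluteUri().c_str());
    return 0;
}
```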

Alas, I have absolutely no idea how UWP maps to WinRT, how either corresponds to Win32 or .NET, or what WPF has to do with any of it. All I really know is: don't mention Silverlight.

I think Reunion is meant to clarify, even to idiots like me.

The old New: Windows veteran explains that menu item

ThomH

Re: Or, you know, you created a blank template of the project?

Here's what I usually see when a Mac user tries to create a shortcut to a network share:

They drag the network share icon to where they want the shortcut, holding down the command and option keys, and release.

Spoiler: if they're doing something that involves connecting to a network share, in an environment that hasn't already been dummified, they probably know how to use a computer.
