* Posts by Julian 4

27 publicly visible posts • joined 12 Jun 2009

PARC Alto source code released by computer history museum

Julian 4

Re: Still need the microcode for the chips and the detailed specs.

Hi John,

You don't need the microcode to run the executables: the Alto microcode implements specialised instruction sets for running user programs, so what you actually need are emulators for those instruction sets.

The standard instruction set was almost identical to the Data General Nova minicomputer instruction set and the BCPL compiler targeted that.

There was a byte-code instruction set for the Mesa compiler; another for Smalltalk; and yet another for Lisp.

Nevertheless, Alto software will be so specific to the Alto hardware that you'd need an Alto emulator too (and there is one that kinda runs OK):

http://toastytech.com/guis/salto.html

Google chap reverse engineers Sinclair Scientific Calculator

Julian 4

Squeezetastic

The Sinclair Scientific was a true engineering triumph: squeezing the UI and operation of a scientific calculator into just 320 x 11-bit program steps, written in 3 days from a hotel in Texas (how on earth did they test it?).

But it explains why Sinclair thought floating point would be trivial for his series of computers: if you can fit all that into <0.4Kb*, then it should have been no problem to fit it into his second computer (as he promised) given its expansive 4Kb Super-ROM... but users had to wait for the 8K-ROM ZX81 before that kind of luxury became available (the MK14 was his first computer).

Ken's post is brilliant, all the more amazing for the fact that we now know more about the Sinclair Scientific than we ever did, 39 years after it came out; and we gain a better insight into why Sir Clive Sinclair was seen as the (eccentric) 1970s genius rather than the 80s techie front-man. Digital archaeology at its best!

-cheers from Julz

[* 320x11-bits is about 440 bytes]
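
(A quick Python sketch of that footnote arithmetic, using the 320-step, 11-bit figures quoted above:)

    steps, bits_per_step = 320, 11          # Sinclair Scientific program ROM
    total_bits = steps * bits_per_step      # 3520 bits
    print(total_bits / 8, "bytes")          # 440.0 bytes, i.e. under 0.5Kb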

Behold ATLAS, the fastest computer of 50 years ago

Julian 4
Megaphone

Less powerful than your Fridge Magnet!

Why can't we resist the urge to bash old technology, making it out to be worse than it ever was? I'd be really shocked if my washing machine needed 96Kb of code and half a meg of disk space to manage a few switches and read a temperature comparator - I'd fire the software and hardware engineers for incompetence :-) The only thing my washing machine has in common with an Atlas is the drum storage.

It's the same with all 60s technology. "Did you know the Apollo computers had less power than your ZX81/pocket calculator/analogue watch?" Rubbish: they had roughly the same power as a BBC Master from the mid-80s, and you could write novels with that, browse multimedia video discs, run businesses and crash the stock market too!

We're better off respecting both the old technology for what it was and the brilliance of the people who designed it. The Atlas was a landmark machine that had real power, power they didn't waste by filling it with schoolboy quality Flash™ and Javascript coding. We could all learn a thing or two from their era :-)

The Jupiter Ace: 40 years on

Julian 4
Pint

Childhood 2.0

The biggest achievement of the Jupiter Ace was that everyone wanted one once the company had gone bust - and I missed my chance too. The nearest I could get was the Nottingham Microcomputer Club's single Jupiter Ace, which was out on permanent loan with a waiting list as long as a Sinclair delivery schedule; so when I finally got the chance I spent a happy, homework-free couple of weeks ploughing through the manual, cramming my head full of its Forthy goodness and mind-blowing concepts.

A brilliantly clever machine flawed only by its tame monochrome graphics and glitchy keyboard driver (the rubber keyboard is actually fine, it's the firmware that's at fault).

Aaaaah, and it says it all that Aces are practically just as unobtainable now; if only there were a similar 8-bit Forth computer you could buy today - we could all reclaim our childhoods ;-)

Strong ARM: The Acorn Archimedes is 25

Julian 4
Thumb Up

Groundbreaking

Without question, the original Archimedes computers were the heralds of a new era in computing primarily because of the ARM processor. At the BBC@30 event in Cambridge a couple of months ago Steve Furber made a few remarks on the panel that were pretty revealing.

Firstly, yes, they had tried to liaise with Intel about using the 80286, except they couldn't understand why the pinout had a shared address and data bus, since in their view it crippled performance. They wanted Intel to license the core to them so they could repackage it. Intel declined; the rest is history.

Secondly, they visited WDC (which produced the 65816, the 16-bit enhanced 6502) and discovered it was essentially a one-person operation. They figured that if one guy could develop a CPU, so could they.

Hence the Archimedes was released, using a UK-designed RISC processor, at pretty much the same time as the FIRST commercial RISC computers in the US - except that the Archimedes was aimed at consumer machines rather than high-end workstations, a decision that led to ARM being the dominant 32-bit CPU today, eclipsing Intel's unit sales by an order of magnitude, and the standard-bearer for RISC itself.

-Cheers from julz (the designer of the 8-bit DIY Computer FIGnition).

Review: Raspberry Pi

Julian 4
Boffin

Re: Ask the Right Question

Hi Jess,

Thanks for your reply.

> > The Pi is an ARM based device, (ignoring the GPU, which is fair enough given the lack of documentation :) ) it is not ridiculously more complex than a 1989 BBC A3000

It's true that ARM assembler on a R-PI isn't that different from ARM assembler on an A3000, but the problem is the overall complexity of the systems, and it's that overall complexity which impacts both how programming is learned and how developers develop for the systems. So the 10,000 figure isn't hyperbole, but the literal growth in complexity between the early 80s and now (e.g. 256Mb / 32Kb = 8192; 200 million lines of code in Linux / 10K lines in the BBC's firmware => 20,000, etc.). Let's work from the bottom to the top.
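
(As a back-of-the-envelope check, here's a minimal Python sketch of those two ratios, using the same rough figures quoted above:)

    # Rough growth between an early-80s micro and a modern system, using the
    # ballpark figures above: 32Kb of BBC Micro RAM vs 256Mb on the R-PI, and
    # ~10K lines of BBC firmware vs ~200 million lines in a Linux distribution.
    ram_ratio = (256 * 1024) // 32        # RAM, in Kb terms: 8192x
    loc_ratio = 200_000_000 // 10_000     # OS source lines: 20000x
    print(ram_ratio, loc_ratio)           # 8192 20000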

At the bottom we have the CPU. Modern CPUs are orders of magnitude more complex than in the 80s, and the R-PI's CPU is really a microcontroller in itself. So, just considering the effect of memory management and cache: in a modern system kids can't simply poke around the computer; you have to use APIs just to get access to memory (if it's allowed at all), and reading and writing to it involves more complex protocols (e.g. cache coherency and write-back policies).

Booting a modern CPU is more complex than an early-80s one: it doesn't just start running from address 0; you have to consider the boot-up reason, switch into the appropriate CPU mode, and explicitly enable RAM (enable chip selects and memory areas, perhaps take it out of self-refresh, allocate memory regions for the MMU). Thus rewriting, say, the ROM to support a fundamentally different OS to Linux is orders of magnitude more complex than for an early-80s computer.

Moving to the on-chip peripherals: the level of integration means that the R-PI's chipset is orders of magnitude harder to deal with than, say, an Atmel AVR, which in turn is several times more complex than the peripherals on early-80s computers. The datasheets run to thousands of pages just for the main CPU itself (including peripherals) - so much more complex that the designers of the R-PI can only know a fraction of the contents. I used to be a Symbian OS baseporter; I've seen these things, and no-one knows these chips to the degree that hundreds of thousands of people understand the innards of a BBC or ZX Spectrum. By comparison, I've read the docs for the chipset of the ARM250 (=> A3020), which, although more complex than an early-80s micro, is pretty much fully graspable with some effort IMO.

All this has a knock-on effect further up the chain. Supporting a chipset with a billion transistors requires complex APIs and frameworks to be defined, which in turn affects the complexity and design of the operating system itself. And of course, for a large system we want to run everything it theoretically can run, so the complexity is compounded by multiple frameworks, libraries, tools, environments, drivers and front-ends. At this stage, because the developers who contribute to these systems only understand a tiny fraction of them (and can't afford the time to learn the rest), there's continual re-invention and duplication. Then the people who bundle the OS distribution similarly can't spend the time to work out what's truly needed, so they have to play safe and include the lot.

Finally, on top of all this we want kids to learn to program again. However, the development environments we've constructed to manage our 10,000x more complex systems are themselves formidable beasts compared with the 10 PRINT "I AM FAB!" environments on early-80s computers. Ironically, those environments were better than, say, Python's interactive mode (so much for 30 years of progress). So we're drawn to recreating noddy programming environments that don't *really* teach programming as well as, say, a 1K ZX81 can (that is, I strongly suspect one can learn more about programming from a 1K ZX81 than by downloading and exploring, for example, Scratch). It's like trying to teach kids to read and write using War And Peace and then deciding we'll just start with the easy words in it: and, of, the, it etc.

It's this baggage at all levels that makes it hard for an R-PI to provide an educational programming environment as effective as an original BBC Micro or ZX Spectrum - although of course you can run an emulator for those machines on it (which in a sense validates the earlier machine). Yet even that wouldn't be terribly good: kids know the emulated machine is a fake, a form of magic, and the deception would impact their learning.

All this is to say that the complexity has an impact on every aspect of development. Which brings me back to the original point: if it's true that kids don't program because today's computers are 10,000 times more complex than their forebears, then we really need to give them simple computers, not small, cheap, complex ones. The difficult thing is accepting it, because powerful systems are so desirable we kid ourselves into thinking they're better in every way. To finish with, a quote from Richard Altwasser (BBC News 17776666):

"What is important is not the technical speed of the device but the speed with which a user can get their computer out of a box and type in their first program." At this level, a BBC micro, ZX Spectrum and FIGnition succeed where R-PI would struggle.

-cheers from julz

Julian 4
Boffin

Ask the Right Question

Unless we ask the right question - why don't kids learn to program these days? - we won't be able to draw the right conclusions.

Is it because computers aren't cheap enough to risk programming? (Though in fact netbooks are effectively 3x cheaper than ZX Spectrums were in our day, and modern computers provide far more protected programming environments than in the 80s.)

Or is it because they're 10,000 times more complex today? Based on my experience of programming machines from the 80s to today, I'd have to conclude the latter. And once you do, you have to conclude that the Raspberry Pi (nice though it is) can't be any better for learning to program than anything we currently have.

In which case, what we really need is computers simple enough to understand; machines like FIGnition (www.fignition.co.uk). It's not about going back to the past, any more than teaching children to read using phonics is regressive compared with learning to read from War And Peace or Harry Potter.

-cheers from julz

Pints under attack as Lord Howe demands metric-only UK

Julian 4
IT Angle

I figured out the problem

Instead of going for an Empirical system of units (which would be a very British thing to do, and would give us Metric) we've confused it with an Imperial system of units, in a 150-year phoneme mishap. Either that or we just can't stand the idea of adopting something we're centuries too late to re-market as our own ;-)

-cheers from julz

Oceans gaining ACID faster than last 300 MILLION YEARS

Julian 4
Childcatcher

Re: So where is CO2 going

Prior to the industrial revolution CO2 was sourced and sunk by land and sea and it was all in equilibrium.

Since we've been injecting extra CO2 into the atmosphere, the natural environment has only been able to sink some of it: land sinks a portion, and the oceans sink much more, as they have a far greater capacity for absorbing excess CO2 (and for absorbing heat rather than reflecting it).

Some, however, remains in the atmosphere, and since the late 1700s this has increased from roughly 270ppm to 393ppm, gaining roughly 1.5 to 2ppm per year. It's primarily due to (exponential) economic growth, which, despite improvements in efficiency, still means more CO2 is produced by us overall, because our economy is powered by fossil fuels.

The oceans can sink an incredible amount, but it takes a while for it to dissipate through the currents; thus ocean acidity and ocean temperature are projected to rise (and are starting to) due to increased CO2.

-cheers from julz (designer of the FIGnition DIY 8-bit Computer).

Conflict mineral laws haven't helped Congolese

Julian 4

Go TantalumDirect™

Conflict-free minerals are an ethical issue at least as important as coffee and tea ethics, but it's not enough to make sure they're conflict-free; we also need to make sure the miners earn a decent wage.

Fortunately, there's a model that already works: Fairtrade. It works for basic beverages; is changing the landscape of chocolate production; exists for clothing manufacturing and even minerals such as gold.

The US legislators should therefore be asking www.fairtrade.org, Cafédirect or even www.cred.co.uk (certified Fairtrade jewellers) for their advice: they've been doing this for decades and have probably got the hang of it better than anyone.

-cheers.

Undead Commodore 64 comes back for Christmas

Julian 4
WTF?

Travesty!

Is this the dimmest idea ever? I was expecting a one-chip C64 that did what a C64 does, not something that isn't even remotely a C64. What next - "QUAD CORE XEON Apple I recreated"? A faithful, jet-powered replica of the Wright brothers' aeroplane?

This thing is completely nuts!

Symbian Titanic heading for iceberg

Julian 4
Linux

Symbian OS: A Means to an end

It's fairly accurate to surmise that Nokia always saw Symbian OS as a means to an end, since they've never really promoted the OS as such - you always have to dig into the specs to find out whether it's on any particular phone.

That's really what crippled Symbian OS: Nokia took the amazingly capable and power-efficient Psion EPOC32 O/S and neutered it to make it like a Series 40 phone, kludging on a UI from which the O/S never really recovered - all to build the Nokia brand, not the Symbian brand.

Which means, in turn, that the genius of Symbian OS - object-oriented from the bootstrap up - is just gonna be trashed to make way for traditional kernels, along with all the other clever OO O/Ss from the past few decades. Nokia are sailing their Titanic, but the officers jumped ship before it was launched.

Korg NanoKey MIDI controller

Julian 4
Coffee/keyboard

Da-da-da

Good grief - it's a circuit-bent VL-Tone for the 21st century!

(Escape - 'cos for this baby, every extra key is a bonus )

Intel wades into smartphone wars

Julian 4

Overheads

John 62 "The x86 overhead is probably pretty small nowadays compared to the rest of the processor."

The x86 overhead never got smaller; what happened is that the processors got bigger, so that when you're dealing with large cores with tens of millions of transistors the x86 overhead diminishes proportionally.

That's not true for MID devices, since the Cortex-A9s etc. use relatively small multi-core CPUs, so the competing x86 overhead will remain a constant factor. Intel can't offer anything that ARM can't do better, and they certainly can't offer anything that scales like ARM does.

The question is: why would you want x86, when nothing needs it any more?

Microsoft's Linux patent bingo hits Google's Android

Julian 4
Pirate

Mobstersoft

It's what one normally calls a protection racket.

But weren't MS always in the mobster business - growing their empire by intimidation? It's not just at an individual level ("I'd better buy Windows, 'cos EVERYONE buys Windows..."), but at an institutional level. There are numerous accounts of US school districts suddenly deciding they needed to dump all non-MS computers for no apparent reason, despite the protests of school governors, teachers and parents.

Didn't Al Capone once say "People hate us, 'cos we're successful." ?

It's nothing but Mafia computing... come to think of it, Steve Ballmer does rather look and act like a gangland thug ;-)

Guy Kewney, pioneer, guru, friend - RIP

Julian 4

El Grando

Gosh Guy Kewney's died!

I certainly found his regular Newsprint section in PCW absolutely essential reading from the first issue I bought: December 1980 ("Yes, but is it art?"). Memorable articles include "El Grando" (announcing the availability of the 68000 CPU, with an astonishing 68,000 transistors!) and "A Mess Dos" (lamenting the unforgivingly crude features and limitations of MS DOS in 1984). And of course his full-page spread where he insisted he didn't actually design the astoundingly good-value £599 Amstrad PC1512 in 1986.

Guy Kewney was consistently witty, cynical, prophetic and irreverent - everything I needed to keep abreast of the state of play in the industry. An exceptional journalist; I'll miss him.

-cheers from Julz @P

Intel finds cure for CPU old age

Julian 4
Pint

Don't you mean Intel & Friends?

Or even better - Intel's Friends.

Isn't this technique actually University of Michigan research in conjunction with ARM Holdings, and isn't the 'research CPU' in fact an ARM ISA (which I guess is why the Reg article doesn't mention the architecture in more detail)?

Since Intel no longer muddies its paws with power-efficient processors, I would be surprised if it would build a research processor based on a competitor's architecture it isn't licensed for. So I figure the article needs a retitle - "Intel takes credit for rival's geriatric CPU cure" (All Your Research R Belong To Us) ;-)

-cheers.

Bishop Hill: Gonzo science and the Hockey Stick

Julian 4
Black Helicopters

In 5 to 10 Years

In 5 to 10 years, the Arctic ice cap will be gone at the end of summer. Of course the CCDs will manage to explain this once-in-human-history event as being natural - plenty of planets don't have ice caps, after all.

Now, let's take this bit of hokum: "the IPCC is headed not by a scientist but a railway engineer"

Yes. That's because the Bush administration replaced the original IPCC head with an oil-industry appointee in the first year of his first term. Please don't use climate-wreckers' despicable tactics as evidence of fraud by climate scientists.

H.264 video codec stays royalty-free for HTML5 testers

Julian 4
Alert

It's the licence - innit?

The problem with MPEG-LA's stance is that licensed (not open-source) codecs are still required. Since Mozilla will support only open-source codecs, it can't support H.264 via HTML5 video tags.

Microsoft urges Flash makers to pay fat dollar for exFAT format

Julian 4
Grenade

The Stamp Act

When MS started making moves on patenting FAT, it was under the assumption they'd never charge for it. Now they are, and given MS's past record, who is surprised?

Let's be clear about the seriousness of this. Having to license FAT (or exFAT - whatever) is like a tax on the right to own paper. It's a direct violation of the right to free expression. The American colonies went to war nearly 250 years ago over a similar British tax. No government, and certainly no corporation - for which the general public have *no* *representation* - has the right to impose it, directly or indirectly.

It is *not* worth paying a company for my right to take photos or copy data. I don't owe Microsoft for this, and neither do you. All that's happened is that they've hijacked a form of media and now they want us to pay for their offence.

Over this, one has to declare independence. Here's a grenade to start it off with.

Should you lose your religion on your CV?

Julian 4

Open Diversity

Surely, there's a good analogy here with living in a diverse community. Living in a large city (e.g. Manchester where I live) means you're continually exposed to a wide range of cultures and religions; and basically it's fine.

CVs include both qualifications and experience plus something about you as a person. Requiring people to omit their beliefs would be the equivalent of living in a city where people have to fake being white agnostics as a precaution against people taking offence.

That would never work - the best way of dealing with bigotry is open diversity. For example, IMO it's reasonable to put your faith on a CV if you want, though it's probably best to put it at the end, since it's not a direct qualification for your job. To a religious person, requiring an employee to omit it could easily look like the company is repressively anti-religious from the start.

In addition if you find that certain people put their faith at the top of their CV in big, bold 24-point text, then this is probably indicative of their character in some sense. Maybe you want people who are that forthright in your company! Maybe it implies they'd be less diplomatic in all sorts of social situations. Either way, handy.

I'd always advocate open diversity since, as you say, even people's names or phraseology can imply to some people something about their ethnic background or beliefs, and the only way to get over that ultimately is to bite the bullet and be open about it. Then we discover it's OK; then we can move on.

Parking spot flies to International Space Station

Julian 4
Pint

Pounds in space

"1,750 pounds of equipment ... 8,000 pounds.. 13 feet long and eight feet" - What? Has Nasa finally convinced Russia to think in retro-units? I know the US has yet to step into the 21st century measurement-wise, but is Nasa really sucking the rest of the space industry into its 1960s vision of the future?

Or maybe Torchwood is behind it all ;-) ?

For the other 95% of the planet: 8,000 pounds is about 3,629Kg; 1,750 pounds is about 794Kg; 13 feet is roughly 3.96m.
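
(A minimal Python sketch of those conversions, assuming the standard factors 1lb = 0.4536Kg and 1ft = 0.3048m:)

    LB_TO_KG, FT_TO_M = 0.45359237, 0.3048                   # standard conversion factors
    for pounds in (8000, 1750):
        print(f"{pounds} lb = {pounds * LB_TO_KG:.0f} kg")   # 3629 kg, 794 kg
    print(f"13 ft = {13 * FT_TO_M:.2f} m")                   # 3.96 m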

Beer - Because everyone knows what a pint is!

Prehistoric titanic-snake jungles laughed at global warming

Julian 4

The curse of timescales

If it wasn't for the curse of timescales it'd all be a no-brainer.

On geological timescales, the boa in question had ages to adapt to a hotter climate. So did the forests.

On geological timescales today's species don't. So they'll die.

On human timescales AGW will take just the wrong length of time - too slow for individuals to easily accept it; too fast for us to easily adapt. Like a tsunami on the horizon: by the time you realise it's more than just an ordinary wave, you're already too late to avoid being engulfed.

For myself, I'd sooner do what's right rather than what serves my cynical self-interest.

-cheers from Julz @P

Nokia launches laptop

Julian 4
FAIL

Platform Psychosis#2

Nokia already had a near-netbook with their Maemo-based N770/N800; it wouldn't have taken too much effort to port it to a full Atom-based netbook, or even to push the envelope a little further by launching a Cortex-A9 based netbook and getting ahead of the crowd. They could even have made it a dual-boot Symbian/Linux offering.

Instead, they've shot themselves in the feet, head and chest by following Palm's previously near-suicidal "platform-psychotic" Windows strategy. By launching a netbook based on a competitor's software running on a competitor's hardware, they've sent a big message to their own customers to the effect that they don't believe in their own technology.

And if they don't, why should we?

Microsoft stores to get in Apple's face this autumn

Julian 4
Coffee/keyboard

Turn Bing into a verb - or adjective...

Or better still, a comical expletive. As in...

"Couldn't find it eh? - what a bingger!"

"Grief mate, you really bingged out on that one didn't you?"

"What a mess - it's an attack of the Bings!"

"Bing today, gone tomorrow."

"It's the Bing from hell!" (unsolicited search / useless tech advice / intrusion)

Bing is just so divvy... I really don't know how anyone can take it seriously, ESCape now ;-)

Catholic social club ousts coven of witches

Julian 4
Black Helicopters

Vitriol Download

Wow, given all the vitriol here, you'd think Catholics were busy banning almost everyone from everything, everywhere at all times!

Let's get a grip: I'm not R.C., but I went to a ceilidh at the place in question a few years ago. It's not an amazing venue, but it is in STOCKPORT - an almost-city with its own postcode region, not a village parish.

And in this case, it really does look like a bit of engineered Catholic-bashing - open season for every gripe about organised religion going... oh, not quite: I note the FSM and IPC haven't made an appearance yet ;-)

-cheers from Julz @P

BTW: "I don't think the followers of Jesus DID follow the religion of their fathers which means that under old testament law, they should have been stoned to death" I understand, by and large, Christianity was considered a Jewish sect for several decades; so with few exceptions (e.g. Stephen), stoning wasn't on the agenda.

RIP Personal Computer World

Julian 4
Paris Hilton

A Real Shame!

I was a late-adopter ;-) I bought a 'priceless' ZX80 review edition from April 1980 and had every edition of PCW from December 1980 to late 1992 when PCW became PC World magazine :-(

Guy Kewney was definitely a must-read even for that techie teenager, covering articles such as "A Mess-DOS" (bemoaning the awful command line and memory limitations of MSDOS in 1984) and "El-Grande" (the start of the 68000's availability in 1980), right through to bizarre pieces about dodgy MDs of tech companies doing a runner.

PCW's obsession with Sir Clive Sinclair - putting a monkey on the front cover for every computer his company designed (apart from the ZX80) - always made me grin: the artist monkey reviewing the Spectrum on the first glued-spine PCW mag (they used to be stapled); the bowler-hat monkey for the QL; and the monkey on the bike for the Z88.

PCW were brilliant not just at news, but at teaching computing. They promoted hobby groups under the banner of CTUK (Computer-Town UK) and ran tutorials on everything from assembler to BASIC conversion to 'C', Pascal, Logo and early object orientation. Not only that, but user contributions gave the readership a shot at a little fame (gosh, even I got something in the mag!).

Sad to see it go... I wonder when the back-issues will be worth something ;-)

-Cheers from julz @P