The Model A of when?
Guys, I know it's horrifying but at this stage 1981 was a lot more than 15 years ago.
The BBC Micro:Bit will start rolling out to all year-seven pupils in the UK from this morning. What the kids will receive is a matchbox-sized single-board computer with 256KB of flash and 16KB of RAM, manufactured by element14. Yes, you read that right, the same amount of RAM as the Acorn-designed BBC Micro Model A of 35 …
Sadly I know 1981 is more than 15 years ago. Seeing as I was born that year.
You think you have it bad? 1983 was when I first started programming, on a rubber-keyed Spectrum. The only computer we had at school was an Apple II, and I never even got to see it because only the maths swots were allowed into the hallowed room :-/
Didn't stop me developing what has become a 35-year-and-counting programming career though :)
What is this? A willy-waving competition?
I graduated from University in 1981, having already worked with UNIX for three years (very progressive university, Durham)!
And, although it was launched in 1981, most people who placed an order for a Beeb when they opened the process (like me, model B, issue 3 board, serial number 7000 and something, still working) waited more than 6 months to actually receive theirs.
I'm just waiting for the real gray-beards who make me look young to wade in with their PET, Apple ][ and Altair stories.
Grey Beard reporting in (white actually but that's genetics for you)
Back in about '79/'80 I hired a Commodore PET from a supplier just to show it off to my boss and what it could do. Crackin' machine, way ahead of its time as an all-in-one box.
Well, I say hired; really I borrowed it from the dealer for a small fee/deposit, probably his beer money for the weekend.
"I'm just waiting for the real gray-beards who make me look young..."
Duly obliging. My Apple II was bought in 1979 on the last day before the selective VAT for electronics kit was raised from 8% to the then general 15%.
However, my Motorola development kit was bought in 1976. Motorola 6800 and a full complement of 1KB of static RAM. Unfortunately you also needed a Teletype to drive the human interface. An enormous manual had the logic diagrams and software listings for all sorts of ways to use it to drive bits of hardware. It was a view into a new world where hardware like bar code readers used software rather than TTL chips.
My first real playing with a computer was back in 1967 when I transferred to the company's Systems Test department. I could spend as long as I liked trying to break a massive 1MB prototype 3rd-generation mainframe in any way I could devise.
Come to think of it - most of my 45 years in the IT industry was a case of being paid to "play" with computers. Now I'm retired it is just a hobby - with no PHBs to spoil the fun. The only difference over that time is that computer/electronic bits have become faster and much, much cheaper ..and the willy doesn't wave so much these days.
"BBC Micro Model A of 15 years ago". The BBC Model A was introduced in 1981. Clearly history (or simple arithmetic) is not the author's strong point.
"Micro:Bit’s two ARM Cortex MPUs are descended from the Risc chips of that old Acorn machine." Again, completely inaccurate, as the BBC Micro used the MOS 6502 processor. It was the later Archimedes that used the first ARM chips.
I don't dare read any further.
Fast page 0 access on the 6502 was a major feature, well used in the Beeb for OS vectoring and frequently used counters (like buffer counters), which made extending the OS possible to even moderately competent machine code programmers.
In many ways, the 6502 was a model for RISC processors. Simple, many instructions executing in a deterministic small number of clock cycles (OK, maybe not single cycle, but better than an 8080 or Z80), very regular instruction set (as long as you ignore the missing instructions that did not work) and with enough useful addressing modes.
Mind you, it was simple because of the limited transistor budget available, rather than a desire to create a RISCy processor.
Fast page zero access was used on the PDP-8 architecture - a few years before the 6502 appeared. Goes to show that there is nothing new in computing - just new names for the same old stuff.
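A quick sketch of the arithmetic behind that zero-page speed advantage (illustrative Python, not an emulator): a zero-page operand is one byte instead of two, so the instruction is both shorter to fetch and a cycle quicker to execute.

```python
# Illustrative sketch: why 6502 zero-page access is faster.
# An absolute load (LDA $1234) is a 3-byte instruction taking 4 cycles;
# a zero-page load (LDA $34) encodes only a 1-byte address, so it is a
# 2-byte instruction taking 3 cycles - a worthwhile saving on a hot counter.

def lda_cost(address):
    """Return (instruction_bytes, cycles) for a 6502 LDA from `address`."""
    if address <= 0xFF:          # zero page: opcode + 1-byte address
        return (2, 3)
    return (3, 4)                # absolute: opcode + 2-byte address

assert lda_cost(0x0080) == (2, 3)   # e.g. a buffer counter kept in page zero
assert lda_cost(0x1234) == (3, 4)   # the same load from ordinary memory
```

The same trade-off is why the Beeb kept its OS vectors and frequently used counters in page zero.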
p.s. Got my PET 2008 in 1979. Still have it but scared to turn it on in case the electrolytics decide to expire.
Not entirely inaccurate though:
"ARM Evaluation System
As one of the first production RISC processors, the ARM Evaluation System was part of the development programme leading to the Acorn Archimedes and its early Arthur operating system. It was not branded "BBC", but it is physically contained within the family's "cheese wedge" case. The ARM 1 processor was clocked at 8 MHz, and was fitted with 2 MB or 4 MB of RAM.
In 2006 a new ARM processor board using an ARM7TDMI processor was designed and sold, without an enclosure but able to fit within the original case."
Didn't the Model A have a 6502? Which might have a small instruction set, but it wasn't RISC, mostly due to complex addressing modes and variable-length instructions. While it had minimal registers and used RAM instead, that was mostly because getting registers on die was seriously expensive in the 70s.
6502 registers limited? Pah! We had loads to choose from! A, X and Y for a start. Then we had the status register, a stack pointer (generally best left to the processor itself, but you could have fun manipulating it) and a program counter (current execution address).
It was a dream compared to the Z80.
I have programmed many different types of machine at assembler level - even hand coding in octal/hex. However I have never felt any attraction to learning the Intel instruction set. I was spoiled by the orthogonal mainframe instruction sets - and the 6502/6800/68000 carried on that clean architecture.
With the enormous gate capacity of modern silicon it is a bit of a mystery why no one has implemented the ICL VME target instruction set. IIRC its data descriptors stopped slack coding from producing buffer overflows: variables' data typing was policed by the hardware.
The caption to the photo at the bottom doesn't seem quite right. The computer in the picture looks to me like an Acorn Atom. Was this Acorn's first 'kids' computer? Well, maybe. I was a kid and I certainly had one (and spent hours explaining to my ZX81 owning school friends that a real keyboard was actually rather important).
Probably a pre-production Atom - it should have the Acorn Atom logo to the right of the spacebar, and unless it's just the way the light is falling, there doesn't appear to be one here. It could possibly be a mockup using an Acorn System 2/System 3 keyboard - this used the same case moulding as the Atom.
What a pair of posh gits! I saved for my zx81 from my paper round and it was significantly more than a week's wages at a few quid a week even second hand (around about £50 IIRC). Can only guess that someone bought one and got bored quickly because I bought mine before the Spectrum came out.
Would have loved the more advanced BBC but waaay out of my early teenage price bracket
I'm glad somebody mentioned this. You can recognise a BBC Micro very easily because of the "ashtray" hole (or overlaid gap or, if you fitted one, a ZIF socket) on the left of the keyboard which the Atom never had.
As for being the first "kids" computer, as I recall, the BBC Micro was part of a literacy project aimed not just at children but at a general audience. It only really became a large part of the children's market when the government of the day ran a scheme to part-fund schools buying computers for the classroom, as long as they were British, which led to roughly 75% of schools buying BBC Model Bs (mainly), the rest tending to go for Sinclair Spectrums. Even taking that into account, the Beeb was never the biggest seller for kids outside schools, not even in its Electron form.
I didn't get one until I was well into my A level days. I still have it, too! ;)
The other systems on the school list were the Research Machines 380Z/480Z, which were, IMHO, less useful in the classroom than the Beebs, although one could argue that they had more potential for business-type computing as they could run variants of CP/M and associated software, which was the microcomputer OS of choice for business prior to the IBM PC.
They were also much more expensive!
I think that the Newbury Newbrain was also on the list, but nobody bought them!
I remember my school had the RM 480Zs, although I had left by then, but my younger brother got to play with them.
The school my Dad taught at, in Sheffield, got Sharp MZ80As though. I remember him bringing one home to "test"... ahem... as he was the member of staff responsible for AV resources, which included computers in those days.
The local technical college had an RM-380Z in, erm, 1979. It was where us sixth formers from the local grammar school who wanted to do O-Level computer science were sent because they actually had access to a computer.
At the start of the new course, we were using a leased line to an Open University computer. Once the first bills for that came in, various faces went white and, although the 380Z was appallingly expensive, it was seen as a bargain in comparison.
I broke it for about a week by inserting an 8" floppy the wrong way around...
Technically it is the same amount of RAM, but on the BBC Micro that RAM had to hold application code, the screen buffer (up to 10K on the Model A) and working variables, stack, etc. - depending on graphics requirements, this could leave only about 5K for code. On the Bit, I assume application code will go into flash, and there is no screen buffer to speak of, so the 16K is only needed for working variables, stack, etc., and code can be much larger.
BTW - I think you might have meant 256KB and 16KB rather than 256Kb and 16Kb.
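The Model A budget above is easy to check with back-of-envelope arithmetic. A sketch, with assumptions stated: figures assume the cassette filing system, where user RAM started at &E00 (about 3.5K of zero page, stack and OS workspace below it); filing system upgrades pushed that boundary higher.

```python
# Rough Model A memory budget (assumes cassette filing system: PAGE = &E00).
total     = 16 * 1024   # Model A: 16KB of RAM in total
workspace = 0xE00       # &0000-&0DFF: zero page, stack, OS/cassette workspace

# Screen buffer sizes for two of the modes the Model A could run:
for mode, screen in ((4, 10 * 1024), (6, 8 * 1024)):
    free = total - workspace - screen
    print(f"Mode {mode}: {free} bytes free for code")
# Mode 6 leaves ~4.5K (the "about 5K" above); Mode 4 only ~2.5K.
```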
16KB OS, 16KB Basic, with other ROM based language or OS extensions paged into the same address space as Basic.
The ability to switch into and out of a paged ROM to handle OS extension calls without disrupting the running program was an extremely clever piece of design that overcame what should have been a serious limitation of the Beeb. Put some RAM in the same address space, and you could do some really clever things.
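That paged-ROM ("sideways ROM") scheme is simple enough to model in a few lines. This is a toy sketch, not an emulator: up to 16 images share the 16KB window at &8000-&BFFF, and writing a bank number to the ROMSEL latch (&FE30) picks which one the CPU sees there.

```python
# Toy model of the Beeb's sideways ROM paging (a sketch, not an emulator).
class SidewaysRoms:
    WINDOW_BASE = 0x8000
    WINDOW_SIZE = 0x4000                    # 16KB shared window

    def __init__(self):
        self.banks = [bytes(self.WINDOW_SIZE) for _ in range(16)]
        self.romsel = 0

    def insert(self, slot, image):          # fit a ROM image in a slot
        assert len(image) <= self.WINDOW_SIZE
        self.banks[slot] = image.ljust(self.WINDOW_SIZE, b'\x00')

    def select(self, slot):                 # models a write to &FE30
        self.romsel = slot & 0x0F

    def read(self, address):                # CPU read inside the window
        return self.banks[self.romsel][address - self.WINDOW_BASE]

roms = SidewaysRoms()
roms.insert(0, b'BASIC')
roms.insert(1, b'DFS')
roms.select(1)
assert roms.read(0x8000) == ord('D')        # DFS now visible at &8000
roms.select(0)
assert roms.read(0x8000) == ord('B')        # BASIC paged back in
```

The real OS did the equivalent on every service call: save ROMSEL, page each ROM in turn into the window to offer it the call, then restore the original bank, all invisibly to the running program.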
Beeb had the OS and BASIC in a masked ROM, and add ons were either masked ROM or EPROM. Useful, as I recently changed my B from DFS to DNFS just by UV wiping the EPROM and blowing the replacement ROM image into it.
Downloaded from the Internet, burned into EPROM, installed. I guess this is piracy, eighties style. That and dub tapes of software that throws up never ending load fails. ;)
It should be able to run Elite (assuming Elite only needed 16KB of variable data) - the CPU is a lot more powerful than a 2MHz 6502 - but there's no video output (I'm assuming that audio can be sent over Bluetooth at least), so someone will need to hack up a software controlled video output (not easy, but has been done for PICs, etc, in the past so it should be possible) first.
Ah, one possible solution is to use an SPI graphics chip, as the micro:bit features SPI on its edge connector (and I2C).
That will have its own memory as well, negating the video memory issue (16KB isn't a lot, a B&W 320x200 display needs 8KB)...
And I think a lot of hackers will be interested now they see it has SPI.
I, personally, would have preferred the microbit to feature a character display (e.g., 40x2) rather than a LED matrix, but maybe the LED matrix is cheaper.
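The framebuffer arithmetic behind that memory worry is straightforward. A sketch (exact figures depend on the display controller chosen, but the formula holds for any packed-pixel framebuffer):

```python
# Framebuffer size for a packed-pixel display: width x height x depth / 8.
def framebuffer_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

assert framebuffer_bytes(320, 200, 1) == 8000    # ~8KB for B&W 320x200
# Anything with real colour depth blows the micro:bit's 16KB RAM instantly,
# which is why a controller with its own memory is attractive:
assert framebuffer_bytes(320, 200, 8) == 64000
```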
>Ah, one possible solution is to use an SPI graphics chip, as the
>micro:bit features SPI on its edge connector (and I2C).
>That will have its own memory as well, negating the video
>memory issue (16KB isn't a lot, a B&W 320x200 display needs 8KB)...
The second statement is false; you don't need video memory, you just need the processing power (which is there) to generate the frame each time round the refresh cycle.
IIRC Elite was vector graphics (greybeard to real-world translation: mostly black and the rest straight lines). It's a trivial problem to do the graphics for Elite given SPI, it's just more code than anyone cares to write.
John Bowler
Elite needed a Model B: 32K RAM plus the ability to do higher-res graphics. Except the hi-res mode used 20K of the 32K, leaving not enough space for the program, so Elite had a trick of programming up the on-board timer to detect when the screen refresh was half complete; the program waited for this and then reprogrammed the video controls mid-frame to display the bottom half of the screen in a lower resolution to save RAM!

N.b. video RAM was a single buffer that the program wrote into to set up the image and the video controller read to stream out to the TV. Because you got "snow" on the screen if you tried to write to the buffer while the image was being displayed, games tended to spend most of their time waiting for the video flyback signal to be asserted, then do all the work for the next frame (update the image, etc.) before the TV had got back to the first scan line. So most of the time games were actually just waiting for the flyback signal and not doing anything!
The Model B also had the 6522 VIA, which was needed for timings, such as Elite's mode trick.
The Model A had a cassette interface, BNC composite video out, and TV out - that was it. A base Model B is identical to a Model A except for the extra 16K of memory and the VIA chip. Everything else (Econet, Disc, Tube, et al) were optional extras that required soldering.
Everything else (Econet, Disc, Tube, et al) were optional extras that required soldering.
To an extent. Certainly true if you were upgrading from A to B, though if you got a model B from new it would often have most of the soldering already done with just the various sockets needing population. ISTR that the speech synthesis bit was usually left out though.
"because you got "snow" on the screen if you tried to write to the buffer while the image was being displayed"
I remember that being a significant Atom problem, but I don't remember having snow problems updating the screen on the BBC.
Must be getting snow in my memory ... now my mind is jogged, the absence of snow was one of the improvements over the Atom ... I must have been remembering writing for the Atom, where you *had* to update the screen during flyback. That said, it was still direct-mapped memory, so you really didn't want to be updating too much of the screen while it was being scanned, to keep things "smooth".
Snow was a problem on the original CGA cards in IBM PCs (but not the monochrome adapters, IIRC). Likewise, I don't recollect it being a problem on my Model B! The mode switch on Elite also enabled the scanner view on the lower part of the screen to display in colour so you could see the goodies and baddies.
Ah, t'good old days... writing up assignments in Wordwise, printing them out on an old Juki NLQ dot matrix printer, saving to single density 40 track 5.25" discs... The youth of today don't know they're born...
Not quite: Elite actually used the monochrome mode 4 (medium resolution, 320x256 I think) for the top half and the low resolution four-colour mode 6 (160x256 with fat pixels) for the bottom section where you needed the extra colours: they were only 10KB graphics modes.
The disk version used overlays loaded dynamically when swapping from flying play to trading screens: the cassette version was much more limited in scope but loaded in one go.
We just don't do that sort of hackery in modern computers...
Peter Ford: the lower half is Mode 5. Mode 6 is Mode 4 with the two blank lines between every character column; Mode 5 was 2bpp, 160 pixels across. Mode 6 is, like 4, 1bpp and 320 pixels across but only 25 8-line character rows high, spread over 250 display lines.
With no VIA and the mid-frame real-time clock interrupt not being conveniently placed, the Electron version of Elite is just Mode 4 for the entire display.
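For anyone curious, the memory arithmetic behind that mid-frame split works out as follows (a quick sketch; mode dimensions per the thread above):

```python
# Why Elite's mid-frame mode switch saved memory: this is just the
# arithmetic of the BBC's direct-mapped modes, not the 6522 timer trick.
def mode_bytes(width, height, bpp):
    return width * height * bpp // 8

mode_1 = mode_bytes(320, 256, 2)   # colour at full res: 20KB - too big
mode_4 = mode_bytes(320, 256, 1)   # mono hi-res: 10KB
mode_5 = mode_bytes(160, 256, 2)   # fat-pixel four-colour: also 10KB

assert mode_1 == 20480
assert mode_4 == mode_5 == 10240   # so the Mode 4/5 split costs only 10KB
```

Switching modes halfway down the frame thus gave a hi-res mono top and a colour bottom in half the memory that full-screen colour would have needed.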
After Elite, I'll have Starship Command, please.
Coding on bare metal? Sorry, I must have missed the bit about the macro assembler.
Given the article shows things like boxes with "microbit.bearing;" being dragged and dropped to put together a so-called program, it looks like stitching together high-level libraries is exactly what is going on here. I just can't see this leading to the next generation of CAD packages, office applications or even anything remotely useful.
Really? It's quite simple to imagine. A child that isn't interested in computers sees that it's easy to construct something using drag'n'drop and realises it can be fun. After working with that for a bit they then want to go further and branch out into more advanced languages and development environments, eventually establishing a career in IT and creating something useful.
You need to work on your imagining, that was easy!!
Given the article shows things like boxes with "microbit.bearing;" being dragged and dropped to put together a so-called program
From the article:
"... the Micro:Bit ships ready for four development systems."
"Entry level is Microsoft Blocks, which if you haven’t seen Scratch, is drag’n’drop flowcharts where kids simply assemble programs and fill in blanks."
"Microsoft TouchDevelop has been in schools for a while as a simple way to get kids started in mobile app development for Android and WinPhone. The feedback from teachers has been pretty good and again it is helpful and simple."
"As it’s based on ARM’s mBed platform it can also do C++, so the more advanced kids or adults can do hard stuff like drivers for new hardware."
I know we have a problem with obesity in kids, but why do we have to give them reduced-sugar Raspberry Pis?
Eh? I thought the low fat RasPi was the Zero!
Which is an interesting thought. Had the Micro:Bit come out before the Zero, would the Zero be as popular as it is? Could the imposition of a certain Merkan organisation and its propaganda frighten folk off from adopting a Micro:Bit, especially if it means that you need a PC to run it?
I'd be more likely to refer to this as a low fat Arduino than any sort of RasPi...
Anyone remember this https://en.wikipedia.org/wiki/Versatile_Laboratory_Aid
We had one hooked up to a BBC micro in college to plot a waveform - so we had a DSO in schools in the late 80s. I only had an analog storage scope with a polaroid camera doing a PhD 10 years later !
Not really. If you really want kids to learn then the ones who are interested need to be given a lot of freedom to experiment, mess around and try things out. They can't do that in the structure and time limits of a lesson.
The old BBC Micro days taught kids very little other than some 'basic' programs but didn't require the logical thought and problem solving skills necessary to be a programmer - in fact as it was wheeled around on a trolley and none of the teachers had a clue what it was for you only saw it a few times a year and had a few minutes to use it. Their Spectrum at home, when not being used for Chuckie Egg, allowed kids who wanted to to write their own programs and experiment for themselves.
In fact compared to the ridiculous costs they pay for overpriced interactive whiteboards and support the cost for 30 new Micro:bits each year would be an insignificant part of their budget.
I'll add to that. In so far as initial experience of ZX81 owners was typing programs in from magazines, kids often had a chance to see what a particular line of code did when it was typed wrong.
And once they'd got over that, how to make the game easier by upping number of lives or decreasing the strength of the monsters in the maze. A form of learning by tinkering with the mechanism.
And that would be a perfectly valid argument if the thing in question were a computer with a display capable of showing a game like the Pi is. What kid, suitably inspired to tinker with coding skills, is going to treasure and experiment with a configurable blinky-light brooch for more than a week or two?
It seems to have been designed by committee which I suspect is one of the reasons it was delayed so long.
I do wish the project well but with the Pi Zero now available I think they missed the boat.
Also items that are given away are not valued and often discarded quickly. Thomas Paine: "What we obtain too cheap, we esteem too lightly:"
I think that they missed the boat because the Arduino 101 has appeared. It's roughly the same specifications, and, importantly, I can buy one (and not have to mug some 12 year old for it). The community around the arduino is also a *lot* bigger currently, though that may change in time.
Goes very well with the drag and drop school of IT management, where developers, regardless of skill are treated as fungible typing monkeys.
Today's musical interlude courtesy of the same person that provided the wrap-up for Portal, Jonathan Coulton. https://www.youtube.com/watch?v=v4Wy7gRGgeA
"Micro:Bit Live!" perhaps?
Welcome to an hour of prime time BBC1, plugging it in, uploading code then ... pause ... will it work? ... pause to annoy the audience with naff drum rolls ... then 'oh look "Hello World" scrolling on some LEDs'.
'In the next programme we'll demonstrate how to wrap a parcel and set up an account on eBay.'
This programme will also be repeated on BBC1 then available on BBC1HD, BBC2, BBC2HD, BBC News24, iPlayer, UKTV Gold, Dave, DaveJaVu ...
Oh, sorry, my mistake - let me clarify: If you design a device that can be hard bricked, you're a f######g idiot. In my defence though it was an easy to make mistake - people on El Reg level usually know better than using the word "bricked" for what is a plain "soft bricked" device (which by the way all "examples" above are - or were those supposed to mean "Apple is doing it too therefore it's perfectly ok"...?).
How long until we get the BBC take on a "lets play" with Crystal Rainforest?
Also, IT lessons in school haven't been the same since token ring networks.
Hands up if you were the bastard that used to unplug your T-connector, because your IT teacher didn't know how the LAN worked, to get an hour of "choose a game from the box" time while they called out tech support.
Is anyone else a bit miffed at this shiny new replacement?
I liked "programmer" because it reminded me of a vast array of lights and spinning tapes which needed to be told how to spin and flash.
I also like "software engineer" as it reminds me that there's a lot more to "writing code" than understanding how to do loops, and some of the worst examples of "coding" suffer from general cluelessness about how to build systems.
I'm also a bit dubious about the worth of protecting the childerrrun from the command line. It seems to me a bit like not wanting to put children off an interest in biology or medicine by giving them fluffy pretend rats to 'dissect' in an "Operation by MB Games" sort of way. It's the details below the shiny surface that are what is interesting about computers... that's the bit you need to get excited by if it's a career for you.
In short, children should be taught how to write rude scrolling text in WHSmith. I think that's what I'm getting at.
The world has moved on. When we were kids, if you were introduced to a computer, you'd probably have had no choice but to use the command line. These days, a child's first experience of computing is probably a parent's iPhone - going from that to a blank screen with a flashing cursor that won't do sh!t until you learn some arcane language, probably isn't going to seem particularly encouraging.
"You might not be scared of cryptic syntax errors, but many 12-year-olds are. "
and they will continue to be until they're exposed to it! If they're not exposed to it at an early age, many will remain scared of it, opting for the instant gratification instead, because it's easier and mostly done for you
>"You might not be scared of cryptic syntax errors, but many 12-year-olds are. "
>and they will continue to be until they're exposed to it! if they're not exposed to it at an early age,
Exposure to stupidity surely just ensures that the intelligent child assumes all of us are stupid?
I have been doing this for 37 years and I know that all of the syntax errors are there because the person who wrote them could not be bothered to explain. I understand almost all of them, and those I do not understand I can work out with little effort. I would never wish this pointless, idiotic capability on any other human being.
And, as for the El Reg author:
> instant gratification
And you and I did not want the same? I remember those pictures of an older woman on the cover of ETI.
I had a play with one at an IET event. I was quite impressed, and the kids we were helping got the hang of it really quickly.
I think you have a choice of languages, including Python, so you can grow with it. I wasn't so impressed with the Microsoft block language. The user interface made it difficult to edit. One peculiarity was the events: we had assumed that they operated like a blocking call, like a keyboard input, but in fact they are more like interrupts. It took three of us to work out why some code was not working because of that.
The other issue is the I/O points. It is far too easy to short-circuit them, which can cause it to reboot.
However, generally they are pretty neat, fun modules. Wish I'd had one when I was 12.
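The event-versus-blocking gotcha mentioned above is easy to reproduce in plain Python. This is an illustrative sketch, not the real micro:bit or Blocks API; the names are invented for the example. Registering a handler returns immediately; the code only runs later, when the runtime fires the event, much like an interrupt.

```python
# Sketch of event-style (interrupt-like) handling vs a blocking read.
# All names here are illustrative, not the micro:bit API.
handlers = {}

def on_button(name, fn):
    """Register a callback. Unlike a blocking read, this returns at once."""
    handlers[name] = fn

def fire(name):
    """Stands in for the runtime delivering the event, like an IRQ."""
    handlers[name]()

presses = []
on_button('A', lambda: presses.append('A'))
assert presses == []      # nothing has run yet: registration didn't block
fire('A')                 # the event arrives asynchronously, later
assert presses == ['A']
```

Code written assuming the handler blocks until a press arrives will carry on past the registration with nothing having happened, which is exactly the confusion described above.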
"The other issue is the I/O points. It is far too easy to short-circuit them, which can cause it to reboot."
Oh dear ... anyone care to enter a sweepstake on how long it will take one kid in a class of thirty to work out how to draw pencil lines between the I/O terminals ...?
All I ever wanted at 12 was this: http://content.time.com/time/specials/packages/article/0,28804,2023689_2023681_2023598,00.html
A Mattel Electronics Football game... but then I discovered the radio in 1977 and built them instead.
Funny, I still want the Football game...
This bit device has some interesting functionality like compass, accelerator, buttons, bluetooth and a little display to make some neat projects.
It would certainly be a LOT less hassle to get a class set up with them than a Raspberry Pi, where everyone needs SD cards, USB keyboards/mice, HDMI outputs, network, etc. That would be a nightmare for the teacher by comparison.
Personally I don't see this as being a loss for the Pi since kids who learn the fundamentals on the bit can easily progress to the Pi.
Are there any schools without some sort of IT hardware available for the kids to use? Mine are only in years 3 and 4, and they do Scratch programming on a semi-regular basis (using Windows PCs that they acquired on a highly unfavourable IT contract that damn near bankrupted the school).
I love the comments from the people who lived once in a dramatically changing world, but want the world not to have changed any more so that their (1980s-era) experience and methods are relevant.
My take? The computer revolution of the 80s gathered up all those kids who would have otherwise been soldering components to make drum machines or guitar amps. Doubt me? Ask anyone from the hobby electronics publishing industry. That market died almost overnight as PCs became a reality.
But today? The "maker community" is as much about mashing up other people's work as originating and understanding. To grab that potential you have to modify your approach.
Or, to put it another way: These days you are trying to attract the attention of a generation born to the passive computer experience: if you want it, you buy or stealz it. Getting this mental set into the mode of "let's write a program from scratch" is a non-starter. You have to show them why they would want to do that first.
You know; like you do with everything else when your audience is children.
Design your own amplifier did you? Or just slavishly copy the layout in the magazines.
Mashing up other people's work to create something different, and possibly better, is what we call progress.
The nice thing about this otherwise fairly useless gadget is that you DO write a program from scratch. Yes, you drag and drop rather than typing linker commands in the console, but you understand algorithms, functions, things happening as a response to other things. Much better than being sat down in front of an IDE and told to copy 100 lines of initialization boilerplate without understanding it, and without getting a single character wrong, just so you can get a blank window to appear.
Design your own amplifier did you? Or just slavishly copy the layout in the magazines.
Design your own computer did you, or just slavishly get your dad to buy you one?
And yes. In my Physics "A" level exam. Wouldn't have worked, but I realized that ten seconds after "pencils down" and could have fixed it there and then if not for time crunch and the invigilator's beady eyes.
"Mashing up other people's work to create something different, and possibly better, is what we call progress."
No, it's what we call plagiarism. Progress comes from pulling apart other people's work and *understanding* how it does what it does, then using that knowledge to do more. In a world where electronic maker forums are populated with people who do not know the difference between an LED and an incandescent light bulb we are not seeing progress. We are seeing "Monkey See, Monkey Fry Another LED".
If you had been less eager to snark you might have read what I actually wrote instead of what you wanted me to have written. My problem is not with the device or the way it is to be used, it is with those saying "Well that won't work because they aren't learning to code like I did on my BBC model B".
But judging by your response all you saw was the frightening news that your own experience was becoming obsolete as a learning model. Welcome to the real world. You helped usher it in with your Model B/Spectrum/Whatever it was back in 1982.
That I should have been brought so low. Downvoted by SINKs and DINKs on El Reg. *sob*
Microsoft blocks be damned! That's the typical whale song and bollocks designed to stop anything actually happening.
Works great with MicroPython as we had the privilege to see at our local Python user group meeting in January (in German). Though the restriction to 16 kB does severely limit what you can do with it as you can't really run a program and use the Bluetooth stack at the same time.
The benefit will be the 1 million units should, like the RPi, provide a large enough market and could help standardise IoT components.
It's not a physical lock-in but "do everything on the MICROSOFT cloud with MICROSOFT tools" lock-in.
This is hardly going to encourage the tinkering for which the Microbits are ideal. Scratch for the RPi already has the visual introduction to programming angle covered.
I don't see why the BBC didn't get behind the existing Raspberry Pi Foundation. Instead they use the licence fee to undercut an existing charity - I wonder why? The cynic in me suspects a manager is looking at this as his pension-pot spin-off, much like many of the other publicly funded parts of the Beeb now run by ex-BBC managers.
"It's mainly to do with the BBC charter prohibiting it IIRC. The RPF were in preliminary talks with the BBC before the Pi even launched."
That's the excuse the Beeb are using, but that seems like bullshit when you look at the "partners" involved in this. I bet Microsoft in particular would pay big money to ensure this kind of thing stays away from Linux and Scratch (yes, even with their new "Linux friendly" attitude of late). A million kids knowing even a modicum of Linux and being familiar with open-source programs in an X Window environment would be the stuff of nightmares for MS if tomorrow's IT professionals grew up with Linux, and the Arduino-style "upload your code from a Windows PC" way of doing things is precisely how Windows IoT works (even with the Pi).
"It's mainly to do with the BBC charter prohibiting it IIRC. The RPF were in preliminary talks with the BBC before the Pi even launched."
"preliminary talks...." ....Lordy. the original PI launched over 3 years ago....Roger Wilson, Chris Curry, Herman Hauser, Steve Furber will be laughing their arses off, given the lead time they had to deliver the BBC Micro.
Charities... how does this compare to the BBC's charter/relationship with, say, Sport Relief, Comic Relief, Children in Need, or the BBC's music foundation, or the other charities that get promotion on TV and radio etc.?
I think the bigger issue is probably: should the BBC be involved in this at all? I think it's a laudable scheme and I'm a big fan of the BBC principle, but I hope that some kind of non-quango will take over the running soon. No need to give the anti-BBC ammo in charter renewal year.
They are involved precisely for reasons of charter renewal. It ticks a 'public outreach' box that the BBC use to justify their existence. The previous pre-renewal scheme a decade ago gave us the BBC Open Centres - remember them?
It's a pantomime of the BBC reapplying for something they know they'll get. Running late, the micro:bit would have been best deferred to launch properly at the start of an academic year, except that would leave only three months of impact before charter renewal. Expect all BBC support for it to decline suddenly after December.
Now decide you want to take your Pi out of the house and stick it on the handlebars of your bicycle or scooter to do whatever it is you programmed it to do. How long does it run on two AAAs?
What happens if you leave your bike round the back of your friend's house and their dad accidentally waters it along with the lawn?
It's small, very portable and more replaceable than a Pi.
Genuine point - I'm really unsure why we're coming out with all these low-powered devices in some kind of hope that they will teach kids the fundamentals of computing or programming. I'm including the Pi here, but I know this will instantly get downvoted because of people's affection for it.
The affordability argument is out of the window for the majority of people.
And as for what they will use when (or if) they go into a related field of work? Well it's definitely not one of these devices, is it?!
Maybe in about 10 years' time, or longer, you'll find out. It might be that these turn out to be a barely remembered oddity, or alternatively they might make the next generation of IT types go all misty-eyed in the same way many here do about ZX81s, Speccys or Beebs.
Seems to me it's worth a shot at giving kids an easier route to poking about under the shiny graphical eye candy...
>I'm really unsure why we're coming out with all these low powered devices
>in some kind of hope that it will teach kids fundamentals of computing or programming?
The devices exist already. They are widely used. Teaching the people who, with any luck, will be wiping our butts for us in a few years time to use them is surely a good idea?
It's nothing to do with "computing or programming", whatever that means. There has been a gradual shift since the end of the 1960's from hard-wired electronic control to software electronic control of devices we use. Along with this has come an incredible increase in the number of devices that use electronic control. I used to use a stick with petrochemical bristles stuck to it to clean my teeth, today I use a microprocessor that is significantly more powerful than those I programmed in the 1980's.
There was a moment around 1981 or 1982 that still sticks in my mind. A coworker, who I shall call Hugo Tyson for the purposes of this posting, came into my office where I was doing whatever we did in the '80s, probably looking at low-res bitmapped images, and observed that the future of the company we worked for was "one in every washing machine".
I've spent the rest of my life looking at bitmapped images and, yes, there is one in every washing machine. Indeed, far beyond Hugo's expectations (maybe) there are actually more than 10 on every desktop; remember there is one in every micro-SD card and there are countless numbers in every desktop PC (not that anyone would be seen dead with one of those these days.)
So it's like the Physics A level (before they started trying to teach people to remember class A or B amplifier design); it's utterly pointless, yet, if you understand it you might understand (global warming/why your toothbrush stopped working) and that might help.
John Bowler
The One Show this evening trumpeting the launch of the Micro Bit was a hoot and a half.
Apart from all the "Isn't this wonderful, I don't understand any of this stuff so let's hear from a random 15-year-old 'expert'" bit, I loved how they got a bunch of schoolkids to turn the Jodrell Bank radio telescope to point at its target. Well done! Except for the shot of the intermediate Raspberry Pi's screen, with Raspbian controlling the whole thing and presumably talking to Jodrell Bank's "big" computers or control systems. So basically the Micro Bit provided the buttons? Well done, Beeb, well done! LMAO
...and the rest! Back in 1982/3 I was the proud owner of one of the first Model Bs smuggled into Saudi Arabia; in those days the 'Old Airport' sniffer dogs were only trained to find whiskey, dope and blow-up dolls. I spent *weeks* creating and typing in an automated darts scoring program, complete with speech synthesis, that was used by our team in the Riyadh Winter Darts League. It spoke the scores, automatically calculated the shot-outs and even printed out score sheets on my FX-80. Let's see how creative today's kidz are when it comes to the usefulness of their apps.
Although a wholesale resumption of a BBC Computer Literacy Programme is a no-brainer of an idea... from an old-hand BBC Model B, Master and A3000 fuddy-duddy.
However, not basing this on the Raspberry Pi Foundation's Pi Zero is an insane decision. There is no real reason why the smartphone linking and device abstraction work could not have been done on that hardware. The micro:bit is hamstrung by its limitations, and the Pi Zero is only £4, so money is hardly an issue.
A stupid, stupid decision.