Please no
Not media centre, think of the children
Microsoft delighted retro fans by closing its Build conference with an open-sourcing of 1983's GW-BASIC. Warning that he would not be accepting pull requests from fans seeking to scratch a 37-year-old itch, senior program manager Rich Turner made the treble by adding the open-sourcing to the already-scored goals of Windows …
WHILE/ENDWHILE only appeared in BBC BASIC V for the Archimedes and later.
BBC BASIC is still being developed in three branches: the RISC OS version continues to be developed and is now open source; BBC BASIC for Windows has branched into BBC BASIC for SDL2 and is now cross-platform; and the open-source Brandy BASIC still sees development in the Matrix Brandy fork.
Rich's blog post about this (referenced in the article) does show up a gap in our collective memory about computers, which I figure was due to the arrival of cheap terminal-type computers such as the BBC Micro. There were usable tools around for microprocessor development since the late 70s, but they were insanely expensive -- an Intel MDS, for example, cost about 50,000 pounds at a time when a decent house could be had for 10K. These systems did come with compilers -- typically PL/M -- and a really versatile linker/locator. Gradually other systems appeared -- the Zilog development system had a seriously useful language, PL/Z, which could be used to build complex, overlaid programs. My first pre-PC system was an Osborne 1; it was CP/M based, and by the early 80s you could get all sorts of languages for it (I had Prolog, Lisp, PL/1 and Forth, for example).
My first contact with the IBM PC world and Microsoft came in mid '84, and while GW-BASIC was pretty good as BASICs went, their software ecosystem had numerous deficiencies. One was that their version of the Intel linker/locator was seriously deficient; it was like a cut-down version narrowly targeted at building simple PC programs. The other was a lack of language support, which prompted inexperienced programmers to produce some fearsome piles of crap. It's actually not difficult to program in assembler provided the assembler supports -- and you know how to use -- macros. This is probably the key to Microsoft's portability; macros allow you to structure assembler into something that starts to resemble a high-level language**. This, combined with a proper linker -- one that allows multi-module support and control of the symbols exported from linked modules -- makes programming easy. Unfortunately Microsoft seemed to keep the good stuff for themselves, leading to generations of programmers struggling with poor tools and tortuous code.
I'd guess that if GW-BASIC was available for several processors then the real source is in macros.
(**It's possible to write an assembler in macros -- the code doesn't have to emit machine code for the processor it was ostensibly written for.)
"There were usable tools around since the late 70s for microprocessor development but they were insanely expensive -- an Intel MDS, for example, cost about 50,000 pounds"
You were shopping in the wrong place. The S-100 systems were around since the mid '70s and much cheaper than that.
An Intel MDS typically had an in-circuit emulator -- a F'in Big Plug on the end of a cable that fitted into the target hardware where an 8080A, 8085, 8086 or 8088 would normally go. That allowed a programmer/developer like me to write Assembler and later C code for a target system and then run it natively with hardware trace, triggers and a whole load more without adding piles of memory-hogging debug statements and the like.
So, yes, an Intel MDS could cost £50,000 although that was an all-options pricetag, a bit like a fat Mac Pro workstation. The Intel MDS we had only had an ICE for the 8080A and floppy disks, no HDD but that still cost the thick end of £15,000 back then (mid 70s). We did get a swingeing academic discount on that sticker price though, along with restrictions on what we could do with it like no commercial development work.
Anyone else have one of the Blue Meanies, BTW? Ours had most or all of the chips inside in sockets on vertically-mounted boards and inadequate cooling resulting in the chips walking out of their sockets over time due to thermal cycling. It was a regular thing for us to take the cover off and push them back in place before starting the system up in the morning.
My first job in '86 as a new graduate was programming in PL/M on a huge, heavy blue MDS - double 8-inch floppy drives, with one disk for the system & the other for the code. Loved the clunk...clunk noises as the heads banged about during compilation. IIRC, we had an extension card so we could compile for the 8086-family in addition to the native 8085. They were on the way out when I joined, but still required for those working on System-X networking kit, which was being rolled out.
The MDS was networked to (I think) an NRM - a white really expensive server, about the size of a chest freezer, which also allowed serial-port-networked MS-DOS PCs to access the file system. It came with a multiplayer maze/shooter game called Snipes which ran on the client MS-DOS PCs, & the office used to ring with the yells of the slaughtered.
We subsequently bought two Intel ICEs to do embedded 80188 development - these were the beasts with the great spider cables, which as the OP states, were plugged into the target system, having removed the target's actual CPU. I was working down in the bowels of our team-written embedded OS, writing C in MS C 5.0 (which had a preprocessor bug that couldn't correctly expand our huge macros, themselves auto-generated via Intel's AEDIT scriptable editor, run as part of the build process). Code had to be compiled, cross-linked, burnt to EEPROM, walked down the corridor, plugged in, powered up etc. in an endless cycle of fun, with much staring at oscilloscopes and HDLC protocol analysers. Strange what you remember - happy days.
I do wonder sometimes why they feel the need to sit on code that is so old anyway. Much of it is dead, rotted, and probably no longer holds any "trade secrets" that aren't already known in other contexts nowadays (And yes, I DO know there is still code running out there that is even older and still proprietary)
Code *that* old pre-dates source control, at least on the platform in question. So no, it certainly isn't a case of flicking a switch on a repo; it might be a question of actually finding the code at all. Possibly even typing it in from a faded and barely readable (and certainly not OCR-able) printout.
I still remember when I realised I could do FOWIA (find out where I am) in Z80
POP HL
PUSH HL
RET
with one fewer byte:
POP HL
JP (HL)
meaning my subroutine was just E1 E9 instead of E1 E5 C9. I have some vague memory that E9 was originally "undocumented" - does that ring a bell with anyone?
*Yes I had to look up the hex, it's 4 decades since I could program by POKE addr,byte
(at 14 it was easier to memorise the instruction set than find the money for an assembler).
Ah Z80 assembler. I remember doing that on Spectrums & Amstrad CPCs. Once I went to Uni and discovered Intel CPUs I was shocked by the x86 memory segmentation and wondered how anyone could work with such an environment.
I have a dream of getting back into assembly programming - but with a CPU with a sane instruction set.
Indeed, the 8086/8088 addressing and instruction set was and is unremittingly ugly. It has only gotten worse over time. Somewhat reminiscent of a third world bus put together from a number of 1950s automobiles by a self taught mechanic whose only tool is an oxyacetylene torch.
But my impression is that while the 68000 had a saner instruction set, it faulted if one attempted to fetch a 16-bit operand from an odd address. x86 only charged a small execution-time penalty (one clock tick?) for fetching 16 bits from an odd address. That was a meaningful difference in storage needs back in the days of $100+ for a few 100K of memory.
Could have that all wrong. Never had an MC68000 computer to play with. I did play around with Motorola's 8 bit MC6809 which was a joy to program.
I don't recall any problems with 16-bit operands in odd places -- and the assembler should sort that as an option anyway. I used a CP/M-68K machine for some programming learning and it whupped Intel for ease of coding. I would put money on the wrappers and coding checks you had to put in to ensure you didn't wander over segment boundaries wasting far more storage; and even a week's coding on a PC would turn up so many peculiar happenings -- code you'd thought solid for months would fall over, so you'd trace your new code for hours -- that a lot more RAM would have been cheaper.
I learned a huge amount of defensive programming on the PC which has helped me in many ways but the joy of Linux and even the pre-release version of NT in the early 90s were game changers. I'd used unixes on 68000s for chip design work and they had been very stable but I'd not coded on them - largely because I was afraid of crashing them and they did disk checks on reboot! It turns out I probably could have and not worried!
"I'd used unixes on 68000s for chip design work and they had been very stable"
I had a short gig developing some reporting stuff for a factory installation on a Motorola box. At the end I had to go to Italy to install it on site. The machine kept crashing & I'd find odd files in lost+found which were clearly dumps of bits of memory. There were odd murmurs from my client's client about not letting me go until it ran. Fortunately it eventually managed a clean run and I escaped. I later heard it was traced to faulty memory.
The 8088 had the merit of existing when IBM needed manufacturing volumes of whatever they chose, so I don't think we should be too harsh on them for that decision.
As for segmentation, the 386 could have run a perfectly usable 32-bit platform with virtual DOS boxes for old software, in 1985. That's much less than 10 years after IBM's decision. If it actually took much longer before the world was free of 16-bit segments, it is because of much later decisions.
I did very little assembly programming; in fact just some M68K on some programmable controller box thingy at college and that was it. Prior to that I was jealous of BBC Basic and its fancy-pants inline assembly and after that I was vaguely curious about Vax Macro and the System/3x0 assembler that the old beards at work used, but that's about it.
I only discovered the x86's weirdness (nomenclature because it was the 80286 by this time) as I was press-ganged into writing an email retrieval client for our salesentities' luggables. It was awful. I mean my code was awful: I've made many indiscretions in my time but I'd never programmed a PC before and the remit was vague, so I should've refused and said I'd rather eat my own shoes than get involved with it, but I was young and not sufficiently well-versed in being stroppy when it mattered. So yeah, someone else's remit was "we've been told we need to do this, all we need to do is find someone naïve or daft enough to blame. Preferably both." And there I was. In hindsight, though there's not a shortage of challengers, I'm fairly confident it's the worst code I've ever written.
Obviously going back to having only 64K to play with was a bit of a culture shock: I mean I grew up in the 8-bit world but that was then and I never had to do anything especially challenging with it. I know I could've designed it to use some sort of HDD-based indexing malarky but when you have a "by the end of the week, how hard can it be?" requirement etc.
So I looked into how to work around it. I was also very interested in operating systems design at the time and absorbed all sorts of stuff about virtual memory systems and so on, inasmuch as a laysloth's knowledge can suddenly be expanded, anyway.
With that as my context, I read about the 80286's memory addressing.
Argh.
"meaning my subroutine was just E1 E9 instead of E1 E5 C9. "
And working out clock cycles for code, then agonising over whether to spend a few extra bytes on low-cycle-count instructions for speed, or use some of the cleverer opcodes that saved RAM but took many more clock cycles to execute.
if it's on github, can we fork it with patches for Linux? keep it open source with all of the copyrights intact, but let us do GW BASIC on Linux, too?
I've always thought BASIC was a good learners programming lingo. We can keep it alive this way.
/me looks into this - 146 forks already!
> I've always thought BASIC was a good learners programming lingo.
You may want to have a look at yabasic, which runs natively on linux, and I think there's a windows port too.
There was also another BASIC, p-basic, in the 80s, that was often included on disks of "public domain" and "shareware" programs written in BASIC.
"You may want to have a look at yabasic"
found it in 'ports' for FreeBSD. Thanks. [it's always good to check stuff like this out]
But I still think it'd be fun to port GWBASIC on my own. It's all 16-bit 8088 ASM files, most likely using direct MS-DOS calls, so making it work for 32-bit or 64-bit and more 'generic' I/O would take some time, but it should be possible (maybe fun, who knows!). Maybe if I wrap a 'main()' around it... and use portable C language I/O like 'read()' and 'write()'...
The original MS BASIC for CP/M etc. was, um, "copied" from Dartmouth BASIC, a cut-down version of Fortran. It basically founded their company.
I skipped MS BASIC till VB5. I used QUBAL, Fortran, Pascal, Forth, Modula-2, Prolog, Occam, Coral-66, C++ and C, all before 1988 and obviously before VB5 (then, after a short period, VB6). It was the Forms and ODBC with data controls that made it handy for in-house stuff.
GW-BASIC was a seriously obsolete idea in the 1980s.
>I skipped on MS Basic till VB5.
But did you really? If you used BASIC on a Commodore (PET all the way to the 128), an Apple (II+ forwards), a TRS-80, any of the MSX machines, a Thomson or Olivetti, or even an Altair 8800, then you used a BASIC written by or licensed from Microsoft.
But I never did. I used Z80 Assembler, Pascal, Forth, Modula-2, 8051 assembler, NEC 78HC11 assembler. I learnt QUBAL at school and Fortran at college, then my wife, who studied from CAR Hoare, taught me how to program properly, even in Assembler. Macros and Forth like use of the stack let you use assembler better than Basic.
I wrote a static colour test card in Basic on the Spectrum and looked at the Basic on the Apple II and IBM PC, but didn't use them. My wife got me to put UCSD Pascal on the Apple II to learn programming. You needed Modula-2 or Turbo Pascal (more like Modula-2) on the PC for actual applications. I had Modula-2, Prolog and others on the PCW. I controlled test gear with a veroboard interface and Forth on a Jupiter Ace.
I'd be careful what got revealed if I was MS. Perhaps there are too many skeletons in their closet from code they 'borrowed' from others? Little of what they initially did was original.
I forget which of my techie history books it is in, but IIRC the first thing Gates and Allen did after deciding they wanted to write a version of BASIC was to go get the DEC BASIC manuals from the uni library and start copying them. NT also borrowed from DEC. MSDOS was just a rebranded QDOS which itself was an unauthorised rip off of CP/M-86 by Digital Research. The Windows GUI was borrowed from Apple after they themselves got it from Xerox PARC. Etc.
I'm sure there are many more examples of MS 'borrowing' whatever they thought they could sell.
I grew up with MS Basic, which made me a bit of a laughing stock amongst people with better computers: considering BBC Basic and Locomotive Basic were approximate contemporaries.
Then I went to college and saw DEC Basic: although it's not something I used (I was trying to learn C at the time, and as much as it's just what I use now, it was traumatic) it seemed instantly and oddly familiar, more than just being another Basic dialect. I have no idea if there was any connection, though I notice that the released source archive has TOPS-10 style 6.3 filename conventions.
Well spotted Vometia. There definitely is a connection between the original MS BASIC written for the Altair and DEC's BASIC. Gates and Allen have never tried to hide this connection AFAIK and have freely admitted it, though sadly I forget exactly where. MS BASIC was also written on a PDP, as was their 8080 emulator. In any case the link was definitely documented. I did find this:
"BASIC-PLUS is an extended dialect of the BASIC programming language that was developed by Digital Equipment Corporation (DEC) for use on its RSTS/E time-sharing operating system for the PDP-11 series of 16-bit minicomputers in the early 1970s through the 1980s.
BASIC-PLUS was based very closely on the original Dartmouth BASIC, although it added a number of new structures. It also included a number of features from JOSS concerning conditional statements and formatting. In turn, BASIC-PLUS was the version on which the original Microsoft BASIC was patterned.[1]
The language was later rewritten as a true compiler as BASIC-Plus-2, and was ported to the VAX-11 platform as that machine's native BASIC implementation. This version survived several platform changes, and is today known as HP BASIC for OpenVMS."
https://en.wikipedia.org/wiki/BASIC-PLUS
[1] Stephen Manes book "Gates: How Microsoft's Mogul Reinvented an Industry--and Made Himself the Richest Man in America" from 1993.
"though I notice that the released source archive has TOPS-10 style 6.3 filename conventions"
That would be because of the inheritance from CP/M -- both the MS BASIC for CP/M and the fact that it ran on MS-DOS, which borrowed, shall we say, from CP/M; and CP/M looked very much like DEC stuff.
Ah, I didn't know that about CP/M filenames: I've never used it so just sort of assumed it used the now-familiar 8.3 format. All I've seen of CP/M is that it looks like a sort of hybrid of CMS and Generic DEC OS, which seem reasonable enough choices as starting points!
Did my O Level computer studies project in GWBasic, a functional PCB layout program. I benefitted from the fact that a print screen to an Epson FX86 (or whatever Epson dot matrix printer it was that we had) resulted in an integer number of pixels on the screen corresponding exactly to 0.1” on the print out. Photocopy the printout to transparency, use that for the lithography, et voila! I recall that I had to build up a library of mirrored text characters for the bottom layer of the board, every time it ran, which was tiresome to do in GWBasic. Ah fond memories...
I got only a B for my efforts too, despite presenting a fully functioning program and example PCB populated. Bastards - it was definitely worth an A.
Some of the O-level marking was surprising. My English lit paper was returned ungraded, which I thought only ever happened if you spelt your own name wrong or didn't turn up or something. Don't get me wrong, I was crap at English lit, but I did (and still do) remember the stuff we had to study reasonably well.
What was perhaps more surprising was getting a grade B for English language. I certainly wouldn't have awarded myself that.
The only programming language I know (besides HTML). This is really cool.
Basic definitely was the Python of its era: easy to use and easy to learn. It could run from MS-DOS 2.0 on, so libraries and schools loved it.
I never used it much, but it was my first time learning programming, so I remember it fondly.
Unless the teacher was really good, people using Basic only learned Basic, not programming.
If you learn to program, it's a few days to pick up any random language, but of course MUCH longer to learn to use the libraries or, in the case of C, which libraries not to use.
The same can be said of many courses on <insert fashionable language>: you're learning a specific language and not much about programming. One university course used Modula-2 as if it were Pascal and taught nothing about co-routines, signals, mutexes, procedure variables, why it has stronger typing so that only typed arrays and not anonymous arrays can be assigned, Modules, using the public and private parts of modules to implement object-oriented programming, or doing device drivers using "magic" types. Nor generic procedures or functions -- e.g. a Quicksort that takes procedures as parameters so it can sort using a Compare passed in, with no knowledge needed of the types or of type conversion, or that even takes as parameters procedures to access a "table" or file of the data and to store the sort indexes. I discovered this interviewing a graduate who could produce lovely-looking code but had no clue how to program, despite the Honours.
"Unless the teacher was really good, people using Basic only learned Basic, not programming."
You can also learn Basic programming in a course on any language. It was decided that research staff and students in our dept. would do a one-week FORTRAN course. (Somehow SWMBO escaped that, although she fell into scope.) Some time later I had to help one of my colleagues sort out a program. All the variables had the format of one letter and one digit. The GOTOs leapt all around the place. It was a classic example of Basic spaghetti, but written in FORTRAN.
I picked up a fairly recent, somewhat-scientific book the other day, wherein the author had generated some tables of dependent variable-vs.-(many instances of) independent variable; the results were photo-copies of his original printouts. Guess what the author used to generate the results and print-out: BASIC.
BASIC is a very good language, containing some very-high-level constructs which are extremely painful to access or generate in most any other language.
It is STILL, to this day, extremely fashionable to engage in BASIC-bashing, for NO good reason other than (a) it's not the language du jour; (b) its name; (c) "...it's not a structured language..."; (d) it contains the GOTO command.
Here are three real pieces of NEW information for all you effetes: (1) if you need a LANGUAGE to force you to write a structured program, you don't know how to program (cf. the book entitled "Structured COBOL"); (2) the GOTO command is a very powerful command, to be used only by those who know what they are doing (if the GOTO command scares you, then so does Linux's "dd" command); (3) "new" does not mean "better".
"...We are still trying to undo the damage caused by the early treatment of modularity as a language issue and, sadly, we still try to do it by inventing languages and tools."--David L. Parnas
"Originality is no excuse for stupidity."--Fred Brooks ("The Mythical Man-Month", et al.)
The comment of the main program in GWMAIN.ASM is interesting as it says who wrote the original MS BASIC and when:
COMMENT *
--------- ---- -- ---- ----- --- ---- -----
COPYRIGHT 1975 BY BILL GATES AND PAUL ALLEN
--------- ---- -- ---- ----- --- ---- -----
ORIGINALLY WRITTEN ON THE PDP-10 FROM
FEBRUARY 9 TO APRIL 9 1975
BILL GATES WROTE A LOT OF STUFF.
PAUL ALLEN WROTE A LOT OF OTHER STUFF AND FAST CODE.
MONTE DAVIDOFF WROTE THE MATH PACKAGE (F4I.MAC).
*
Ahh, memories... I learned GW-BASIC back in the 80s, on an IBM PC XT. Pretty good version of BASIC, from memory. But then, Microsoft have often done BASIC well -- I like QBasic too, and Microsoft BASIC, for all its faults, was considered the standard by which other BASICs were judged for a long time. I liked Microsoft's BASICs until Amiga Basic. I loved my Amiga(s), and when I first fired up Amiga Basic, I was excited that a major software company had developed something for my Amiga. Then I saw the half-arsed attempt they'd made...
I opened a random ASM file from the GitHub and OMG, the comments were referring to CP/M command-line parsing. I was working for Digital Research on CP/M internals when the IBM PC came out, and I vaguely remember hearing that Microsoft took their 8-bit 8080/8085/Z80 BASIC for CP/M and ran it through an automatic 8080->8086 code translator to bootstrap GW-BASIC. Obviously they did lots of work to it after the auto-translator: splitting code and data segments, and so forth (the 8080 didn't have segments at all).
I first encountered Basic on a Varian computer back in 1976; I was handed the manual and told to learn it (for a project to develop an optical visual inspection system - project a grid over an object and read the pattern via a camera one pixel at a time). Even with the Varian (I think the model was a V70) maxed out with 32K of RAM, it was never going to work - but it was a good learning exercise. Basic, to me back then, wasn't too far off the Fortran I'd learned at uni, and ZX Basic and then BBC Basic were natural developments.
Have to go Anon as the project was covered by the OSA and, even though you "ensign" the Act when leaving employ, you are still legally bound to say nothing about your work. A real pig when applying for CEng!