Precise time?
Why specify the program was first run at 4AM (May 1st 1964), without specifying the time zone?
Wanna feel old? Thursday marks the fiftieth anniversary of the invention of BASIC, the programming language that took the computing world by storm during the PC revolution of the mid-1970s and 1980s. A version of BASIC shipped with practically every home computer of the era, but the language actually dates back to 1964, when …
4AM?
Yeah thanks guys for setting a precedent that everyone from Steve Jobs to even the most sedate and boring sysadmin now expects programmers (and fixers) to be available all bloody night.
Example: Work 60 hours plus on a failing network in Birmingham. Take a well-earned break for a couple of days back home 200 miles away, drive early on Sunday morning to see the kids for once. Halfway home get a call to say something else has gone wrong, "and if you don't come back right now, don't bother coming back at all".
Such happy days.
Still, I love my job, innit.
>A friend describes it thus: "C is assembly language with syntactic sugar."
The other one is "C is portable assembler"
Like many of these cute phrases, it is wildly inaccurate.
Some of it does depend on which C you're talking about. K&R C was not much more than assembler in some ways, but ANSI C is a different beast.
My point was that you can do things in C with ease, such as bit-wise operations, pointer arithmetic, etc., that can be seriously dangerous but are also essential for some OS operations. It's the same in that respect as assembly, and unlike other languages that (for good reason) deny dangerous operations.
(* Seriously. Been there, done that. 6809 circa 1980.)
Oh, yeah, you and your fancy-pants index register with the arithmetic operations. When I was a lad, we had INX and DEX and were glad of it!
(6802, and we built the Data General Dasher D200 around it in 1978).
// Good times.
// We would have built a Pac-Man game into it if we had had enough spare RAM...one of my regrets.
re: (* Seriously. Been there, done that. 6809 circa 1980.)
I've done something like that. 6809 on a Tandy CoCo3, early 1990s. I had bought an assembly-language tutorial book at Radio Shack, but was unable to get EDTASM, which it was written for. Luckily, the book had a table of opcodes, as well as object code printouts in all its code examples, so I taught myself how to hand-assemble.
I did all that to try to get some semblance of speed out of the CoCo3's "hi-res" graphics modes (a whopping 320x200 in 16 colors). The stock HGET and HPUT blitter commands were way too slow (which is what happens when the original programmers' properly "structured" program calls 3 subroutines for each byte it's copying to the frame buffer). I also added transparency and horizontal & vertical flipping. And tile-based backgrounds. And double-buffering. I never actually built the game I wanted to make; I had more fun building the tools. (For certain definitions of "fun".)
>To an assembler programmer, C misses out a few pieces.
You can say that! Having worked in x86 ASM up to the 286/386, I moved across to C, which I found very easy, although frustrating to debug: the point at which the compiler gave up with a generic "missing semi-colon in line nn" was often many lines after the actual error, which was nothing to do with end-of-line semi-colons. I then wrote the x86 assembler for a C programmer's workbench. Things which standard C didn't handle were segmentation and all those lovely op-codes that Intel provided to support operating system activities (and which, it seems, even today aren't fully exploited by modern x86-based OSes); but then writing an efficient code generator was difficult enough as it was.
I remember how easy it was to teach my classmates to write a 'wait till lessons start and then beep continuously' script for the Micros sat in the class corner / corridor / teacher's office (stock room).
Unfortunately that got stopped when the same kids started putting rude messages on the screen - it was no longer a computer fault but an obvious prank.
There's the famous quote about how BASIC ruins programmers.
Personally, I think it probably brought more people to IT than any other programming language ever has.
Almost everyone I know in IT "of a certain age" used BASIC as a kid. Whether they're now a developer, a systems administrator, an architect, a hardware engineer or a support engineer. (Or whatever other IT roles we can think of.)
And through things like VBA, BASIC continues to be very much alive despite a plethora of newer options.
Maybe, over the next few decades, Python or Javascript will eventually supersede it. But it'll take one heck of a lot of work, and we'd lose BASIC. And deep down, I don't think people working in the industry actually want that...
Mark #255 "data structures" in BASIC
One can emulate slightly more-advanced data structures, e.g. doubly linked lists, or just about anything else, within BASIC's simpler data structures (e.g. string arrays). It's a neat trick that not many people seem to know about.
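For the curious, it only takes parallel arrays. A minimal sketch in generic MS-style BASIC (the array names and sizes here are mine; 0 means "no link"):

10 REM doubly linked list held in parallel arrays
20 DIM V$(100), NX(100), PV(100): REM value, next index, previous index
30 V$(1)="A": V$(2)="C": NX(1)=2: PV(2)=1
40 REM splice node 3 ("B") in between nodes 1 and 2
50 V$(3)="B": NX(1)=3: PV(3)=1: NX(3)=2: PV(2)=3
60 REM walk the list forward from the head
70 I=1
80 PRINT V$(I)
90 I=NX(I): IF I<>0 THEN 80

Insertion and deletion are just a few assignments, with no data shuffling - which is the whole point of the trick.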
So what date was the first BASIC interpreter fired up? And how does that compare with the first interpreted language? I only ask because Sinclair BASIC was unbelievably easier than compiled COBOL on punch cards, or FORTRAN on paper tape. Reducing the edit-run-fail cycle from days to seconds changed everything.
I started with BASIC on 16-bit DG Novas and DEC PDPs; it looked quite good after using FORTRAN. Now, after learning C and SQL, I can still write FORTRAN code in most languages.
BASIC was a really useful and inexpensive way to implement instrument and machine control - a few lines in QuickBASIC allowed you to open a serial line, write its output into a file, and send input back. I know that some power and utility company applications are still out there running this code under DOS.
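For flavour, something like this would do it in QuickBASIC - a hedged sketch, since the port settings, file name and the reply convention here are all my own invention:

' log a serial instrument to a file, sending a reply when prompted
OPEN "COM1:9600,N,8,1" FOR RANDOM AS #1
OPEN "CAPTURE.LOG" FOR APPEND AS #2
DO
    IF NOT EOF(1) THEN
        LINE INPUT #1, R$                     ' read one line from the instrument
        PRINT #2, R$                          ' append it to the log file
        IF R$ = "READY?" THEN PRINT #1, "GO"  ' made-up handshake, for illustration
    END IF
LOOP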
I think it was substantially more than a simple port, beyond the first incarnations anyway. The Dartmouth MAT statements for matrix manipulation never made it to the MS interpreter, while a few things that were useful on the microprocessors of the day, such as the PEEK and POKE instructions, came in.
Dartmouth BASIC on punch cards was great - you could actually shuffle your card deck before reading it in and it ran just fine thanks to the line numbers - eat your hearts out FORTRAN/COBOL/C coders!
> Actually they didn't. They (Bill Gates and mostly Paul Allen) simply ported Dartmouth BASIC.
It has been claimed that MITS BASIC was a port of a public-domain BASIC interpreter that ran on the DEC-10 and for which the source code was available. It certainly used DEC-style syntax. All 8080 development at the time was done on DEC machines until CP/M became available, and the 8080 cross-assemblers were somewhat compatible with DEC-10 assembler, so it would not have been difficult to port - though the floating point was quite different, which is why they needed Monte.
http://www.vanwensveen.nl/rants/microsoft/IhateMS_1.html
I remember coding in Locomotive BASIC on the Amstrad CPC 464. It was great, and I also learnt how to use GW-Basic and QBasic.
As I learnt a lot about programming with this simple language, I decided to use that idea 25 years later to teach students at college how to program, but using Python. Apart from the for loop, Python was a really good way to teach programming to absolute beginners.
"As I learnt a lot about programming with this simple language, I decided used that idea 25 years later to teach students at college how to program, but by using Python. Apart from the for loop, Python was a really good way to teach programming for absolute beginners."
www.codecademy.com/tracks/python
This is now my go to link for when somebody asks me how they can learn to program. The course is well written and the interpreter runs in the browser. I've seen kids writing programs after only a few days and I'm not talking about the usual "Hello World", I'm talking functions and flow control. A teenage son of one of my relatives was even writing passable object orientated code after a week.
On topic: I've always liked the look of BASIC, but I missed out on the age of micros and grew up with the IBM PC clones. When I got interested in coding at college they taught us Pascal. I'm not sure why; presumably because it's what the lecturer knew best.
Pascal was created as a teaching language. Its prime goal was to be highly structured, with a very concise syntax that encouraged students to think in the way that matched the good programming practices of the time (highly structured, functional and procedural programming). It generally succeeded in these aims.
It is quite clear that someone who learned Pascal could convert to other scientific languages (like Fortran or Algol) relatively easily, and I know lots of people who moved to C with little difficulty.
But as a language, it was strongly disliked by students. Because of the precise syntax and strict type checking, it was a very pedantic language to write in. In other languages at the time, you might get a successful compile but have a completely broken program because of a syntax error that had slipped through.
Now Pascal would never force you to write programs that worked, but it would protect you from some of the pitfalls that other languages might allow. But the repeated compile/fix cycles without a run caused many colourful moments in the classes I was involved in. I'm not sure whether that was preferable to a compiler incorrectly attempting to fix simple errors, like PL/C, the PL/1-subset teaching compiler in which I learned formal programming.
The other drawback of strict Pascal implementations (and here I am explicitly excluding Borland/Zortech and other 'extended' products) was that there was comparatively little support for some operations needed to cope with real-life problems. Files with multiple record types were complex (you had to use a construct like variant records), and the very strong data typing did not have the equivalent of a cast operation (I'm still talking strict Pascal here), which made some of the tricks you do in other languages difficult or impossible. There was also no variable-length string construct (only fixed-length character arrays), and as a result almost nothing you would describe as string operations. This meant that you quite often had to code some comparatively simple operations yourself. And there was no form-filling or screen handling at all; but at least that was not unique to Pascal. Almost none of the high-level languages of the time had that built into the language itself (it was normal for these to be added by library routines, the most obvious example being Curses for C).
This meant that kids who learned BASIC on 8-bit micros at home regarded Pascal as a backward language that restricted what they could do, whereas people from a formal teaching environment regarded it as very good language for precisely the same reason!
The other reason kids had difficulty with any compiled language was the fact that it was not interactive. The whole compile thing compared to just running it seemed wrong to them.
But as a language, it was strongly disliked by students.
Frankly, this is still the case these days. And judging from cow-orkers, the people who go into computers would like to be able to write working programs with the same attention and care they put into writing tweets. Frankly, they are in the wrong domain.
The whole compile thing compared to just running it seemed wrong to them.
Of course, and this is why REPL-supporting languages are good.
A cogent explanation. I did pick up Pascal reasonably quickly and learned enough to pass the course. However, as a new programmer I found it rather 'fiddly'. I wouldn't say it completely put me off programming, but it did make rapid prototyping difficult, and that certainly dampened my enthusiasm. As I went off to university to study graphic design, the only real coding (well, scripting) I did for years was Javascript for web development. Then I got into games development as a hobby and taught myself C++. I probably couldn't write a Pascal program today (can't remember a lot of the syntax), but all the arsing about it made me do with data types definitely gave me a head start with C++.
>The other reason kids had difficulty with any compiled language was the fact that it was not interactive. The whole compile thing compared to just running it seemed wrong to them.
This sentiment could be applied to computing professionals working with Ada where getting some code out of the multi-pass compiler, rather than error and warning messages, was a cause for celebration.
Yup, I never needed to GOTO/GOSUB anything in BBC BASIC. Procedures all the way.
Hehe, once wrote a simple program that pretended to be the command prompt on the school ethernet network, then sat back and watched the teacher try to log out, as it responded to most common commands appropriately, but logout and some others generated false error messages.
Think I have my school project on a 5 1/4 inch disk somewhere...
Was probably an Econet network. And the login screen was in BBC BASIC anyway. We told users to do a <Ctrl>-<Break> before logging on to prevent this type of thing.
I ran a Level 3 Econet network back in the 1980s. If only the security had been better enforced (the concepts were good, but it was trivial to get around), it would have been a great low-cost network for file and print sharing. But there was no concept of privilege mode in the BBC Micro OS, so it was simplicity itself to set the bit in the Econet page in memory to give yourself privilege at the network level. And once you had this, you did not need a user's password to be able to get at their files.
Still, I suppose that you can't have everything in a single-user 8-bit micro. But I agree, the BASIC was good, with the exception of it not having a while-wend construct.
10 PRINT "LUKE SMELLS OF POO!"
20 GOTO 10
Then, as my skills developed, I could do this in larger text size, cycle the screen border through many bright colours and cycle the text colour as well. I'm afraid that's as far as I got with programming, before going back to playing games. Apart from a brief course in C++ 20 years ago.
You got further than most of us!
My mates and I'd walk into Boots, type:
10 print "SHIT! ",
20 goto 10
Run out the shop very quickly!
The comma after the quotes would force some BASICs to print the same thing to the end of the line, then loop back to the start like a typewriter, so the screen would fill up with the chosen offending word for the day as opposed to the naff down-screen scroll!
More fun with the Oric that had ZAP, EXPLODE, PING and SHOOT sound effect commands, so you could REALLY cause a nuisance and draw attention to it, LOL.
Of course that really did call for a bit more sophistication - an initial delay loop of a minute to give you time to retire to a safe distance from which to observe the commotion!
When there were display ZX81s in WH Smiths, I would type a REM statement into the first line of a program, using the Sinclair special block characters to hold a small piece of machine code. The code put a different value in the Z80's I register, which Sinclair re-purposed to point to the page number of the first page of the character generator table in the ROM. I would then call the code from the relevant address, often adding it to whatever program was already loaded before running it.
What this resulted in was a screen of garbage. You could see that there was something there, and it would respond to all of the right commands like LIST, but the text was unreadable. If I remember correctly, the funniest thing was to put in a value that was the base page of the character table offset by one. This had the effect of shifting the displayed characters along by a number (32?) of characters, so the result was effectively a block-shift cypher of the program.
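For the record, the whole thing is only five bytes. A hedged reconstruction from memory (on the ZX81 the body of a first-line REM starts at address 16514, and the I register normally holds 30, the high byte of the ROM character table):

1 REM XXXXX
10 POKE 16514,62: POKE 16515,31: REM LD A,31 - one page past the real table
20 POKE 16516,237: POKE 16517,71: REM LD I,A
30 POKE 16518,201: REM RET
40 RAND USR 16514

The exact flavour of garbage depends on the value loaded into A; 31 points one page (32 characters' worth of 8-byte bitmaps) into the table, giving the off-by-32 shift described above.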
Some tricks for shop computers I used to employ were:
1) A POKE to the VIC-20 video chip to change the screen width from 22 columns to 23 or 21. This would cause anything on the subsequent lines to be offset left or right by one extra character each line. Really annoying for the shop assistants who couldn't figure out why their program listings were going diagonally down the screen.
2) Running a simple loop to copy the CBM-64 ROM into the underlying RAM, a POKE or two to keep it there, then a few more POKEs to change the SYNTAX ERROR text (now in RAM) to something far less pleasant. Then I'd stand back to watch kids (usually) cause an error and ask their unsuspecting parent what that meant!
3) Enter a short program into a Spectrum that changed the screen to 2 shades of blue, print "COMMODORE 64 BASIC V2", "64K RAM SYSTEM 38911 BYTES FREE" and "READY" with a loop to flash a cursor. That got lots of people confused! I once even convinced someone that a Spectrum had had a CBM-64 ROM installed in it by mistake!
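Trick 2 is easy enough to reconstruct. A hedged sketch (the banking POKEs are the standard ones; I've deliberately not guessed the address of the "SYNTAX" text):

10 REM copy the BASIC ROM (40960-49151) into the RAM underneath
20 FOR I=40960 TO 49151: POKE I,PEEK(I): NEXT
30 POKE 1,PEEK(1) AND 254: REM clear LORAM - BASIC now runs from the RAM copy
40 REM now POKE ruder bytes over the error-message text in the RAM copy

On the C64, writes always land in the RAM under a ROM, so line 20 works even with the ROM banked in; line 30 flips the 6510's port bit to switch the copy in.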
One of my circa-1980 hacking tricks was padding BASIC lines (multiple BASIC commands allowed per line, of course) with a REM and placeholder characters ("/") equal in number to the length of the real code, followed by fake code as the rest of the REM. Then I executed a machine-code routine that replaced the placeholder characters with ^H. The resultant BASIC would look like whatever I wanted, while the actual code was hidden by the ^H.
It became trivial to do things like this:
LIST
10 PRINT "Hello!"
RUN
Good bye!
The actual code was thus:
10 PRINT "Good bye!": REM ///////////////////////// 10 PRINT "Hello!"
where the '/' got replaced by the ASCII of ^H, thus instantly backspacing over the real code upon LIST.
This made almost anything possible and confused the uninitiated.
I sometimes suspect that the same ^H in Source Code trick could still be done in some environments. Beware.
once upon a time,
no matter what computer I was on, I could knock up a basic bit of BASIC to do something I wanted done,
then the 80's came along, and basic was BAD
so now I have to know and use, TCL, Python, VBasic, C in all its forms, Jscript, and a few more,
and I'm a hardware developer just trying to use the computer..
Yes I have a fond heart for BASIC; like all languages, it can be used wrong,
one place I worked at used a fortran 77 program they had written to word process!
and another had a cobol-based spreadsheet-like program,
but basic, its still in my heart and soul,
youngsters today just don't learn to program, but then, neither did we; we just had fun making things bleep and pixels flash, and that doesn't do it for the kids nowadays.
My little girl came home the other day talking about the Small Basic stuff they'd been doing at school that day, I'd never heard of it before then.
It's only got something like 20 keywords in total, and the interface is really simple, so kids can get started quickly. It has one data type, with simple I/O and graphics-handling libraries built in. I spent quite a few hours playing around with it, writing silly utilities and games! A good laugh!
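For a taste of it, a couple of lines of Small Basic really is a whole program - a hedged sketch using two of its standard library objects:

' print a greeting and stair-step some text down a graphics window
GraphicsWindow.Title = "Hello"
For i = 1 To 10
  GraphicsWindow.DrawText(i * 15, i * 12, "Hello from Small Basic!")
EndFor
TextWindow.WriteLine("Done")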
I'm pretty sure that 'Beginners All-purpose Symbolic Instruction Code' is a backronym. Allegedly BASIC was originally styled Basic, and it didn't stand for anything other than what it was - a basic programming language. I can't remember where I read that, and I concede that it may be bollocks.
1000 GOSUB PutFlameProofPantsOn
1010 GOSUB BeAPedantOnTheReg
1020 IF YouAreWrong THEN END
The rest may be history, but there’s an important footnote to that history.
By the mid-1980s, John Kemeny and Thomas Kurtz began to despair of what was happening to BASIC on the microcomputer. In particular, the limited memory and processing power of the computers at the time had led to a fairly naïve programming language, lacking many of the structures and features of the more modern languages.
Meanwhile, back at the ranch, BASIC had been developing into a more sophisticated product. Kurtz & Kemeny implemented and marketed a more advanced version called True BASIC, a structured programming language which, among other things, dispensed with line numbers and the dreaded GOTO statement.
You can, of course still rewrite the title of the article as:
FOR i = 0 TO 1 STEP 0 : PRINT "Happy 50th Birthday, BASIC" : NEXT
Also, fail for not alluding to Bill's Butthurt Missive about people copying his precioussss paper tapessss of Basic.
Already back then, the "YOU BE STEALING MUH INTELLECTUAL PROPERTY" meme was prevalent.
"Wanna feel old?" Heyyyy, I resemble that remark (Hi, Kingsley). I don't think BASIC is a "backrony", as back in the old days (50s, 60s) computer languages were "acro-named" after their purpose, e.g.
COmmon Business-Oriented Language, ALGOritm Language, FORmula TRANslator, etc. There were others, SNOBOL (not sure what it mapped to), LISt Processor. I think Pascal was the first language to show its Wirth without an accompanying (justifying?) acronym, shortly followed by Ada and Babbage (just kidding there...strong typing, weak typing, touch typing...).
Then "a", "b", "c", `c+, c++ Seems they hit a wall before "d"...kind of like the energy efficiency ratings that used to end at "A", but now we have "A+", "A++", "A+++"....looks like they painted themselves into an alphabetical corner...
And who could ever forget Programming Language/I? And plugboards... that wasn't just close to the hardware, it WAS hardware... ahhh, those were the days...
Someone (here in Germany) said to me that the reason IBM called it "Batch" processing is that that's the sound the card deck makes when you drop it and it hits the floor....unnumbered, of course....
Here's to beer-soaked cards, card sorters, and the good old 029 punch....and, erm, extra class registration cards mysteriously allowing 26 people to show up in a EE class limited to 12....
> Then "a", "b", "c", `c+, c++
There wasn't actually an 'A'. 'B' (the forerunner of 'C') was derived from BCPL: Basic CPL; where Basic was used in the sense of reduced rather than being related to BASIC and CPL was 'Combined Programming Language'. CPL was designed to be a combination of ideas from APL (Atlas Programming language), ACL (Atlas Commercial Language) and others.
The Basic I remember using on my first job was, err, challenging:
- limited memory footprint, although separate program files could be 'chained' together to run sequentially (each one overwriting the previous one in memory)
- comments counted against said memory
- variable names limited to 2 characters (plus trailing $ for arrays, iirc)
- return n statement that returned control to n lines before/after the line with the matching call
I have a vague memory of using a paper grid (letters x digits) to help keep track of which variable names had been used (command-line editor only), but I think (hope) I must be imagining that.
All in all it was an inspired choice for the financial applications we were developing.
BASIC was a RETARDED language. It was a toy language created for writing toy programs on toy computers. All of the variables were global. It didn't support any of the features of a real language, like pointers. You could use it to write a small program but it couldn't be used to write structured programs. If you tried to write a large program you got spaghetti code. The syntax was crap. And it was interpreted. I got my first big break in programming by re-writing a BASIC program in C for my potential employer. My C version ran in one percent of the time and I was hired. C was available at the same time as BASIC and is still the backbone of modern programming while BASIC has (thankfully) been cast upon the ash heap of history. I am so glad that I learned C instead of BASIC and that my exposure to it was limited because it sucked.
"All of the variables were global."
All the global variables are global, all the local variables are local, don't complain just 'cos you used a crap version of BASIC.
"It didn't support any of the features of a real language, like pointers."
Versions of BASIC that didn't have pointers didn't have pointers. Don't complain just 'cos you used a crap version of BASIC.
"You could use it to write a small program but it couldn't be used to write structured programs."
Versions of BASIC that didn't have structures didn't have structures. Don't complain just 'cos you used a crap version of BASIC.
"If you tried to write a large program you got spaghetti code."
Versions of BASIC that didn't let you write modularised code didn't let you write modularised code. Don't complain just 'cos you used a crap version of BASIC.
So true. Of course early limited versions could be misused; and of course the best was QL SuperBASIC, where the following could be done:
for f = 1,3,-7,15 to 12 step -1
next f
It was fully procedural, and I wrote a SuperBASIC-subset compiler in it (integer arithmetic only) for my A levels which was faster than the commercially available Supercharge compiler.
Not only that, but goto exists in C, so you can, and people do, write spaghetti code in C. And what is it with open source stuff where people insist that all variables be a maximum of 2 characters long?
QL rules, yeah (except that useless graphics layout)
jake, is that you?
I am so glad that I learned C instead of BASIC and that my exposure to it was limited because it sucked.
Your passport to leet elitism, sir. Valet parking included!
Ok, so did anyone own the
Power C compiler, or any other C compiler fitting into the 64 KiB (or 32 KiB) of RAM of those times?
"Due to the limited memory available in the C-64, the compiler may not be able to handle some very large source files. It is unlikely that the Power C 128 compiler will run into insufficient memory problems and not be able to handle the large source files"
> C was available at the same time as BASIC
Not true. K&K BASIC was 1964 (see article), C was nearly a decade later, initial internal versions were around 1972.
On micros BASIC was around 1975, it wasn't until 1980 that a subset of C was available (Dr.Dobb's Journal May 1980 "A Small C Compiler for the 8080s"). Full C compilers for micros were several years later (Aztec, Lattice).
The fact is that improvements to BASIC that allowed it to do "sophisticated" things like declare local variables, use pointers, be compiled, etc. arrived in a piecemeal fashion across a whole bunch of implementations during the 1990s. This contributed to BASIC becoming a hodgepodge of hundreds of different languages. The fact is that GOSUB NN and other BASIC syntax is incomprehensible. I'm not bitter; I'm very grateful that I never had to program in BASIC, except for my stint converting BASIC programs into C to improve their performance.
Also, I misspoke about getting my big break in the industry. Actually, I ran the company owner's interpreted BASIC program on a data sample he gave me and gave up after 45 minutes. My C version ran the whole job in 20 seconds, so it didn't run in 1% of the time, it ran in less than 0.0074 of the time. He told me that the program would complete its task in an hour, so it was probably more like 0.00555... as much time.
"It didn't support any of the features of a real language, like pointers."
In 1977, on the TRS-80 Model 1 with Level II BASIC, VARPTR returned the memory address of the specified variable: a pointer. Few people understood what it was for or how to use it, but it was a pointer and very, very useful, although most of those few who did use it only used it to POKE machine-code routines or graphics chars into string variables.
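A hedged sketch of that classic use, from memory (Level II's VARPTR on a string returns the address of a three-byte descriptor: length, then the address of the text itself, low byte first):

10 A$ = STRING$(4, 0): REM reserve 4 bytes to hold machine code
20 D = VARPTR(A$): REM address of the string's descriptor
30 AD = PEEK(D + 1) + 256 * PEEK(D + 2): REM address of the string body
40 REM now POKE the machine-code bytes into AD, AD+1, ... and call them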
"You could use it to write a small program but it couldn't be used to write structured programs. If you tried to write a large program you got spaghetti code."
Even the early BASICs had at least GOSUB, so it was entirely possible to modularise your code. Many had defined functions (DEF FN()), and later procedures made an appearance, both of which use local variables. All of this was 30 years ago, when I was using BASIC to learn programming.
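For anyone who never met DEF FN, it looked like this in generic MS-style BASIC - the parameter X is local to the function, which was about as close to scoping as many dialects got:

10 DEF FNF(X) = X * 9 / 5 + 32: REM Celsius to Fahrenheit
20 PRINT FNF(100)
30 PRINT FNF(0)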
@BBG "...didn't support any of the features of a real language, like pointers."
IIRC, my Radio Shack Color Computer (MS-BASIC, circa 1980) *did* support pointers, in the sense of a variable (or variables) pointing or indexing into an array (or a multidimensional array). Once you have that building block, you can do *anything*. Unlike the LOGO language which did have huge gaps, the BASIC provided with the CoCo could do *anything* you could imagine. It was completely complete. The *only* missing command was that the joystick buttons had to be read with a PEEK command; but you had the PEEK command so that made it 100.0% complete.
What do you really expect from a language designed for teaching as a first language in introductory computing courses? When I first encountered it back in the mid-70s, it enabled 14-15 year olds with no programming experience to start learning the basics of computer programming; those who showed an aptitude for programming quickly moved on to Fortran IV.
By the way, there is nothing wrong with interpreted code, in the right environment. The LivingC development environment (1985-89) was built around an interpreter for the C language. By using an interpreter it was able to provide many useful aids to the developer, specifically enabling them to visually execute and debug their code logic at the C language level, even in the absence of supporting functions and libraries (LivingC automatically created relevant stubs, letting the programmer see the parameter values supplied and input suitable return values) - something that had not been done previously in a C programmer's workbench.
I learned BASIC from reading the keycaps on a ZX Spectrum Plus. It got me into programming. I went on to learn C. Then when emulators took off I taught myself Z80 assembly language. When I went back to university I learned Java. I think BASIC did what it set out to do, which was to give non-technical people an easy entry point to get computing tasks done. I think the modern equivalent is probably spreadsheet programming. I don't think it's as rewarding though.
My statistics teacher in high school let me use it to write simulators for some problems we couldn't solve theoretically. The following year they offered a class programming in BASIC. I eagerly signed up, not realizing I'd already learned more than 90% of what they'd be teaching, and most of what I hadn't learned was fairly pointless, at least as presented (DATA statements for the program without actually reading data from an external source). I do however recall one very simple and highly instructive assignment: a simple cash register program where you'd input some prices and the cash tendered, and calculate the change. Every one of us went straight to floating point for our numeric input, and none of us checked for errors. And then the teacher put in 4.95 for the price of the item and 5.00 for the money tendered.....
I'm not quite as old as BASIC, but that was a lesson I've carried with me ever since that day in class.
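The gotcha, for anyone who hasn't hit it: neither 4.95 nor 0.05 is exactly representable in binary floating point, so a naive version prints something like .0499999 rather than .05 (the exact output varies by interpreter). A sketch of the bug and the usual fix, in generic MS-style BASIC:

10 INPUT "PRICE"; P
20 INPUT "TENDERED"; T
30 PRINT "CHANGE:"; T - P: REM often .0499999... in single precision
40 REM the fix: work in whole cents instead
50 PRINT "CHANGE:"; CINT(T * 100) - CINT(P * 100); "CENTS"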
Good article. There is no doubt BASIC is groovy and quick to learn. Coupled with colour and sound it made the first home computers into attractive and interesting items.
True, it lacks structure and teaches bad habits, but so do stabilisers on a bike. If it had been Pascal, and not BASIC, nobody would have bothered to type in 1-line programmes in Laskys/Comet, or learned to program.
Because the original version of BASIC was so bad, various dialects were created to add the features of a real language. These were added in an unorganized and uncoordinated fashion, resulting in a Tower of Babel of BASIC dialects, see:
https://en.wikipedia.org/wiki/List_of_BASIC_dialects
What a nightmare to program in a language with hundreds and hundreds of incompatible dialects! Good bye BASIC. Don't let the door hit you on the way out.
Somebody correct me if I'm wrong, but how much better is the situation in a "more respected" language like C? The core C language is very portable, sure; it's lean and consists of only a small set of constructs. But guess what... when you actually want the machine to do useful things like input/output and graphics, you have to link your code against libraries, and how standardized are those? Can I take a complex Microsoft C application and have it run on other architectures without modification? Does this happen in other languages too?
...until you were FORCED to use the LET statement. The 'original' Dartmouth BASIC required the use of the LET statement, and if you had ANY upbringing in other languages, Fortran for instance, you always forgot to put it in. Bummer.
The original BASIC described in the blue pamphlet (I actually got to see one!) didn't have matrix operators; those were added later. It was quite nice while it lasted, and when the various microprocessor versions came out, you could usually do the small problems that you'd whipped up in the preceding half hour. That is what the language was designed for.
Of course, the promoters ALWAYS included a couple of games (Blackjack was a personal favorite) to keep you busy and entertained. The biggest feature was the interactivity. On an ASR33 teletype, you got immediate feedback if your program had problems or needed editing. At 10 characters/second, you did a lot of checking before you typed "run".
Heady days. Of course the caveat applies here: "If you remember the 60's you weren't really there".
...according to the adverts at the time, powerful enough to run a nuclear power station, until someone knocks the table it's sat on, the RAM pack wobbles and the computer crashes, leading to the inevitable China Syndrome and 100-mile circle of death.
All hail the Big Blob of Blu Tack, saviour of mankind...
...set my mind off trying to adapt it to play BEEP the musical accompaniment in full 8-bit ZX BASIC mono glory, whilst still being reasonably synchronised to the text display.
And just typing that made my mind jump to what would be required to emulate a 3-channel sound from the same single-channel audio system...
that my dad bought me an illustrated 3-book series about computers when I was about 8 yrs old. I read about the inventors of BASIC, about the first mass-produced computer (the Altair, if I spell it correctly), about Jobs and Wozniak, about Apple and Microsoft, and about the movement against proprietary software. And now, 22 years later, I read about the same things! (At least nothing in that book mentioned an Internet.)
First programming language I managed to learn, as I recall at the age of ~12, on my ZX Spectrum. I still remember the excitement when the first longer piece of my own ran without an error... :)
...then I'm pretty sure the next thing I did was either loading some game from cassette (most likely copied from someone else, as you couldn't buy stuff like that behind the Iron Curtain back then; we had to buy the Spectrum itself in Vienna) or starting to type in the next BASIC game from the latest Spectrum magazine. =)
Lunar Landers on PDP-8, Coding in machine language, loading code via toggle switches, STOP, RESET, PROGRAM LOAD.
Then the migration to high level assembly language. It was all great stuff. Then along came BASIC with LET X = 10;
What do you mean LET?
Where did X come from?
What register is that or what memory location is being used - certainly nothing I declared.
Boy those were troubling issues in the day...
They licensed a 6502 version of Microsoft's BASIC, but decided it wasn't fit for purpose. It certainly wouldn't fit into the 8 KB cartridge the spec demanded. And they needed a BASIC now.
So they got Shepardson Microsystems to write a new one. It did fit, and they did it so quickly that it was finished before the contract was signed.
"Flash forward to 1975, when electronics vendor MITS paid two young programmers to create a BASIC interpreter for the Altair 8800, an early microcomputer. Those coders were named Paul Allen and Bill Gates"
Based on source code originating from "Decus Basic" on a PDP-8 while Gates was a student at Lakeside School ..
http://mike.dos.ru/docs/Bruce_Montague/open_source_license.htm
"The best way to prepare [to be a programmer] is to write programs, and to study great programs that other people have written. In my case, I went to the garbage cans at the Computer Science Center and fished out listings of their operating system."
http://www.salon.com/2004/03/19/programmers_at_work/, Susan Lammers, April 1986
“There’s this wonderful outpouring of creativity in the open-source world,” [virtual-reality pioneer Jaron Lanier] said. “So what do they make — another version of Unix?”
[Jef Raskin, who created the original concept for the Macintosh] jumped in. “And what do they put on top of it? Another Windows!”
“What are they thinking?” Lanier continued. “Why is the idealism just about how the code is shared — what about idealism about the code itself?”
Having everything ass-backwards, even 10 years ago. Criticizing what you pretend to stand for.
I hate these people so much.