I remember it in the early 90s
I was at college then and used Turbo Pascal there. I also remember JRT Pascal for CP/M. I loved Pascal, it was so much easier than C, and I finally learnt to get out of the goto habits I'd picked up in Basic. Happy days!
Pascal, a descendant of ALGOL 60 and darling of computer science courses for decades, turns 50 this year. For engineers of a certain age, Pascal was hard to avoid in the latter part of the last century. Named for 17th-century French mathematician Blaise Pascal, the language is attributed to Swiss computer scientist Niklaus …
My uni recommended MS Quick Pascal. Even though it was version 1.00, it was actually a pretty good product. Used for a few years from 1990, then went with TPW when 1.5 released. Switched to C++ after that, but will always remember QP's 80x25 IDE; even had MDI with moveable windows.
Turbo Pascal is what I cut my object-oriented teeth on. I had taught myself BASICA (like everybody else, back then) and it was clunky but okay, but programming in Turbo Pascal just... felt right. Like it was what real programmers do.
Then I went to uni and got smacked across the head with Assembler, C++ and Lexx & Yakk. That put some sense into me.
But I think I'll always prefer Turbo Pascal.
I also remember Turbo Pascal from the 90s and college. Recently found my old Lottery Number Picker from late 90s that I'd written in Pascal. Unfortunately not the source code.
My story of Pascal, for me, was interesting. I was never a good programmer. I enjoyed it but struggled to get my head round it, still do.
I wrote a long-winded routine for the lottery program; my lecturer looked at it and managed to shorten the routine massively.
I discovered how to write to a file one week, so I then wrote, while at college in the Windows 3.11 days, a sniffer program. During my programming lessons, and in the study periods when I used those machines, I'd noticed that when students booted them, they'd forget to switch to the network drive, say the F: drive, and would type 'login' in DOS on the C: drive; login not being there, they'd get an error, and then they'd switch drives.
So I wrote my version of login and would stick it on the C: drive to see if it would work and if people would fall for it. I didn't know how to mask what you typed, so instead of * you'd see your password, get a bullshit error I'd display, realise you were on the wrong drive, switch to the correct drive and log in as normal. Meanwhile my login program had grabbed your username and password and stored them in a file called assignment.doc, because students also left their documents all over the C: drive.
Again, not being a good programmer, the details would be written in plain text. If you found the assignment file, you'd be able to read it and realise your machines were compromised. I genuinely only wrote it to see if it would work. I was amazed it did. I still remember one password it grabbed that worked: 'masterofpuppets'. We logged in as the user once and quickly logged out again. I told my college friends, who were also amazed by it, to NEVER ABUSE IT. I left it at that. Never used it after that one account I got into, and never did anything bad with that account. I also warned them: if you abuse it and get caught, I had nothing to do with it.
So the day came when the dicks were messing around with it. Lucky for me, I was bunking off that day as I couldn't be bothered to go in. There was also a stupid cartoon animation tool on the machines that one idiot of the group of friends would use to make piss-take cartoon animations of the lecturers. The rule was, if you got caught pressing the reset button on a machine, you were up to something. They got caught that day. Got caught with the login program on them, and the cartoons.
It all kicked off. They were pulled into interviews. It was serious. We had a big meeting in the hall over this login program that was found on the machines: that it was illegal, blah blah. That meeting was to announce that two students had been kicked off the course and one given a suspension. I escaped. Lucky for me, they kept quiet over where they got it and who wrote it. No one ever found out it was me. I carried on at that college for another 4 years on 2 different IT-related courses. I'd learnt my lesson and didn't do anything like that again there.
In their interviews, they told me, the lecturers asked 'Who created this program? It's very well written', blah blah. And this is why I remember it so well. I said they were talking bollocks; they were saying that to see if you'd confess to writing the code. How did I know they were talking bollocks? Because the code I'd written was in the Pascal help file :o) I'd just added the text the college displayed when logging in.
Some years later, while my cousin was at Leeds Uni, I told him this story. He said they used a similar system: could I write the program for him? Still being naive, I said sure, partly because I'd been reading 2600 magazine. In that they had a piece about Pascal and how to do very basic encryption when writing to files. Ooo, I could add that to my sniffer program. All it did was: you'd type A, it would add 20 to the ASCII value and write that to the assignment.doc file. I'd stick other random crap in the file too, so if you found it you'd just think it was a corrupted student assignment file, delete it and think no more of it. You'd take the file home and decrypt it with the decryption program, which simply subtracted 20 from what was in the file. I never found out if he ever used it or not. He became a Doctor, so if he did use it, he never got caught.
...for any real software engineering.
I rewrote yards of it into C so that I could actually get code to work. It was so strongly typed that reading two bytes of a comms stream that might be a bit field, two bytes of data or a particular error code was almost impossible without creating a union of every possible thing it might be.
And if you were stuck at the coal face, nested 15 levels of subroutine deep, and you encountered the sort of 'line's dropped on you mate' error, the inability to 'go directly to jail without passing GO' with a longjmp() produced a nightmare of messy code.
Academics loved it. Engineers avoided it.
Of course the fashion now is type overloading, leading to an equal mess.
Academics should be banned from writing languages
Pascal certainly was not useless. We wrote a SCADA system in Pascal on RSX in the early eighties. One problem was people with a mathematical background who looked at the typing system and thought 'magical', forgetting that real compilers are pragmatic things and no, you couldn't declare more than 1500 subrange types (IIRC). One for a value from 0..n, another for 1..n+1, another for 0..n+1, another for 1..n+2 etc etc. For every possible value of n in the system.
One real problem we had was compiler speed, or rather lack of speed. It took ages to compile the system. I remember we worked all one day solving a bug and eventually figured out the cause and investigated the compiled code and discovered where a one-bit change was needed. So we patched it and went to the pub. (and yes we did eventually recompile without the patch)
Used Pascal in HE and then Turbo Pascal on Z80 CP/M based laptops and handhelds for many commercial systems in the late 1980's and most of the 1990's.
For some of the systems I did, look up the Epson PX4 (F1 timing systems, Gallup pop chart, Tie Rack and most Pharmacists ordering systems), PX8 and HX20 (even had a voice interface!)
Also the Epson EHT (touch-screen handheld used to load-balance Concorde luggage, although that application was in C on DOS!)
[ICON: as there was only me and John Franklin of Epson who knew the ROM disassemblies inside out.]
I have a PX-8, with thermal printer, ramdisk, acoustic coupler and the fairly rare PF-10 floppy disk unit. It's an awesome piece of kit, but absurdly overengineered --- three processors! Integrated microcassette! An intelligent ramdisk! A serial floppy interface that's only a bit faster than the Commodore 64! Multiple batteries (including one soldered to the motherboard which I had to remove)! An internal DC power regulator which can't provide enough current to run the machine (it will _only_ run off battery; the DC port is only there to charge the battery)!
Sadly I managed to fry the PF-10 performing a batteryectomy (yes, it had another soldered-on internal battery, which was leaking). I'm slowly working on fixing it. I've got to the point of running programs on the internal 6303 and am trying to figure out why the stock firmware won't work any more.
If anyone's interested, here's a video of me loading a program on the PX-8: http://youtube.com/watch?v=S3MARL-F8NI
A company I was with for the first half of the 90s had a pretty major application written in Turbo Pascal.
When we got this newfangled thing of a network (Netware 3 IIRC), that team was trying to access network drives but there was no way to identify them as such from within TP (at least as far as that team was concerned).
I was doing a lot of C at the time (still am for that matter) and so I wrote a linkable function that accessed the list of lists that could determine the type of storage and the drive letter (local, removable, network). I was the toast of the team (well, for a short time - after all, that was (sniff) just C).
I will note that a pretty major ECAD package (electronic design) was originally built with TP and is now built using Delphi.
There were pitfalls, of course; I once wrote a fairly small piece to parse credit card details from smartphone data which worked perfectly - one of the Pascal people did the same, but it choked on occasion with an incorrect record (too short as I recall).
The record lengths were fixed and the bank involved insisted that the records had no delimiter to re-sync should a record be wrong. Easy to enforce in C [using fwrite()] but quite difficult in Pascal.
You want to talk about strong typing in Pascal? I was recruited - though it felt like I was shanghaied - on a project to move data across a network. Across each link the identical data in an identical layout was carried in a RECORD named for that link. At the transition to the next link the data had to be copied to another RECORD named for the new link. Now this was clearly a stupid choice on the part of the architects, but most languages had a means to bypass typing in such situations, usually with some form of cast.
Not this version of Pascal. Huge amounts of time were wasted programming the item by item copy at each link.
The problem with Pascal was its popularity for teaching, which meant it became de rigeur for systems programming. "C" on the other hand, a consummate systems language, eventually became one of the most popular teaching languages, to the dismay of many would be coders.
Despite the best efforts of my software tutor at university to teach us Modula-2, I stuck to teaching myself C. That was definitely a good choice... But I did that by teaching myself Pascal whilst still at school. I have fond memories of Turbo Pascal - my first proper graphics programs... And Turbo C.
The impact has been that projects that started off in languages favoured by academia have often run into deep, deep trouble when it comes to scaling up. And the only reason why we have performant mobile devices is because the electronics advanced to the point where it can run copious amounts of inefficient code on small hardware for a decent battery life.
My great hope is that universities start picking up Rust to teach. That's a really good blend of a systems language (you can, and people are, writing operating systems in it) and high level stuff. If we end up with a torrent of budding Rust engineers coming out of university, we won't be too badly off.
But on the whole I fear that there are few academics actually willing to teach stuff that's hard. Interesting may be more fun, and easy means less work, but it's no help to the students who then go out into the world and find their skills are not in short supply or high demand. When we get new graduates in, we have to keep them in a structured training area for a year before they're let loose on real work.
Now that we have students willing to sue universities for the lack of tuition time, poor course content, etc. we might see students themselves force universities to take their role more seriously.
You didn't even need a typecast. Just declare a single type and then two vars referring to that same record type. It looks like some Pascal written by C programmers unaware of the Pascal typing system. The "Type" section of a Pascal program is not irrelevant. In Pascal you can alias even native data types to different types, so a wrong variable won't be accepted by the compiler just because it has the same size.
Anyway, any decent Pascal implementation supported typecasting; in Borland Pascal it's as easy as Type(Variable).
Also everybody hails Rust today - and it is a strongly typed language - it avoids many of the programming errors that made C programs full of vulnerability holes.
Agreed, it's staggering how, because there didn't seem to be the equivalent of StackExchange "solutions" to a problem, something as simple as type casting in Pascal was deemed impossible (OK, some of the early academic versions never touched anything low level, but still). I did plenty of binary input processing in Pascal; it took a little effort, but in the end it was a very stable way of processing the data, and depending on the situation, sometimes unions, sometimes just a few different defined classes were all that was required. With the various different classes, all it took was a cast to the appropriate type and the data came out natively.
Most versions of Pascal could do type casts just as well as C could. The problem was that this was one of the features in Pascal that wasn't standardised. If you had a problem then it was with the particular version of Pascal you were using.
The same is true of goto. Common versions of Pascal supported goto a label, but again the implementation was not standardised, and in some cases may need a compiler directive to enable it.
I've done binary communications programming in Pascal, and in the instances I am familiar with the basic algorithms can be translated from C code pretty much as is, if you know Pascal well enough.
The main problem with Pascal was there was a base language which was standardised and then a series of extensions to support lots of things involving OS and hardware interfacing. Professional grade compilers then were proprietary, and compiler vendors were more interested in promoting vendor lock-in than portability.
The ANSI/ISO standard was quite useless, as nobody really cared for it.
One of the problems was that Wirth went on designing new languages instead of evolving one. Without someone leading Pascal, evolution was left to the companies writing compilers and selling them - and when the main ones were Apple, Borland and Microsoft, you can see how easy it would have been to put them in a room and come out with a standard. The same happened with BASIC.
With DOS and Windows coming without an OS compiler, there was much more competition to sell them compared to Unix. Being able for example to call into DOS without having to buy an external expensive assembler could bring more customers.
Microsoft applied its own extension to C, C++ and Java as well, before having to capitulate to the standard for the first two, and being sued by Sun for the latter.
Meh. Pascal appealed because it was strongly typed.
Rumor is one company used Pascal heavily in a large ECAD software suite for electronics design then they bolted a Motif UI to it.
Funny how all but their flagship tool at this particular company seem to lack visible evolution from ancient software methodology from the 80's.
Then there was another even older ECAD system. Adding to its functionality required mastering pointers to variables in C, subroutines written in Fortran. Ok, today, that method seems mundane.
C workshop was my first interactive debugger.
>> "Meh. Pascal appealed because it was strongly typed."
Indeed, others here are criticizing Pascal being strongly typed. They fail to recognize that this one feature is what made Pascal the preferred teaching language in the 70s. I went to a pretty rigorous engineering university, and they employed Pascal in both intro and advanced language classes. If you are instructing people who know nothing about structured programming, then you want to start them thinking in very methodical techniques. Pascal provided that structure then.
I suppose I tend to view Pascal as the Latin of programming languages in that it generates a certain amount of debate vis-a-vis its usefulness IRL (I mean as specified rather than as implemented) vs. its usefulness in teaching appropriate methodologies and habits.
I never got into it nor Modula 2, both of which were taught when I was at college in the '80s, but I did find myself gravitating towards Ada and quite liked it. Unfortunately, that was undermined by requiring a Vax running the enormous MAPSE subsystem, as well as me being the wayward creature that I was, I decided that C looked like something that was much more likely to do my head in and was therefore The Language For Me™. And so it was. It worked out in the end but I wonder how things might've panned out had I (and everyone else) stuck with Ada.
It's manifest nonsense to say that "Pascal was useless", unless you're reading the textbook and forgetting that practical implementations were modified for engineering use.
"Whitesmiths Pascal" was particularly common. It was a bodge, running on VMS and cross-compiling for 68000 targets via a couple of intermediate stages one of which was C. It wouldn't have been my personal choice, and I didn't much "like" Whitesmiths, but there was a lot to like in Pascal, and a lot to learn for people with open minds.
I use it on a fairly regular basis for developing demo code for Ether CAT systems. I'm very much an engineer.
Structured Text is a Pascal based- it's part of the IEC 61131-3 suite of programming languages.
Implementations can be a pain in the Boris, but it works.
Don't forget that Wirth designed Pascal with the object of it being a systems language. It was able to compile itself on CDC 6000 series mainframes in its earliest implementation, and in Object Pascal form it was the system language of Macs prior to OS X.
It was far from useless, and was designed to be fast and efficient to compile on the limited hardware available at the time.
Pascal was useless as shipped. No linkage, no variable-length arguments (despite using them in the language, so a Wirth Pascal compiler couldn't be written in Wirth Pascal), the program needing to be in one source file. All the serious Pascal compilers fixed these shortcomings -- but none of them in the same way, which made Pascal non-portable. Of course standards committees tried to fix this, but usually by inventing yet another mechanism (insert inevitable XKCD).
But Pascal's programming tools were great: good IDEs, good sets of libraries. UCSD Pascal was a good system. Turbo Pascal was superb. This made Turbo Pascal the obvious choice for writing well-performing programs under MS-DOS.
Then two things happened to kill Pascal: UNIX and Windows.
UNIX was a joke. A minicomputer operating system of obscure commands and questionable stability and an odd security model. BSD and then Sun focussed on making UNIX not only a serious operating system, but one at the edge of operating system features. Other minicomputer operating systems didn't come close to what Sun was doing with their workstations. Even the billions spent by IBM and DEC didn't touch dynamic linkage or TCP/IP or NFS. And the language of UNIX was C.
UNIX was so obviously the future that Microsoft acquired a UNIX and used Xenix as the development platform for their MS-DOS products. So it was natural for Microsoft to seek to replace their expensive workstations and servers with PC-based C compilers and linkers, and then natural to sell that C compiler and linker, and then natural to support that C product better than the other languages it offered. In time it was natural for Microsoft to write OS/2 and Windows in C.
What also cemented C over Pascal was PJ Plauger's work on the ANSI C committee. Unlike the equivalents for Pascal, the ANSI C committee did a great job. It codified existing practice, it brought innovation where it was sorely needed and thus readily accepted (eg: function prototypes, the "void" type), and it wasn't afraid to adopt a particular vendor's solution whatever its weaknesses (eg: BSD strings.h).
Now we had a language which you could write a program in and it would run on MS-DOS and UNIX: if you were developing programming infrastructure then using C doubled your market; and if you were writing programs then using C meant you had more programming infrastructure available. Many of the GNU tools became available for MS-DOS and people developed a hankering for the real thing: UNIX on their PC at a price they could afford. Moreover the action of AT&T and Sun in seeking to massively charge for their compilers meant that UNIX systems in practice all used the same compiler for applications programming: the free GCC. So not only was there a common language for UNIX at the 10,000ft level, there was a common language at the 1in level, plus autotools papering over the differences in system libraries. Pascal simply wasn't that ubiquitous.
With Windows, C and Win32 became the common choice for applications and Pascal's largest group of users quickly left.
Later the web browser killed Win32. But since C was the language of choice for both UNIX and Windows servers, that only made C more dominant. Then the world tilted and interpreted languages roared back into fashion on the server-side: Perl, PHP and Python. C became a niche language for systems programming and performance-critical paths (and part of the appeal of Python was its ease for putting C in performance-critical paths -- usually by importing a third-party library).
Given a free choice of development systems for a new project my team of very experienced developers chose Delphi. Their reasons were better IDE, easier to code, more reliable and faster execution than MS C++.
Excuse me, I'm just off to write a ROT13 filter in TP 3.02, my first programming language.
So I'm officially old. The university I attended in the late 70's had a Pascal MicroEngine, which always intrigued me. Pascal has had an enduring influence on me, although its I/O subsystem sucked. Of that type of language I always liked Modula-2, a sort of grown-up Pascal from before N. Wirth got obsessed with OO and Oberon.
I used UCSD Pascal, Turbo Pascal, and Modula-2. Of all the compilers and IDEs for any language from the MS-DOS era, the one that impressed me the most was TopSpeed Modula-2. It was way ahead of what Microsoft or Borland were offering. They later offered Pascal and C compilers as well. TopSpeed was founded by the original Danish creators of Turbo Pascal when they had a business falling-out with their US-based marketing partner (Philippe Kahn); they split from the company, taking their next-gen IDE and compiler with them, while Kahn (Borland) kept the Turbo product line.
One of the interesting things about the Modula-2 language was how it addressed the same issues as Object Oriented Programming (then a new concept) was trying to address, but in a different way. Modula-2 used Opaque Types and modules instead of objects.
Overall I think that objects were the better overall solution, but it's interesting to consider what might have been had opaque types caught on rather than objects.
UCSD Pascal is another interesting technology, in that it used an architecture independent VM long before Java was even a gleam in its creator's eye, and as you said Western Digital even built a CPU specifically to run it. There were also Basic, Fortran, and C compilers which produced code which ran on the same VM.
It also allowed for native code compilation under programmer control through the use of compiler directives ($N+ and $N-, if I recall). The VM had virtual memory management built in, which wasn't a feature that the MS-DOS OS (where I mainly used it) itself could give you.
What killed UCSD Pascal was it was licensed to a company who had exclusive rights to all newer versions and they ran it into the ground rather than continuing to develop it to keep up with progress in hardware.
This is also another "what might have been" event. If it had been open sourced rather than licenced to a single company, it might have evolved from what was fundamentally an 8 bit architecture (with split code and data pools for 16 bit machines) into the 16 and 32 bit era and still be with us today supporting still more languages.
I still have TopSpeed Modula 2 and occasionally crank it up. Having learned Pascal on a Dec Vax and a Sirius, I bought my own copy of TopSpeed Modula 2 for my Amstrad PC1512 and never had a moment's regret.
I've got decades of experience with C#, Python, Delphi, PHP, Node, Ruby, and Go (obviously not all of them over those decades), and when I think back to what JPI achieved with TopSpeed on 8086 boxes with floppy drives it's stunning by comparison. The language is a dream to work with too.
I also don't think the windowing text mode IDE they produced has been beaten for productivity even now.
I've used Lazarus and Free Pascal and they're great, but the TopSpeed IDE building Modula 2 to modern boxes with modern libraries would be a thing of beauty (I've tried Excelsior and am not yet convinced).
In the mid-1980s I had a job where I was simultaneously using Pascal, Modula-2, assembler and some C, all on VAX/VMS. Some others in the team were using Modula-2 quite heavily on embedded networking devices, and my main use of it on the VAX side was tooling support for them.
Two things about the language I hated: required upper case and the pervasive use of unsigned integers, which Modula-2 called CARDINAL. Unsigned counters are error-prone when counting towards zero: if you write < 0 in your loop test, it will never terminate, so you get a hang. The tools written in it that showed up for me to port were also surprisingly nonportable, with the packed/unpacked mess hanging on from Pascal.
I don't remember whether it was a requirement of the Modula-2 language, but I seem to remember screens of import statements, one identifier at a time, the equivalent of C++'s "using MyNamespace::myIdent" rather than the get-it-done "using namespace MyNamespace;". Meanwhile, VAX Pascal did allow importing everything in the module.
(There were 2 VAX Pascal compilers, the original one and the new, incompatible one. It was the new one I used. Modula-2 would have been an improvement over the old one.)
With Modula-2 import statements went along the following lines:
FROM InOut IMPORT ReadCard, WriteCard, WriteString, WriteLn;
You imported everything you wanted to use explicitly. If you find that you have "too many" imports, then there is a problem with how your code is organised. Modula-2 programs are supposed to be broken up into modules, that's the whole point of the language. The greatly improved module system was one of its big advances over Pascal. The explicit imports were also an improvement over C, as it greatly reduced the chances of naming collisions. You could look at the import list and see exactly where each imported reference came from.
I'll put it another way, think of Modula-2 modules as being like objects in newer languages. Wirth was trying to solve the same problems as object oriented programming, but in a different way, and it was not obvious at that time that objects were the future of language design. Modules were not supposed to be huge with loads of imports any more than modern objects are supposed to be huge. If they were/are, your program design was/is wrong.
Comparing Modula-2 to C++ is pointless, as it came well before C++ and experience had been gained in the mean time.
Cardinal numbers (unsigned integers) were also greatly welcomed at the time, as most people programming with Modula-2 were doing so on 16 bit computers and in many cases were struggling to get enough positive range out of a 16 bit integer to do what they wanted.
If your loop termination condition contained an error, then a bug is a bug and it's the programmer's error. I personally would much rather have a bug that failed in a way that was impossible to miss than to have a subtle off by one error that produced slightly wrong results.
The actual thing that most experienced Modula-2 programmers complained about can be seen in the import example that I used above, which was the I/O routines. They didn't have the degree of flexibility that Pascal's I/O had, but instead were very simple functions that did one thing with one type. As a result you needed more functions to do mixed I/O than the equivalent Pascal program. Of course those I/O functions were really fast because they did just one thing and one thing only, but drawing forms on screens or printers took more work than I would have liked.
"Unsigned counters are error-prone when counting towards zero: if you write < 0 in your loop test, it will never terminate, so you get a hang."
That's not a problem with unsigned variables, that's a (pretty fundamental) problem with the programmer. Something that cannot be negative will never pass a test checking for negativity.
This post has been deleted by its author
I've had developers who just disabled all the warnings and hints. This was reflected in the final code performance and reliability.
On the other hand, I've had sections where due to bugs in the compiler it generated false warnings. Annoying to have to try and restructure an entire code block to work around a buggy warning, but at least turning warnings off and back on again was possible.
Indeed, it shouldn't have compiled, which may lead the programmer to assume the counter must in fact be signed. This was someone else's program I was porting (from a 60-bit word-aligned machine!) and the declaration was probably hundreds of lines back, as tended to happen with Pascal languages since declarations could only appear at the start of procedures or globally. After that head-banging moment, I didn't fall for it again.
But compile-time checks are IMO not enough here, because of underflow wrapping. Consider a loop with complex logic where sometimes you decrement the counter by 1 and sometimes by 2, for valid reasons. And suppose you get it wrong. With a signed counter at -1, your loop will still terminate even if it runs once too often. With an unsigned one, it wraps to a large positive number, will hang, and perhaps scribble all over memory. I know which I'd rather debug. You can argue the mistake shouldn't have been made anyway, but then why are you using a "safe" language?
Question: in Pascal, is [0..65535] an unsigned 16 bit or a signed 32 bit number? I guess that was the problem Wirth was trying to solve with CARDINAL.
I always felt that it would have been much more sensible to clear up the naming conventions to something like INT8 and UINT8, INT16 and UINT16 and so on rather than having to remember arbitrary, inconsistent namings for the pair of them and to reserve the likes of INT or UINT for when one really didn't care about the size but wanted the compiler to pick the most efficient for the target system (which could be a hazardous assumption of course).
The thing that really annoyed me was Wirth's indescribable desire to move to a single pass compilation process. This caused no end of dead chicken waving to get otherwise simple things working.
Pascal was the primary high level teaching language at Newcastle Uni when I was there in the early-mid 80's. IIRC it was the UBC* flavour which had a couple of oddities (such as the character map not having A follow 9), running on an IBM S/370. I just missed punch card input by one year as they implemented a Unix based data entry system, it was still a minimum 15 minute wait from submission to results though which taught us to get it right first time.
I later acquired the two Pascal ROMs for the Beeb B so continued with the language after entering the world of work.
*University of British Columbia
a couple of oddities (such as the character map not having A follow 9)
If you mean immediately following, 'A' doesn't follow '9' in any character set I've ever worked with. Quite possibly what you're misremembering is that A-Z (and a-z) were not 26 contiguous characters in the IBM EBCDIC character set.
When I was at Newcastle in first year they taught us in Algol W. In second year they decided that Pascal was now the tits and we all had to use that instead. Everybody just kept writing their programs in Algol and fixed the syntax errors.
That was possible in the second year as we had data entry terminals, unlike in the first year when we had fucking awful punch card data entry :-|
Another implementation that was used in universities was Berkeley Pascal, which came with BSD Unix. As a student, I wrote some lengthy pieces of Pascal code in it for analysing Petri nets. (Now I hardly remember what those things were). This implementation produced native code for the VAX, and had enough nonstandard additions to make it just about practical for programming.
Compared to the alternatives available (K&R C and Fortran), it was the best choice for this task, in the sense that the resulting program was more likely to be correct.
It was a very weak Pascal, though, lacking things like modules and type coercion that were in the DEC Pascal that ran on the same hardware with their proprietary VMS operating system. It puzzled me enormously why computer science departments were so enamoured of Unix when the programming languages were so very far behind those available on other operating systems.
(BSD Pascal was definitely a good choice for your project, against the ones you mentioned. Young people might not know that C compilers of the era often issued no type errors at all, especially in function arguments.)
Computer science departments were enamoured of Unix because it was free. They very quickly worked out that they could save a lot of money by just adopting the free thing and teaching about that.
You can argue about the various merits/disadvantages of OS's all you like, but IMO it's this simple fact that enabled Unix (and subsequent flavours) to take over the world.
Plus of course the fact that its syntax and commands were nice and obscure, which helped all of us in the Society to Keep the Amateurs Out Of Computing.
I think the free (as in no cost) was less important than that there was source (*) and documentation about Unix. You could run assignments like add feature X to unix. Try doing that with VAX/VMS.
(*) Strictly speaking, Unix was not "open source" as we now know it, but its source was easily available to academic institutions. Eventually Berkeley reimplemented bits and pieces and released the result under a liberal open source license, leading to a lawsuit by AT&T, which was settled in 1994. Long-standing uncertainty about the legal status of BSD is one reason Linux is the most used free OS, and not FreeBSD or NetBSD...
Oh yes, I endured an early 1980's engineering course and had to use a Research Machines 380Z to do my coursework - could I figure out where to put the semicolons? As a self taught BASIC 'programmer' I really didn't have a scooby at the time...
I spent a lot of my career twiddling pins on embedded microcontrollers using assembly language, but eventually I had to grow up and use C.
By then I could afford a Nissan 380Z which was a much more pleasurable experience...
I saw FORTRAN programmers struggle to translate their code into new applications; they always succeeded, but it was an effort and wasted a lot of paper. Pascal programmers just switched from one system to another in a minute or two. I used to write FORTRAN programs in Pascal - it's so easy to translate any language into Pascal. The nice thing back in the early days was that you could write a program in Pascal and run it on a DEC PDP-11, a VAX, or MS-DOS on the desktop in the days before Windows existed.
These days I have to write in modern languages but I still think and plan the code in Pascal in my head.
I'd missed Pascal at university in the mid 90s. So it was Modula-2 (aka Son of Pascal) for me as the base language for the course. I recall on the very first day the head lecturer said "...we could teach you something commercially viable like Ada, but we're not really into that...". The next year the base language was indeed Ada.
One of my first professional jobs, at the end of the 70s, was working on a project to build an easily programmable multiprocessor system using Intel 8086s and the Concurrent Pascal system developed by the Danish comp sci Prof, Per Brinch Hansen.
Concurrent Pascal was a high-level language for systems programming, and included concurrent processes, classes, and monitors for controlling access to shared resources. A small machine-dependent kernel provided things like process scheduling, interrupt handling, and basic I/O. There was also Sequential Pascal, a flavour of the standard language, for writing applications.
The compilers for these languages produced p-code, reverse Polish instructions for a stack machine, and these were interpreted at run-time by a machine-dependent interpreter which was another part of the kernel. One of the things we did was change the back-end of the compiler so that it produced directly-executable 8086 machine code. Execution speed was not a problem, but memory was expensive, so we added optimisation for code size, and ended up producing more compact code than Intel's own Pascal compiler.
The multi-processor system worked, but we never got as far as making it resilient. Each processor was in its own cage, with its own memory, power supply, and link to the comms bus. Hence one demonstration when someone asked "How do I know that it's a multi-processor system?". I said "You can turn any one of these processors off." "And?". "And the whole thing will stop working!". :)
In the first AP Computer Science class offered in my area, we studied Pascal, data structures, and fundamentals. We compiled on a single South West Tech computer, we didn't even have Turbo Pascal yet. I remember pointers and addressing being a huge mystery, it took weeks to really understand what was going on. Eventually we worked on linked lists, double linked lists, binary trees, etc. Before relational databases were a thing, even years after in college, it was all about storing and maintaining data records. A far cry from today. Kids today are probably learning C++ a few years after they are born.
When Apple Pascal, based on UCSD II.1, came out for the Apple II I thought it was fantastic. Supported 80 cols out of the box and came with a great book that taught you structured programming. One of the best self teach books around I reckon. As I had grown up with BASIC and assemblers, getting my head around using procedures, structures and strong typing was made much easier with this book.
Many years ago (mid 80's) I was tasked with converting an accounts system from Apple II Pascal to COBOL on an IBM PC - it even had 2 floppy drives so I didn't have to swap disks to compile. Luckily it was work experience whilst at college, so I only had to do it for a month or so.
Later I used ICL system 25 assembler, C on Unix and then for some reason Delphi.... the swings and the roundabouts.
Whilst not free, RemObjects Oxygene is an ultramodern implementation of Pascal with language features that other modern languages only aspire to. You can use Visual Studio or RemObjects' own Mac and Windows native IDEs (Fire and Water, respectively). Those native IDEs are themselves implemented in RemObjects languages, as is their build system (EBuild).
Languages, plural, because as well as Oxygene they have implementations of C#, Java, Swift, Go and (coming soon) BASIC (VB.NET compatible). Why use these implementations? Not least because - as their IDEs testify - you can use any/all of these languages to develop for any/all of the platforms that the Elements compiler stack supports. That's (deep breath...) .NET, .NET Core, Win32, Win64, Linux (x64), JVM, Android, iOS, macOS, iPadOS, watchOS, tvOS, WebAssembly.
You get the idea.
It isn't used very widely anymore, but it's worth mentioning that there _are_ current successful commercial systems that are written in Pascal, such as FL Studio, a widely used music production suite, with over 10 million downloads each year of its trial version.
That was my introduction to computer programming. I even remember the textbook: Introduction to Programming Principles Using Turbo Pascal by Richard W. Foley.
Perl and VBA quickly followed and departed leaving me with a few books on Agile.
I had to learn MODULA-2 (MacMeth) at Uni. I still remember the horrible frustration of having to swap discs in and out of the Mac Classic to compile code and the pedantry of the syntax meant it took ages to get code right. Anyhow it gave me a life-long hatred of Pascal-like languages.
Nah it was just a very ornate and verbose language. The Macs we used had one floppy drive, the first had Finder and Macmeth on it, the second was where we had to save our work (since there wasn't space on the first disk). A compile meant juggling disks and so I learned to hate the verbosity of the syntax (and overuse of the shift key) since every keystroke was another opportunity for failure.
Some languages do have horrible compiler errors - C++ in particular is garbage, but I don't see that being due to the language syntax per se.
The first 15 years of my programming career were based on Borland's various Pascal versions. From CP/M all the way to D3. But then I moved away to C++ and Borland Builder (thankfully the VCL was accessible so that made life easier). For the last 15 years it's been C# and with luck that'll be the end of it. Another seven years at the most and I'm off to play golf full time :)
Thank you Niklaus and Anders :)
My first programming job, in 87, was to write a custom word processor using Borland's Turbo Pascal. They had a package that was basically the source code for a word processor, and I customised it to work like Wordstar, with extra features to add standard paragraphs for engineering schedules. I can't remember the name of the package - if someone can recall, I'd be grateful!
I remember bits of it well. All DOS-based. The hard disk of the PC XT I worked on was 10MB.
Started using Mac Pascal (interpreted, no less) on a Mac 128 way back when, then graduated to Lightspeed Pascal on an SE and Mac II, before finishing off with Think Pascal v4 (post-Symantec acquisition). The last "4.5" release lives on in emulation on my iMac...
And my copies of "Think Pascal: User Manual" and "Think Pascal: Object-Oriented Programming Manual" sit on my bookshelf next to my copy of "Algorithms + Data Structures = Programs".
Pascal was a great language for teaching and exploring data structures, and with object libraries was powerful (and easy) enough for commercial-grade releases.
A very smart, probably autistic but whatever, electronics engineer was tasked with teaching me programming when I was a fresh apprentice. I guess his boss saw some possibility of talent in me, but he didn't.
The very first thing he said to me was, "Write out the Fibonacci sequence using Pascal."
My honest, gormless reply got a rare smile from him, and then a groan that later became common
"I know who Blaise Pascal was but I've never heard of Fibonacci."
Actually, he may not have been autistic. Much more likely I am just stupid enough to ascribe better intelligence / education to some weird brain condition. But I think I got a point for knowing who Blaise Pascal was. History of Mathematics, worth reading to the end.
In 1985/6/7 I made a good living writing accounting and data management applications in Pascal on Burroughs B20 BTOS (aka Convergent CTOS).
Of course, Wirth's original Pascal is useless, but with the right extensions, it is nice to use. The B20 version was modular and did things like variable-length strings. I enjoyed the BTOS/Pascal environment much more than any other '80s programming environment. When I moved on to Unix/Emacs/C, I was appalled on many levels! When I did some work in Delphi a decade later, I felt right at home!
Burroughs Pascal was the main language for BTOS applications, but they also supported COBOL. The COBOL guy would come in from time-to-time to debug something or other; he was always easy to spot because he had a jacket and tie (whereas we were all in T-shirts). He did not talk much. One day, I expressed an interest in COBOL. He tried to explain PIC clauses to me, and I almost lost the will to live!
..UCSD P-System to be precise. Back in 1988, I started off writing bespoke invoicing systems for an accounting system that was written in the same language. The software was originally running on Apple II and I did the IBM PC port (which took me just under a day). It was literally just changing one library of non-portable functions. Whilst at the same company I also worked on a 3D carpet design system that was written in Turbo Pascal 3.
I use Lazarus Pascal for all my science coding, in gui, graphics and text mode. Drag and drop a breeze. Re-compile exact same code on Windows and Linux (gui included) in a flash. Easy to distribute binary executables. Watched colleagues trying to achieve same result in the snaky language and nearly fall about in mirth.
FPC without Lazarus has a text-mode IDE that apes Turbo Pascal's heavily, and it runs on wintendo, larnicks, BSD, etc, so you're not chasing relic hardware. Also, because it's FPC, you get the modern object-oriented experience if you wish with Delphi, or the traditional Object Vision look and fondle.
This interview with Wirth from sometime after 2000 popped up in my Youtube recommendations:
He talks about the origins and history of Pascal, then about Modula-2 and the Lilith workstation, Oberon, and finally his retirement project.
I started my Electronics Engineering degree there in 91, and IIRC we were the last year group to be taught Pascal as our introductory language before they switched over to C. By then the CompSci department who ran the introductory programming courses on behalf of the Engineering faculty had equipped its labs with a load of Acorn UNIX boxes, though that's about all I remember about the development environment - due to the high demand for these workstations, I ended up doing most of my coursework on my Amiga at home using the somewhat flakey (but free) PCQ compiler, so tended not to use the workstations much beyond the first few weeks in the first year.
Having then ended up being allocated a final year project that was entirely software based (in a foreshadowing of the way my engineering career would eventually head over the decades), I then started using Turbo Pascal (v5 I think) which had then been installed on the engineering lab PCs, and bought a copy of HiSoft Pascal so I could continue working at home through the night (another bit of foreshadowing - most of my best code is written between lunchtime and whatever time I eventually decide I *need* to get some sleep).
After graduation I was lured back to the department to do some postgrad research work, during which time I continued using Turbo Pascal for bashing out command line test tools whilst also teaching myself the basics of embedded C and PCB layout (the two skills that were probably most important as far as securing me a decent first job out of everything I learned in the 7 years I eventually spent at uni). Whilst in that first job I started using Delphi for doing in-house development tools, then transitioned to Lazarus/FreePascal at my third employer, and still use it today for some stuff if I don't feel like doing battle with C#'s thread-safe design philosophy, or if I'm doing something where the godawful C# serial port component would cause problems.
So whilst Pascal wasn't the first programming language I learned, it's been the one I've used the longest in one variant or another - getting on for 29 years so far. Not bad for something that quite a few people dismiss as being not much good for anything except teaching...
I was taught Pascal at uni (Southampton) then found myself at my first job programming in Pascal for the next 15 or so years for the Quantel Paintbox family.
It was embedded, so we made our own hardware, so cross-platform wasn't an issue. We used the Oregon Pascal-2 compiler, running on DEC Vax and outputting for 68000 family. That compiler had the best of both worlds - Pascal levels of type checking, but a selection of features useful for real systems (eg separate compilation, the ability to turn off bits of checking as required, and the ability to use pointers to fixed addresses to allow us to write to our hardware's registers).
We wrote our own real-time OS for it. We had the spec for how to get Pascal to call assembler or vice versa.
Down-sides? Lack of IDE or debugger - development done with text editor (initially DEC EDT, later VAX TPU with our own extensions), debugging done by writes to console (or with a logic analyzer if you're dealing direct with hardware). Sometime in the mid-90s we started being able to edit using PCs and send to the Vax to batch compile.
At the time (early 80s) when this direction was chosen, I think C was too new and didn't give you the help Pascal type-checking did. Don't let anyone tell you the compiler should do whatever the programmer asks: quite often the programmer is wrong on some details, so a bit of help from the compiler is very helpful. And when you found you did need to defeat some checking, you could do so.
We had the source for one version of the compiler, Oregon having disappeared without trace, and made the occasional mod to it. Eventually the rest of the world caught up and we moved to C++, though there were attempts at other ways forward, such as trying to compile the Oregon Pascal compiler with Vax Pascal, or gpc, or with trying to combine the concepts of cross-gcc with gpc to get a cross-gpc.