Even then...
... technology was amazing.
Some info here:
http://history.nasa.gov/computers/Ch6-2.html
Reading that makes yer eyes go wonky though.
If you thought Fortran and Cold War-era assembly language programming is pointless and purely for old-timers, guess again. NASA has found an engineer comfortable with the software to keep its old space-race-age systems ticking over. In an interview with Popular Mechanics this month, the manager of NASA's Voyager program …
You'll notice there are in fact three processors involved here. Some came from the Viking programme, but all (AFAIK) are custom processors built out of LS TTL, usually around the 74LS181 ALU, like the PDP-11 and the Xerox PARC Alto.
When one of the processors was not fast enough to do the work they decided to add a DMA mode to all instructions to allow "hidden" data movements without the direct involvement of the CPU.
Not something the average x86 or ARM programmer is used to considering as a design option.
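For anyone who hasn't met the idea, here's a minimal C sketch of the conventional way DMA is used today, on a hypothetical microcontroller (the register names and layout are invented for illustration, not any real part). The Voyager design went a step further and folded transfers into the instructions themselves, but the contrast shows what "hidden" data movement buys you:

```c
#include <stdint.h>

/* Hypothetical memory-mapped DMA controller - register names and
 * layout invented for illustration, not any real part. */
typedef volatile struct {
    uint32_t src;    /* source address              */
    uint32_t dst;    /* destination address         */
    uint32_t count;  /* bytes to move               */
    uint32_t ctrl;   /* bit 0 = start, bit 1 = busy */
} dma_regs_t;

#define DMA ((dma_regs_t *)0x40002000u)

/* CPU-driven copy: the core executes every load and store itself. */
static void copy_by_cpu(uint8_t *dst, const uint8_t *src, uint32_t n)
{
    while (n--)
        *dst++ = *src++;
}

/* DMA-driven copy: the CPU only programs the controller, then is free
 * to do other work while the transfer runs in the background. */
static void copy_by_dma(uint8_t *dst, const uint8_t *src, uint32_t n)
{
    DMA->src   = (uint32_t)(uintptr_t)src;
    DMA->dst   = (uint32_t)(uintptr_t)dst;
    DMA->count = n;
    DMA->ctrl  = 1u;               /* start the transfer */
    /* ... CPU does useful work here ... */
    while (DMA->ctrl & 2u)         /* block only when the result is needed */
        ;
}
```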
I suspect the JPL does have the necessary documents but you may have to rehost the assembler if you're going to have a go at re-programming Voyager as data updates are going to be sloooooow.
"When one of the processors was not fast enough to do the work they decided to add a DMA mode to all instructions to allow "hidden" data movements without the direct involvement of the CPU.
Not something the average x86 or ARM programmer is used to considering as a design option.
"
DMA usage is very common. As I write this I'm taking a 10-minute break from debugging a DMA issue on an ARM micro. The CPU is doing almost no work (CPU loading of about 1%), but the DMA is working at about 70%.
Go look in the Linux kernel - stuffed to the gunnels with DMA. cd linux; grep -ir dma
" but you may have to rehost the assembler if you're going to have a go at re-programming Voyager "
What do you mean by rehosting the assembler? I would expect the assembler is a cross-assembler (i.e. it runs on a normal machine - originally a VAX or such, now a Linux box - but generates code for the target CPU). That's how most embedded systems are developed.
"DMA usage is very common. As I write this I'm taking a 10 minute break from debugging a dma issue on an ARM micro. The CPU is doing almost no work (CPU loading of about 1%), but the DMA is working at about 70%."
You need to read the chapter. Slowly.
The design team added DMA to each individual instruction's implementation when data transfers were not quick enough. Not an option for any modern MPU's IP.
"What do you mean by rehosting the assembler? I would expect the assembler is a cross-assembler (ie. it runs on a normal machine (eg. originally a Vax or such, but now aLinux box), but generates code for the target CPU. THat's how most embedded systems are developed."
True, but these systems date from the '70s. IOW you're looking at '70s assembler written in the '70s version of its implementation language and running on a '70s computer.
It all depends on how up to date NASA's tool hosting has been.
If the toolset was developed in a mainstream language without using too many supplier-unique features, it'll be simple. If they relied on special features of that language or its support libraries, you'd either have to duplicate them or build a new tool chain.
"
If the toolset was developed in a main stream language without using too many supplier unique features it'll be simple. If they relied on special features of that language or its support libraries you'd either have to duplicate them or build a new tool chain.
"
There won't be any "support libraries". It will be from-scratch code, all written by the project programmer(s). Take a look at the ZX Spectrum disassembly (Google it) to see how it was done in those days: a one- or two-pass assembler, *maybe* followed by a linker - though frequently the code was fed to the assembler as effectively one single module, in which case linking would not be required and the assembler output the completed binary.
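For anyone who has never looked inside one, the classic two-pass scheme is small enough to sketch in C. This is a toy (one fixed-size instruction format, invented mnemonics), not any real assembler: pass one only measures and records label addresses; pass two emits output, now that every forward reference can be resolved.

```c
#include <stdio.h>
#include <string.h>

/* Toy source: one mnemonic per line, labels end with ':'.
 * Every instruction is 2 bytes: opcode, operand. Invented for
 * illustration only. */
static const char *src[] = { "start:", "LDA 5", "JMP start" };
enum { NLINES = 3 };

static struct sym { char name[16]; int addr; } syms[32];
static int nsyms;

static int lookup(const char *name)
{
    for (int i = 0; i < nsyms; i++)
        if (strcmp(syms[i].name, name) == 0) return syms[i].addr;
    return -1;
}

int main(void)
{
    int pc = 0;
    /* Pass 1: assign addresses to labels by counting instruction sizes. */
    for (int i = 0; i < NLINES; i++) {
        size_t len = strlen(src[i]);
        if (src[i][len - 1] == ':') {
            strncpy(syms[nsyms].name, src[i], len - 1);
            syms[nsyms].name[len - 1] = '\0';
            syms[nsyms++].addr = pc;
        } else {
            pc += 2;                       /* fixed 2-byte instructions */
        }
    }
    /* Pass 2: emit code; forward label references now resolve. */
    pc = 0;
    for (int i = 0; i < NLINES; i++) {
        char op[8], arg[16];
        if (strchr(src[i], ':')) continue; /* skip label lines */
        sscanf(src[i], "%7s %15s", op, arg);
        int val = lookup(arg);
        if (val < 0) sscanf(arg, "%d", &val); /* literal operand */
        printf("%04x: %s %d\n", (unsigned)pc, op, val); /* addr: mnemonic operand */
        pc += 2;
    }
    return 0;
}
```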
There are a few generic assemblers available. You start by defining the basic architecture and instruction set / mnemonics, then the rules for each instruction and the associated binary (machine-code) output, and you end up with a perfectly usable assembler. I once programmed a generic tool to do Z80 assembler programming, as I did not have access to a Z80 cross-assembler at the time.
A quick Google came up with http://sourceforge.net/projects/sgasm/, which looks like the sort of thing I recall. I have also read of a self-configuring generic assembler: you feed it an existing comprehensive source code and associated binary, and the program figures out the assembler rules (obviously it won't understand instructions or variations that weren't in the source code you fed it).
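And the "describe the instruction set as data" idea those generic assemblers rely on looks roughly like this - a toy table with invented encodings, nothing to do with sgasm's actual input format. Adding an instruction means adding a row, not writing code:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Instruction set described as data rather than code. The mnemonics
 * and opcodes below are invented for illustration. */
enum operand { NONE, IMM8 };
struct insn_desc { const char *mnem; uint8_t opcode; enum operand op; };

static const struct insn_desc table[] = {
    { "NOP", 0x00, NONE },
    { "LDA", 0x10, IMM8 },
    { "ADD", 0x20, IMM8 },
};

/* Encode one instruction into buf; returns bytes emitted, or -1. */
static int encode(const char *mnem, int arg, uint8_t *buf)
{
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++) {
        if (strcmp(table[i].mnem, mnem) != 0) continue;
        buf[0] = table[i].opcode;
        if (table[i].op == IMM8) { buf[1] = (uint8_t)arg; return 2; }
        return 1;
    }
    return -1;   /* unknown mnemonic */
}

int main(void)
{
    uint8_t buf[2];
    int n = encode("LDA", 42, buf);
    for (int i = 0; i < n; i++) printf("%02x ", buf[i]);
    printf("\n");   /* prints: 10 2a */
    return 0;
}
```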
There are two problems for C programmers, and neither has to do with competence. First, the FORTRAN dialect is probably something like FORTRAN IV, which is very different from the modern version of Fortran - lots of nasty differences between the versions. Second, assembly language instruction sets are processor specific, so someone familiar with the assembly language of current processors would be unfamiliar with the quirks of this processor. To add to the problem, apparently the processor is effectively a one-off. Finding documentation for either language would be difficult: you might find a used copy of a FORTRAN IV text, but I suspect the assembly language documentation would be harder to track down. It was probably very good originally, but how much has been lost, misfiled, etc. in 40 years is an open question.
Programming in the mid-'70s was more concerned with absolute memory management and accounting for memory usage than today. The economics of programming has fundamentally changed: then, programmers were relatively cheap compared to the hardware; now most hardware is cheap and thus the programmer becomes relatively expensive. The two schemes require very different approaches to programming.
"You need to know how to use basic concepts such as bitwise operations, BCD number representation, etc, which are basically universal in any assembly coding."
I'd add understanding of the various addressing modes and the elegant, fast data structures that they can be used to build.
(Mine's the one with 'Programming the 6809' in the pocket.)
IBM 360- and 370-series BAL programmers of the 1970s and earlier carried accordion-folded "green cards" that listed all the operations. I had white cards and yellow cards that covered later 370 models like the 370/168, but they were still known as green cards. The summary information all fit on the card, and reference to the big manual that actually described how the instructions worked in detail was only occasionally required. The instruction set of the DEC PDP-11/70 had a distinct flavor (memory addressing and subroutine calling conventions, octal vs. hex, ASCII vs. EBCDIC), and the programming conventions were different, but the basic concepts were the same. The IBM Series/1 minicomputer instruction set, for which I coded assembly for several years, was relatively byzantine; the equivalent "green card" was actually a booklet, and the full processor manual was a little more useful. I only dabbled in assembler on the MOS 6502, Intel 8088 and the like, but can say confidently that the knowledge is universal and relevant even when working in much higher layers such as, say, Scala on a JVM - just less often applicable.
But all this knowledge could be circumscribed well and was limited in scope. There is much more to know in today's environment, and I believe the work is even more challenging to do well. We have tools to protect us from the old classic errors, but as creative humans we will continue to find new ways to screw up. I believe assembly language experience is worthwhile for any coder.
"I wrote a an introductory tutorial to X86-64 assembler, specifically aimed at those who had a bit of experience with the Z80 from the 8-bit home computers of 30 years ago."
Since both the x86 and Z80 are essentially derived from the 8080, the concepts would be quite similar. This task may well be more like someone with experience of the wealth of instructions on the x86 being restricted to a PDP-8, or having to learn fluent assembly code for a PIC with the most obscure set of on-chip peripherals and registers.
Re: WTF is a "nibble-serial CPU"??
As I recall, a nibble is four bits (half a byte), and given that all I have read suggests the processor is probably a 'bit-slice' design, i.e. entirely custom and created from discrete logic ICs, it's not inconceivable that the data is shuttled between memory and accumulator in a serial fashion as opposed to over a parallel bus. No doubt error checking and reliability were major drivers behind the design.
It's a long, long time since I was involved in that stuff, but it's not a complete punt - more an educated guess.
Yeah, the 74181 is a 4-bit ALU on a chip, and NASA mentions using TTL 4-bit parallel logic in the chapter linked in the first comment, processing 18-bit words in 5 cycles, as a significant advance over bit-serial ALUs. Fewer wires, fewer packages, less power, but less speed than a full parallel ALU.
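To make that concrete, here's the arithmetic in plain C standing in for the TTL: a single 4-bit ALU pass per cycle, carry chained from one cycle to the next, five cycles to cover an 18-bit word. A toy model of the technique, not NASA's actual datapath:

```c
#include <stdio.h>
#include <stdint.h>

/* One 18-bit addition pushed through a single 4-bit ALU, nibble by
 * nibble, the way a 74181-based serial datapath would do it: five
 * cycles of 4 bits cover 18 bits (the top slice only uses 2 of them). */
static uint32_t add18_nibble_serial(uint32_t a, uint32_t b)
{
    uint32_t result = 0, carry = 0;
    for (int cycle = 0; cycle < 5; cycle++) {
        uint32_t na  = (a >> (4 * cycle)) & 0xF;  /* next nibble of a */
        uint32_t nb  = (b >> (4 * cycle)) & 0xF;  /* next nibble of b */
        uint32_t sum = na + nb + carry;           /* 4-bit ALU + carry in */
        result |= (sum & 0xF) << (4 * cycle);
        carry   = sum >> 4;                       /* carry out to next cycle */
    }
    return result & 0x3FFFF;                      /* keep 18 bits */
}

int main(void)
{
    printf("%u\n", add18_nibble_serial(123456u, 987u)); /* prints 124443 */
    return 0;
}
```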
FWIW, DEC sold a bit-serial ALU version of their PDP-8 at a fifth of the price of the full 12-bit unit, so this was a strategy pursued even outside the limits imposed by space engineering.
In this chapter about Galileo [http://history.nasa.gov/computers/Ch6-3.html] there is mention of 2901-series 4-bit-slice ALUs being used in parallel to make a full 16-bit ALU, a PDP-11/23-equivalent machine with a fully customizable instruction set. Then NASA found that the processors were not sufficiently radiation hardened to survive the conditions found around Jupiter, and had to pay Sandia $5M to fabricate special versions of the chips that could survive being blatted by high-energy particles. If the spacecraft had not been delayed they would not have discovered the radiation problems before launch...
"
They could learn, of course, but then so could a competent C programmer.
"
I doubt most C programmers (at least those used to programming PC applications running under a sophisticated OS) could learn assembler quickly - it's a significantly different mindset, and for the older CPUs you have to have a good handle on the hardware operation as well. Most assembler programmers are, however, capable of switching to a different CPU instruction set and becoming competent in programming in that language reasonably quickly.
Maybe the C programmers who program embedded devices that do not have a formal OS or a shedload of libraries could transfer to assembler more easily.
If it's the HP3000, that great machine hit EOL in 2006 ([NO]Thanks Carly, Winston, and Wim)...
The 2100 (2000) was gone a lot earlier than that; the 1000 probably still has MIL contracts, but it's by no means a mainframe, as it's a realtime box.
Spent many a year developing, managing, debugging, and peering at the h/w and s/w innards.
MPE forever, we say. Too bad HP didn't listen...
HP 2100 not a mainframe, 'twas a desktop mini.
2nd year elec eng, 1974/5, programming it was part of the optional computing course.
We had to write the assembler, then hand-assemble it into the machine code, then enter it in with the front panel pushbuttons.
I thus gained an intuitive understanding of how instructions are decoded, logic flows through the ALU, and the way an ISR works.
Can I have the job please ?? I still don't "get" object-orientation :-)
2100 was more of a controller, predecessor of the 3000. I did OS/language/DB/utilities/internals development and support at HP. The HP-IB and PA-RISC versions of the 3K were worlds apart from the "Classic" 3Ks, which were similar (somewhat) to the 2100. Bob Green has some good articles about the origin of the 3K.
Well, there's object-oriented COBOL now, so oo-RPG can't be far behind :)
Gosh, even this icon isn't old enough :)
Well, there's object-oriented COBOL now
"Now"? Since 1993. OO COBOL is old enough to vote and td
so oo-RPG can't be far behind
Maybe, though the only real change to the language since RPG IV in 2001 seems to be 2010's Open Access for RPG, which is really an I/O plug-in mechanism.
I still don't "get" object-orientation :-)
+1
The last time I tried some OO code, I found myself staring at the disassembly wondering why anybody would actually want to use such things. It seems to me that the further you get from native assembler, the slower and clumsier the software becomes.
MOV PC, LR (or RTS if you are old school) (^_^)
The last time I tried some OO code, I found myself staring at the disassembly wondering why anybody would actually want to use such things.
Funny. The last time I wrote some assembler, I found myself wondering why everyone didn't just write an instruction stream in binary. Lazy bastards.
National Aeronautics and Space ADMINISTRATION.
A bunch of ADMINISTRATORS.
The head of NASA is personally selected by the prez, so it's largely a political position, and that sets the tone. The decision making is much like most political decision making: just kick the problem down the road and hope the ramifications are not experienced on your watch.
Now perhaps you can understand why people just ignore O-ring erosion and cross their fingers, or ignore the fact that an old coder is going to retire. It explains why there was no plan to replace the Space Shuttle (or fix it in the first place).
What's important is picking which tie to wear for the next trip to the White House.
You are correct about administrators, but wrong on the replacement for the STS.
Lockheed was supposed to deliver the VentureStar, but bit off more than they could chew with the radical engine design. The project failed.
McDonnell Douglas also had the Delta Clipper, which was actually in ongoing low-level test flights, but lost the contract to... yeah, the VentureStar - which never even got as far as a static engineering model, all the money having been spent blowing up or melting engines.
https://en.wikipedia.org/wiki/VentureStar
https://en.wikipedia.org/wiki/McDonnell_Douglas_DC-X
(be sure to google flight video. Have a hanky ready because it's a crying shame how the American people got fucked over on this)
All this was over 20 years ago.
"
Wow, that's forward thinking of NASA. It's not like the guy got hit by a bus or something. They have had years (decades) to look for a replacement.
But no, let's wait until he's retired then start looking.
"
If the code only needs updating every 5 years or so, you'd not want to hire someone to sit doing nothing until the next update.
Was it maybe because you said "I want an unclear job"? Kidding.
Back in the days when Voyager launched, I could code in assembler for a bunch of processors, and I've even added a few over the years - but a one-off processor? They may as well just train someone; what is the chance of finding someone who has those skills already and isn't the same age as the guy doing it now?
Now FORTRAN, on the other hand - Fortran IV was a very simple language to learn. In fact I learned it in high school over a weekend using the famous "Fortran Coloring Book", when the choice was doing that or not getting hangout access to the computer lab. The teacher hated that I was correcting HIS work a couple of weeks later, but he let me stay.
What I don't understand is what a space probe is doing with a Fortran runtime on board. You would think that it's all static binaries, at which point any language with a compiler that can target the processor would work fine - spend some time getting Clang/LLVM or GCC working for that processor.
I'm going to have to see if I can learn more about the environment - it seems really odd.
I would imagine the space probe doesn't have a Fortran runtime on board, that particular requirement being for the command and control and received data processing back here. The assembler would be for the probe itself and I would think anyone suitable for the job would already have worked with several different assembly languages and would take adding another in their stride.
I still have my Zaks, 'Programming the Z80'. Just need to brush up a bit. I still remember reading through the instruction clock-cycle listings to get my Spectrum code to run faster (a WOW moment for me when I coded a version of Conway's Game of Life - the first time ever I saw it work in real time, as opposed to the graph-paper versions).
Good man :)
I was told similar bollocks at skool. Not clever enough to be an engineer, why not go into the fire service? Nothing wrong with the fire service, but not what I wanted to do. Mr Grigg, you, Sir if you are still alive, are an arsehole.
I cut my teeth on the 6510, then spent many years writing code for the old 4-bit Hitachi H400 series of MCUs and for PICs. At 48 I'm working for a semiconductor manufacturer. Is it just me, or is the whole semiconductor industry comprised of people in their 40s? Where is the new blood?
COBOL, gah! That is a woman's language, M'lord.
The CPU on these would be much closer to the DEC PDP-10/11 CPUs, with high-level functions and structure available in assembler, than to the 6502, which was almost a forerunner of RISC. Why do you think Acorn made the ARM? Simply because there was nothing comparable to the 6502 available, as everything else at the time was going the CISC route.
The FORTRANs up to 66 are modest variations on a theme. FORTRAN III never saw the light of day. From FORTRAN 77 onwards the pace of change accelerated - Fortran 2015 is going to be largely unrecognisable to a programmer of the 1960s.
There was, of course, a time when the question was not "Which?" but WATFOR?
Why didn't NASA port to an OSS-maintained cross-compiler toolchain decades ago - say GCC - and have the toolchain automatically generate any extra machine code by life-cycle convention, like Maven does for Java artifacts? The test rig should only be needed for final testing; all prior testing could be orders of magnitude faster in an emulator. So WTF have they been playing at?
I wrote assembler for single digit MHz machines as a child, and the reason it was slow and hard was the more primitive software development technology, not just the technology limits of the hardware.
Any employer who isn't proactive about, or doesn't allow, cost-effective reduction of developer drudgery (e.g. via build automation, or appropriately better hardware/software/process) deserves to lose all their developers and be rejected by potential recruits.
Sorry, software developers were never factory worker 'cogs'. We are well past the Industrial Age, and have been transitioning from the Information Age to the Design Age for well over a decade now!
Compilers can never produce code as efficient as hand-optimised assembly.
In most cases, this really doesn't matter in the slightest.
But sometimes it does - albeit very rarely these days.
Even in modern embedded hardware you can end up needing to hand-optimise (or even hand-write) assembly segments.
Compilers can never produce code as efficient as hand-optimised assembly.
A ridiculous claim. Of course they can in some circumstances, such as when a compiler produces code as efficient, by whatever metric you like, as possible, and thus as efficient as hand-optimized assembly.
What's more, they can generally do it faster. They can do it fast enough that they can profile the execution of a wide range of alternatives and pick the best-performing one. They can run against CPUs with hardware tracing and see exactly what's going on with pipeline stalls and cache misses and coherency.
Is there still sometimes an opportunity to improve performance by hand-tuning? Sure. But to claim that it always beats the compiler is foolishness.
Not sure why the downvotes; Infernoz is totally right. We've got 16 MHz processors with 2 KB of RAM running C binaries today (any Arduino or bare Atmel embedded processor), generated by GCC - it seems they would be far better off getting a modern compiler environment set up to target Voyager's processor. It would probably be easier than training up someone to use the ancient toolchain they have. I assume they have equivalent processors on the ground so they can test, so I don't see the big deal in doing that.
If you don't realise why he's been so royally downvoted, oh well.....
I'm sure you'd just love the reputation of being the guy who lost 2 deep space probes that had gone the furthest of any manmade object and had been doing just fine for decades. Through your bright idea for how Things Could Be Done Better.
Seems you couldn't be arsed to upvote him, either.
We've got 16 MHz processors with 2 KB of RAM running C binaries today (any Arduino or bare Atmel embedded processor), generated by GCC - it seems they would be far better off getting a modern compiler environment set up to target Voyager's processor. It would probably be easier than training up someone to use the ancient toolchain they have
So.... you're volunteering to rewrite all the assembly in C?
And, you're assuming that someone is willing to do the work to target that processor.....
Bliss! Great language.
AFAIK, unless you have C# or Java or other modern languages, none of the recruitment agencies will have a clue as to what you are talking about. For example, Algol and its derivatives.
I have a PDP-11/83 and a MicroVAX in my garage, with a FORTRAN IV-Plus compiler as well as BLISS, COBOL, etc.
Retro stuff, and sadly no front panels, but still proper computers. I love programming in MACRO. Far more pleasurable than C or Java or... just about anything in widespread use today.
Pascal is another language I like to use.
Is any of this use for finding a job? Nah. Not a chance.
But the principles of writing 'tight' code that you have to learn with these memory-restricted systems can make the code you write today a lot faster.
Currently I'm working with a number of awful XSLTs that are just ****. Even Coral-66 is better than that sort of shite.
unless you have Csharp or Java or other modern languages none of the Recruitment Agencies will have a clue as to what you are talking about
Oh, come on, very very few of the Recruitment Agencies have any clue about anything either their clients (aka victims) or jobseekers (aka victims) are talking about.
I'd better be AC, as I may possibly have need of one in desperation
1989s_coder - Genuinely interested in what you say. Care to amplify?
I see lots of people my age who are doing contracting or consultancy.
I know I am skilled, but I am unsure of how they go about getting a similar or larger income than a full-time job. Forgive my naievity (spelling?)
I worked on a system where, bizarrely, my employer had been contracted to convert a system from CORAL-66 to FORTRAN-77. Never really understood why. Rumour had it that the CORAL-66 compiler for the PDP-11 was shite. Also there was a lot of PDP-11 assembler (using 2 completely different macro libraries) and some assembler for some obscure SIMD array processor.
Small boys, 3 way gotos for goal posts, isn't it? Hmm?
I used to like Coral66. The m/c that I was using at the time (an ICL 1904S) had a limit of 2K for user programs, and I managed to fit a lot of Coral66 into that space (although I did use PLAN macros in the Coral66 source to squeeze even more power out of it). My first high-level language was Fortran IV using WATFOR. I am sure that there are still thousands of programmers from 'my' generation still working.
There used to be a saying that a good programmer could write in Fortran in any language.
So, NASA, if you want us, just reply - we are ready and able.
Burroughs, Unisys, NEC, CDC, Honeywell - anything but IBM.
And then when Honeywell and Fairchild were bought out, "Fair Well, Sweet HoneyChild."
Burroughs was, to my recollection, the first company to actually make a commercial computer that used virtual memory.
God this is all so much fun. I hope it keeps on going until my dying day.
NASA for an old Z80 programmer?
Used to enjoy doing interrupt-driven programs; it gives you a nice feeling when you have 64 ms to execute the interrupt code and return from it before the next interrupt occurs.
(Please note... bad things happen when it takes 65 ms... like having to calculate the number of clock cycles for each route through the code to see exactly which route, and by how many cycles, you are too long.)
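That "count the cycles down every route" exercise is easy to mechanise, by the way. Something like this - toy block costs, not real Z80 timings for any particular routine - is all it takes to prove the worst-case path still beats the deadline:

```c
#include <stdio.h>

/* Worst-case path check for an interrupt handler: sum the T-state
 * cost of each basic block along every route and compare against
 * the budget. All block costs below are invented for illustration. */
#define CLOCK_HZ  3500000L                        /* e.g. a 3.5 MHz Z80   */
#define BUDGET_MS 64L
#define BUDGET_T  (CLOCK_HZ * BUDGET_MS / 1000)   /* 224000 T-states      */

int main(void)
{
    long entry = 58, exit = 49;        /* shared prologue/epilogue cost */
    long routes[][2] = {               /* per-route body costs          */
        { 180000, 0 },                 /* fast path                     */
        { 170000, 60000 },             /* slow path: body + retry loop  */
    };
    for (int i = 0; i < 2; i++) {
        long total = entry + routes[i][0] + routes[i][1] + exit;
        printf("route %d: %ld T-states, budget %ld -> %s\n",
               i, total, BUDGET_T,
               total <= BUDGET_T ? "OK" : "TOO LONG");
    }
    return 0;
}
```

Here the slow path comes out over budget - exactly the "65 ms" situation, found at your desk instead of in the field.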
But I'm still a dab hand at robot programming of one variety or another, so I guess brushing up on Fortran to command another robot shouldn't be too hard.
"Re: Replace technology drudgery by automated life-cycle convention. "
Not sensible. There are two big objections (that would scuttle this technique even if reliability were not a concern, which it most definitely is).
1) Custom hardware. So, you get GCC to emit code for this CPU (and make sure it handles any corner cases properly). You still must support the various 1970s-era hardware on this probe. Does it use interrupts or polling? DMA? I/O ports? Some other mechanism? It's unlikely the existing code has separation between the "OS", "drivers" and "application code"; this is an embedded CPU running a dedicated task in a limited-RAM environment.
2) They have an existing, working code base, and are not looking to make radical changes or rewrite from scratch; they're not looking to port other stuff to run on this CPU either. So, to port this to C, you'd need people knowledgeable in assembly and (I guess) Fortran anyway, to decipher what the existing code does and write up a description of its behavior. They would then write a C implementation of that behavior. Then they would cross their fingers that GCC's build of this code would still fit in the same RAM that just managed to hold the hand-written assembly implementation.
So, is NASA *really* hiring or is this one of those "woe is me...." type articles? I'm in my early 30s, know assembly (x86, PDP-8, 6502) and Fortran (and have a "Fortran IV with WATFOR and WATFIV" book just sitting on the shelf), have no problem with working on older hardware, and have full respect for the Voyager I and II probes and what they have done. I did several reports through my educational years as Voyager II flew past Uranus and Neptune and even got to talk to Professor Gurnett (who was largely responsible for the Plasma Wave instrument on board the Voyagers).
I live 40 miles from KSC and work at a large software company. The stories from ex-NASA tech folk that have escaped to either here or Lockheed-Martin down the road would cause all your hair to fall out.
You think the bureaucracy and managerial incompetence wherever you work is bad? Try a government agency that doesn't really have a mission any more, with funding to match, that is used as a major political football by Congress, and that doesn't have any weight to defend itself, unlike the military.
Then at the other end, you've got contractors that are like 100-year-old vampires sucking any money that's left, and knowing they don't have to produce because the project they're working on will be canceled by Congress next year. The order of the day is "Powerpoints and more Powerpoints"
Yeah, that's a ball 'o fun.
Don't forget this is the agency that's left data from old missions to rot on tapes until the equipment to read them is long gone, and people have to pull shit like the Lunar Orbiter Image Recovery Project (LOIRP) (http://www.moonviews.com/) as a semi-clandestine operation in an abandoned McDonald's, using crowdfunding to scrape together whatever remaining equipment and expertise they can to recover data before it's permanently gone.
Oh, and this is also the bag 'o dicks that decided NASA TV didn't need to show the GPS launch this morning, because that couldn't possibly be of interest to its constituent audience of space fans. Both of 'em.
I'm sure this position has been open for a long time, and any one with the skills also has the sense enough to avoid a job for $35k/year in that sort of environment.
"A Structured Approach to Fortran 77 Programming"
I think that's the text I have too. I'd check but it's down in the basement somewhere and I don't feel like spending half an hour grubbing it out. Haven't had to use Fortran in many a year.
Not that I ever did a lot - mostly wrote test programs for the XGKS Fortran binding, and similar tests of inter-language calling.
These days, if I wanted to do any number-crunching, I'd probably use R or Julia. Not that there isn't still a place for Fortran, of course - there's a vast body of extant special-purpose code, and still a pretty large skill pool. But a lot of scientific computing seems to be moving to R or other languages.
Whenever my PC stutters and splutters, or requires a bazillion gigabytes just to read my mail, I curse modern programming. Oh yes, I understand it and all the techniques. It's just that programming has degenerated into a 'throwing mud at a wall to see what sticks' process, and a big part of the blame can be laid at the feet of misapplied object methodology. This technology got perverted to suit the needs of graphical interfaces, and like all universal tools, now everything's a nail to it.
So I'm one of those older programmers (thankfully mostly retired). We didn't use a lot of assembly even in the '70s - it was more a '60s thing - and if we did have to use it we used macros to tidy it up a bit (resulting in that universal macro assembler aka 'the C programming language'). There were higher-level languages about, but these were for applications, not for systems or real-time work. I tended to avoid Fortran - it's a dog's breakfast of a language - preferring Algol-based languages and certain types of non-procedural language for the things I had to do back then.
Realistically not much has changed over the years, except that the new hires where I've been working can't function without an IDE, can't work with scripting languages, and haven't a clue how to work with things that don't come with complete support libraries. (Basically, all they're fit for is putting a few buttons and text boxes around a ready-made library.) (And obviously they've never heard of Tk.) (...And you should have seen the reaction when I used a bit of FORTH for something... everyone knows it's impossible to make language interpreters that are smaller than a few tens of megabytes.)
So I'm a bit of a curmudgeon. So what. Just bear in mind that the reason why I'm "almost" retired is that I'm still somewhat irreplaceable. It's nice to be wanted, even if it is a bit of a drag (my time is worth more than their money).
Have an up-vote :) I wish I could give 100 votes for the "universal macro assembler aka 'the C programming language'" though!
I also share some of your views on FORTRAN: great for scientific work due to its built-in support for maths and complex numbers and its extensive libraries (NAG & IMSL, etc.), but it had some horrible attributes as well (implicit typing, the joys of GOTO being used far too often, being able to enter a function at multiple places, etc.).
" (Basically, all they're fit for is putting a few buttons and text boxes around a ready-made library)"
Never a truer word spoken IMHO, based on my 35 years in this industry and recent entry into the world of "retired but always interested in something that might be interesting"
Mine is the one with the MACRO-11 manual in the pocket!
When I started programming as somewhat of a hobby, all you needed was an assembler and a C compiler (for the mundane stuff). PC games needed assembly for speed, as a C compiler just wouldn't do it - too much bloat built in.
When Win98 hit, as far as employers were concerned... assembly was dead on the PC. Contractors who only knew Visual Basic and some other high-level languages drove the coffin nails home.
It's good to hear that there's still hand assembly going on with the embedded stuff. Clean, fast code is from a bygone era, it seems.
I guess in a lot of cases they don't have much choice, they have a good enough job and that company often wants them to keep doing what they need done. No offers of new projects or training on things as they come along.
I'm as guilty as any. I have not pushed myself to change jobs, as life has been OK enough here, and the steps in my relevant knowledge have come not by planned progression but by projects coming along that I end up doing. Hence learning a new skill, like how to write "C code" in Python...
Reminds me of my first real programming task back in the early '80s: Fortran IV on a PDP-11 (LSI-11/73) with lots and lots of MACRO-11 assembler. My supervisor Norman introduced me to the discipline of programming the hardware of the machine as libraries of MACRO-11 assembler which had the ability, at assembly time, to check the args passed down to each segment of assembler code. So I could write with maximum code efficiency, but also benefit from calling a sequence of these code segments in a way that looked like a high-level language.

These also had to deal with fun things such as overlays, where memory regions were swapped in within the 64K address range. I didn't have to resort to using self-modifying code techniques to save on memory usage, though, which wise old Norman had had to resort to on earlier PDP-8 machines.

I don't really see what is so difficult about this project of programming using Fortran IV and assembler. It was an absolute fun time back then, being in total control of the machine hardware: counting cycles, understanding what the CPU is doing, dealing with interrupts, DMAs, overlays, in/out ports, and passing arguments between Fortran IV and the MACRO-11 libraries of assembler code routines I wrote. Oh, and writing all of the user interface in Fortran IV to run on a VT-11 VDU. I ended up writing every line of the code myself in a period of 6 months, and it was all done as my year in industry in the third year of my degree course. Three years prior to this I had no clue whatsoever about how computers worked, and was more interested in pentatonic blues scales, playing guitar in a rock band! The gig at NASA sounds like a fun gig!
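Norman's trick of having the macro library check arguments at assembly time has a direct modern echo, by the way: in C11 you can get the same "fail the build, not the mission" behaviour by wrapping _Static_assert in a macro. A loose analogy rather than MACRO-11 itself, with an invented port_write() standing in for the real hardware poke:

```c
#include <stdint.h>

/* Assembly-time argument checking, C11 style: the "macro library"
 * refuses to build if a caller passes an out-of-range constant, much
 * as a MACRO-11 macro could raise an error during assembly. Only
 * works for compile-time-constant arguments, like assembler args. */
static void port_write(uint8_t port, uint8_t value)
{
    (void)port; (void)value;   /* stand-in for the real I/O poke */
}

#define SET_PORT(port, value)                                         \
    do {                                                              \
        _Static_assert((port) < 8, "port number out of range");       \
        _Static_assert((value) <= 0xFF, "value must fit in 8 bits");  \
        port_write((port), (uint8_t)(value));                         \
    } while (0)

int main(void)
{
    SET_PORT(3, 0x7F);      /* compiles fine */
    /* SET_PORT(9, 0x7F);      would fail at compile time */
    return 0;
}
```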
OK - I have FORTRAN programming skills. I was taught Motorola 68000 assembler while at uni.
I have done IBM JCL, and have used the MVS and VM/CMS operating systems.
I have skills in a whole range of operating systems and languages gained over my career.
So where are the companies shovelling money at me?
I see these things all the time - there is a shortage of blah-de-blah skills and banks are shovelling money at people with those ancient skills. Well, show me the money.
Sorry, I'm just so cynical about all these "there is a shortage of IT skills" stories.
Look at the modern counterpart of "coding" where "code schools" are tempting ordinary Joes into short courses in "coding apps". How many of them really end up in well-paid jobs?
According to Google, Lance A Leventhal is 69 years old and alive. He's the Guru, I'd have never passed my degree without his books on Assembly Language programming. NASA should go seek him out and he could train up any bright young team in case Voyager keeps going into the 2030s or beyond...
Hah! I wrote in Fortran IV in 1976... for use on minicomputers like the Perkin-Elmer and Interdata, while working in Colorado. Our storage disk was so large and highly fussy. You can imagine my delight with the floppy disk and PC of the 1980s. Ancient? I was a female baby boomer who took time out from the sex, drugs and rock'n'roll life of the 1970s to learn a new language in digitizing the earth. Those brain cells are now gone or written over.
There are many linguists who can help my favorite space travelers... the Voyagers! They were the best thing to come out of the 1970s: the highly rated gold record, and the only way we 'bags of water' could leave the planet at that time.
Some humans actually love the ancient languages -- nothing to 'snark' about.
My native language: FORTRAN. My father worked at GE and got me access to their timeshare computer when I was 12. Used FORTRAN at UC Berkeley, along with PDP-11 assembler, for my Computer Science degree. I don't want the job though, even though it sounds interesting. I am a full-time Monash University student getting my RN degree. I finish when I am 67: no retirement, no money. Will get a full-time job. If I put 55% of my gross income away each year I can afford a house by the time I am 84. Yippee! At least there is longevity in the male line of my family. Should enjoy retiring at about age 98 for a quiet retirement, if my investments don't go sour.
tjb
My native language is FORTRAN also.
I taught myself to program at the age of 12. My father managed a PDP-11/45 in the lab where he worked. The lab wrote a medical diagnostics package on the PDP - in FORTRAN!
The FORTRAN manuals I used were for an HP-85 desktop computer, borrowed from one of the professors.
I guess I was a lucky wee boy to get to use such an advanced system at the time - and store my programs on disk and to run on an interactive system using a terminal.
When I later went to uni I learned Pascal by punching card decks and having them wheelbarrowed up the hill to a batch-processing ICL machine.
The weans of today don't know they are born - waiting till the next day to fix a syntax error!
And having the skills to repunch an already used card!
All of us engineers who graduated from UNB Fredericton, New Brunswick in the mid-'70s took Fortran, as well as WATFOR and then WATFIV.
Many of us working in western Canada's oil and gas industry are being forced to retire due to a lack of O&G engineering-related work, so we would be willing to help out.
These 'oh, there is no one who understands this obsolete technology' stories always need to be taken with a huge pinch of salt, in my experience, once one does some Googling to find out a bit more. Still, they liven up a dull, foggy Monday.
I bet there is an awful lot of FORTRAN 77 still in use - probably FORTRAN 77 tweaked over the years to use the odd bits of later FORTRAN. Probably more of it than native modern Fortran.
And, of course, as other people have said, there are an awful lot of embedded processors with this sort of memory constraint floating around. I don't think we are at the stage yet where we need to conscript the people writing new games for the Atari 2600 (128 bytes of RAM).
What NASA will probably have to do is find someone with good general computer science knowledge/experience and get them to learn the system, under guidance from the guys who are nearing retirement.
Many of us who work on embedded systems are constantly having to learn new and even old technologies. It's a different piece of silicon every few months in my job. :)
Programming in Fortran and assembly languages, with and without macros, with a small memory footprint where every bit counts, for Voyager... sounds like fun to me :)
Remember 4K memory? Yes, 4K, with cores. The 8051 HDK had 256 bytes of memory and a hex keyboard to program it. 64K was a luxury. What would you do with so much memory?
Maybe Suzanne Dodd at NASA should consider crowdsourcing support for the old software. Make it open source and allow people to collaborate on any projects she needs done. Wouldn't it be fun to be part of a NASA project in some small way?