A portion of US$55,000?
Those of us still (re)coding large projects in Fortran for Fortune 500s don't even get out of bed for that kind of small change. Might want to offer a trifle more if you're serious, NASA.
NASA wants scientific computer experts to take a look at one of its oldest software suites in the hope they can speed it up. The code in question is called "FUN3D" and was first developed in the 1980s. It's still an important part of the agency's computational fluid dynamics (CFD) capability, and had its most recent release in …
Those of us still (re)coding large projects in Fortran for Fortune 500s don't even get out of bed for that kind of small change.
Indeed, that's rather funny. I write Fortran for a living (it is still very much alive in scientific and technical programming, greybeard quips notwithstanding), and $55K will just about pay for a complete professional rewrite of a smallish numerical code. I'd estimate no more than 15K LOC - assuming that you need it to actually work correctly, to be documented, to have at least a minimal-coverage test set, and not to fall to pieces the moment I walk out of the door. Also assuming this is all done at cost, or even a bit below (i.e. by internal staff).
Messing about with a million-LOC monster which started out in the dinosaur era, and had multiple sheds built on its back since then is definitely extra.
Unfortunately, a large fraction of the academic and government research universe is addicted to having a continuous supply of disposable students and postdocs. Invariably, they are led to believe that if they just stick around for a little longer and finish their degree and a couple of postdocs, despite being paid a fraction of what their work is actually worth, the bright future is right ahead. Well, it never actually is: the academic system in most of the world has been operating in a steady-state regime (or even contracting, if we talk about the UK) for decades. There are no jobs waiting for these students and postdocs; in fact, by pursuing their "training", they destroy those jobs - why hire a competent, fully-trained member of staff when the same money will get you four eager students or two postdocs?
Call it internal outsourcing, if you will.
Quite true, but isn't that the way it's supposed to work? Disposable students and postdocs b****r off to industry and make a decent living with far better fundamental training and breadth than industry would/could give them, whilst a minuscule number of really brilliant people stay in academia doing pure work, and a slightly greater number stay in academia to train the next lot of bright-eyed grads? That is, after all, what Universities are for - to provide an endless supply of trained people.
Quite true, but isn't that the way it's supposed to work? Disposable students and postdocs b****r off to industry and make a decent living ...
Why should it be like this? We do not train people as surgeons with an expectation that they will eventually work as nurses, do we? Then why do we train people for advanced degrees in physics, chemistry, biology, or math, knowing full well that nearly all of them will never be able to find gainful employment in the field they studied?
More importantly, why do we keep lying to the young people about their actual job prospects when we encourage them to get STEM education?
I was once invited to a career planning fair for third-year university students (chemists, if I recall correctly), to offer a perspective from a government research laboratory. I was really shocked to see how little idea these rather intelligent and motivated young people had about their actual prospects in research. Only a handful of them had any hope of landing an academic job, and maybe a few more would ever work in their profession once they got their degrees - yet this was pretty much the first time anybody at the university had actually told them so, and shown them the actual numbers.
I was not terribly surprised not to be invited again the next year: the universities have an obvious vested interest in misleading the students into taking useless degrees. For the vast majority of students entering university, getting trained as a bricklayer (Have you tried to hire a half-decent bricklayer recently? What they charge is murder, and you actually have to get on a waiting list to get one!), plumber, electrician, or a chef would be a far better choice.
> Only a handful of them had any hope of landing an academic job, ...
> yet this was pretty much the first time anybody at the university has actually told them so,
Can anyone capable of getting a reasonable degree from a reasonable establishment really be so dumb and naive as not to realise that if there are, say, 100 students a year coming through your college, and ten staff members who will hope to have a 40-year career, then jobs in academia are going to be few and far between? None of us were in the slightest doubt about it in biological sciences at Imperial in the 70s. Hell, the not-really-very-funny joke was that you needed a PhD to wash bottles at the top establishments in zoology.
I recently found a nice little Blog engine that required regular compiling to update the blog with static pages. I'm no programmer but was pleased to see that FreeBSD has a compiler for modern Fortran in the base distribution so no problem. Things don't change as fast as we tend to assume.
Just don't try to use that compiler on legacy code, like SPICE. Through the misguided idea that people use Fortran to write new code, they 'updated' it, to the point where it fails miserably on memory management, misinterprets the common block and bitches about Hollerith formatting. I believe it isn't actually a compiler, but a preprocessor, which converts your Fortran to 'C', then runs that through gcc. I've seen the same approach used by other Fortran so-called 'compilers'.
"Hang on - if the code dates back to the 1980s then the CPUs it's running on now are about a million times faster than the original ones. If it was fast enough 30 years ago, why is it too slow now?"
They try to analyze more complex problems. And it can take several weeks to run.
If you have some old code, and you need to find someone with a very specific set of skills to modify it, isn't the usual route to, you know, offer a paid job? Of which I'd guess this is maybe 1/4 the expected annual salary, at the low end.
Can I try this if I have something I need done? Come and cut my grass! There's a chance you'll win a portion of £5!
Come and stack shelves in my shop for a year! I'll maybe give you a bit of £2/hour at the end of the year!
"[...] without any loss of accuracy."
Fortran programs are notorious for producing different results from the same data when you change something.
In the "old days" different computer models had different precisions for floating point. Even with the same number of bits for mantissa and exponent there could be different ways for the hardware to perform a single operation.
The same applies to the software - change the order in which parts of a calculation are performed and the intermediate rounding can change the end result.
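A minimal illustration of that ordering effect (Python shown for brevity; its floats are the same IEEE-754 doubles a Fortran REAL*8 gives you):

```python
# Associativity does not hold in floating point: regrouping the same
# three constants changes the rounded intermediate, and so the result.
a = (0.1 + 0.2) + 0.3
b = 0.1 + (0.2 + 0.3)

print(a)       # 0.6000000000000001
print(b)       # 0.6
print(a == b)  # False
```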
Do "industry standards" eliminate those problems - especially for programs that may have originated before such standards were set?
The first "live" program I wrote was a floating point emulator to test the results of a prototype mainframe. That taught me that unless you copied the hardware microcode ogic exactly then the results did not always match in marginal conditions.
On an earlier model - a customer did designs for turbine blades. As they were not done very often there was a precautionary practice of repeating the last data set as a benchmark. One day the results didn't match and it was assumed there was a subtle problem with the computer. To everyone's surprise it was the previous results that had been wrong.
Do "industry standards" eliminate those problems - especially for programs that may have originated before such standards were set?
To a degree, yes. In particular, IEEE-754 is a great solace to anybody performing numerical math. The careful design of the Fortran numerical evaluation semantics also helps, as it allows you to specify the evaluation order of your expressions in great detail if this is desired.
However, all of this only goes so far when you are dealing with intrinsically unstable problems. In particular, CFD is quite notorious for numerical instabilities and tendency to chaotic behaviour - so I suspect that NASA's definition of "no loss of accuracy" is slightly more intelligent than "produces exactly the same outputs given the same inputs".
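As a toy illustration of that chaotic sensitivity (nothing to do with FUN3D itself - just the classic logistic map, sketched in Python), two runs that start 10^-12 apart end up completely decorrelated:

```python
# Logistic map x -> 4x(1-x), a standard example of chaotic dynamics:
# a perturbation of 1e-12 roughly doubles each step, so after ~40
# iterations the two "identical" runs no longer agree at all.
def trajectory(x0, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(4.0 * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.3, 100)
b = trajectory(0.3 + 1e-12, 100)
max_gap = max(abs(x - y) for x, y in zip(a, b))
print(max_gap > 0.1)  # True: the runs have fully diverged
```

That is why bitwise comparison of outputs is the wrong test for this kind of code.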
"However, all of this only goes so far when you are dealing with intrinsically unstable problems. In particular, CFD is quite notorious for numerical instabilities and tendency to chaotic behaviour "
Well worth reading about Edward Norton Lorenz's experiences with early computers and the 'butterfly effect': https://en.wikipedia.org/wiki/Butterfly_effect
In particular, CFD is quite notorious for numerical instabilities and tendency to chaotic behaviour
<tease>Isn't that what turbulence is?!?! Sounds highly appropriate for CFD...</tease>
I'm quite glad to not have had to write a CFD package... they're hard
Don't quite get why this is particularly a Fortran problem - it's simply a result of floating point maths. In fact to expect bit wise identical results for every compiler/library/etc. combination in complex codes like those discussed here displays some ignorance about the nature of the beast, and that's before we even think about talking about parallelism. And in practice for the performance these guys need your choice of language is Fortran, C or C++, nothing else will cut it however much quiche you eat, and for all 3 the floating point issues are similar.
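To the parallelism point: a reduction computed in a different order is a different floating-point computation. A tiny sketch (Python doubles, so the same applies to C, C++ or Fortran):

```python
# Summing the same three numbers in two orders: sequentially, the 1.0
# is swallowed by the big value's rounding; pair the big values first
# and it survives. A parallel reduction reorders terms exactly like this.
xs = [1e16, 1.0, -1e16]

sequential = (xs[0] + xs[1]) + xs[2]   # 1e16 + 1.0 rounds back to 1e16
regrouped  = (xs[0] + xs[2]) + xs[1]   # big values cancel exactly first

print(sequential)  # 0.0
print(regrouped)   # 1.0
```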
IIRC variables named i, j, k were automatically of integer type.
All variables were implicitly declared. Any name starting with I, J, K, L, M, or N was integer, anything else was float, unless declared otherwise.
It was common practice, on compilers that supported it, to declare everything to be some unlikely datatype (such as 64-bit boolean) so that mistyped variable names would be picked up by the compiler.
      IMPLICIT LOGICAL*64 (A-Z)
Of course ... that's going back a way. Modern Fortran, which isn't quite the oxymoron you might think, doesn't need such tricks.
The easiest way to tell a Real Programmer from the crowd is by the programming language he (or she) uses. Real Programmers use FORTRAN. Quiche Eaters use PASCAL. Nicklaus Wirth, the designer of PASCAL, gave a talk once at which he was asked "How do you pronounce your name?" He replied, "You can either call me by name, pronouncing it 'Veert', or call me by value, 'Worth'." One can tell immediately from this comment that Nicklaus Wirth is a Quiche Eater. The only parameter passing mechanism endorsed by Real Programmers is call-by-value-return, as implemented in the IBM 370 FORTRAN-G and H compilers. Real programmers don't need all these abstract concepts to get their jobs done -- they are perfectly happy with a keypunch, a FORTRAN IV compiler, and a beer.
Don't quite get why this is particularly a Fortran problem - it's simply a result of floating point maths.
You are to some extent right, the problem is fundamentally the floating point maths.
However, the reason why it is a Fortran problem is that you typically use Fortran on problems where the maths is critical, because Fortran gives you lots of control over numbers (both integer and floating-point).
The number-handling capabilities of C and C++ are rudimentary in comparison, but that is because these languages (and C specifically) were never intended to be much more than high-level assembly languages. So if you want Fortran-like number handling in C, you need to build the relevant libraries...
The storm was predicted - it was just not predicted as a hurricane, which it wasn't, but the press didn't give a fuck about that. I lost some of my roof, and cried because I didn't park my car under that tree I knew would come down in a storm. The best bit was walking back from the pub at 2 in the morning into the teeth of the gale and almost being able to touch the pavement standing up!
Or, maybe even more to the point, will be able to handle jobs that are ten times more complicated in the same time. It doesn't matter how fast your hardware is, complexity of CFD work can always be increased to overwhelm it, so more efficiency is always beneficial.
And you know there are always folk who fancy a very different project in their spare time, and if it brings in a few quid that's nice too. Someone who works on high frequency trading might fancy a bit of part time hobby work on a project that actually benefits humanity instead... Or for that matter someone burned out from working on that stuff who is sitting around getting bored.
"Or, maybe even more to the point, will be able to handle jobs that are ten times more complicated in the same time" - err, probably not.
If you are doing finite element modelling in four dimensions (three space + time), then a factor of 10 will not even allow you to halve the mesh size.
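The back-of-envelope arithmetic, assuming an explicit scheme where the time step shrinks with the mesh spacing:

```python
# Halve the mesh spacing in a 3D + time simulation:
refine = 2
cell_growth = refine ** 3      # 8x more cells in three space dimensions
step_growth = refine           # stability (CFL) forces 2x more time steps
work_growth = cell_growth * step_growth
print(work_growth)  # 16 - so a 10x speedup doesn't even buy a halved mesh
```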
For now I do not need a walking frame. I occasionally need wrist, back, ankle or knee support, but it is not because my beard is gray (which it is). It is because I have done something stupid playing sports or working on a DIY project.
Now, on the subject at hand - there are plenty of other gray (and not so gray) beards like me who can do Fortran and are not in need of a walking frame. The distinction is that we have NOT graduated in CS. Fortran was a necessary evil in doing the numerical methods courses in Chemistry and Physics up to as recently as 10 years ago (maybe still is). Some of this code is now in use in banking, computation of prices for airlines, and god knows what else, so it is fairly well maintained too.
However, as far as $55k goes - you gotta be kidding, right?
... to get the story into El Reg, and to get commentards talking.
I expect most of us here are precisely those commentards who have worked in FORTRAN at some point in our careers. In my case it was back in the '80s and I haven't revisited since.
There's a lot of software around for which a 10x speed increase for little effort is entirely realistic, even unambitious. Not that I'm going to be tempted to this one: my memories of working with FORTRAN codebases range from nightmare to, at best, neutral.
"There's a lot of software around for which a 10x speed increase for little effort is entirely realistic"
There's lots of code out there where a 10x increase in speed is trivial, but if I had to guess where to find it I wouldn't have picked high performance CFD code from NASA!
Unless there is a trick like getting it to run on a GPU then this sounds like one hell of a challenge.
I was once asked to look at an aircraft simulator that was supposed to run 20 iterations per second, but the first version took 2 seconds per go. By rearranging the maths I got it going at 10 times per second. No need to resort to assembler.
So this kind of work needs numerical analysis skills as well as programming skills.
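"Rearranging the maths" can be as mundane as the classic Horner's rule - a generic sketch, not the actual simulator code: the naive polynomial evaluation below does a power computation per term, the rearranged one a single multiply and add per coefficient, with the same answer up to rounding.

```python
def poly_naive(coeffs, x):
    # coeffs[i] is the coefficient of x**i
    return sum(c * x ** i for i, c in enumerate(coeffs))

def poly_horner(coeffs, x):
    # Same polynomial, evaluated as (((c_n*x + c_{n-1})*x + ...)*x + c_0):
    # one multiply and one add per coefficient, no powers at all.
    result = 0.0
    for c in reversed(coeffs):
        result = result * x + c
    return result

coeffs = [1.0, -3.0, 0.5, 2.0]
print(poly_naive(coeffs, 1.5))   # 4.375
print(poly_horner(coeffs, 1.5))  # 4.375
```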
I am not eligible for this NASA job: (i) I'm a Brit (ii) I'm retired, and would have to be very tempted.
Thought you could learn all you needed to be able to program in Fortran from a single library book (or have I been misled by Hollywood again?)
Alternatively, on the subject of greybeards and $55k ... there's an ancient IBM mainframe in the computer history museum, donated to them, which they wanted to try to get running again. Someone had the idea of asking via the IBM retirees' newsletter whether anyone wanted to volunteer - the museum was astonished to get over 50 responses within days of the newsletter going out, and between them they got it all running again. For some greybeards, "working on history" is not always a matter of money!
"For some greybeards "working on history" is not always a matter of money"
Indeed. I might do some such thing for the fun of it or to help a friend (if I had any) working at the museum.
It's possible I would do it just for the chance to play on their kit.
"I wonder how many instances of SET-at-HOME we could run on this one without them noticing?"
I'd guess it's more likely that someone with a lot of experience might just be able to spot where in the program accurate speedups can be obtained - modern Fortran isn't slower, it's just easier to do more generic things, so people do that rather than partition things properly. A lot of 3D problems can be mostly solved in 8-bit, but only a few bits need to be calculated in 64-bit, and working out which is which can give you an order of magnitude speedup but leave people gibbering at the code cos that's not how it was done in college.
I'd guess being able to separate those sections out, and re-code them in a safe way, are the tricky bits of the task.
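A rough illustration of why knowing which parts can live with low precision matters (Python, simulating IEEE single precision with a float32 round-trip via struct - an illustrative trick, not how any real CFD code does it): in a narrow format, small contributions to a large accumulator can vanish entirely.

```python
import struct

def f32(x):
    # Round a Python double to the nearest IEEE single, and back.
    return struct.unpack('f', struct.pack('f', x))[0]

# In single precision, the spacing between floats near 1e8 is 8.0,
# so adding 1.0 falls below half a ULP and is absorbed every time.
acc32 = f32(1e8)
for _ in range(1000):
    acc32 = f32(acc32 + 1.0)

acc64 = 1e8
for _ in range(1000):
    acc64 += 1.0

print(acc32)  # 100000000.0 - the thousand increments vanished
print(acc64)  # 100001000.0
```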
But since I'm not a 'merican it's strictly an academic exercise to me.
I recall the chapter Steve McConnell wrote in "Code Complete" about implementing DES on an original-spec IBM PC (4.77MHz) to encode a 9600bps data stream at real-time rates, and how many times he rewrote it to get the speed-up he needed.
Personally, the first thing I'd do would be to benchmark it, so I knew it was working properly to begin with. That way I'd know any errors later on were mine, and I could always roll it back to a known-good version. I know: "It's been running for decades, how can it have bugs in it?" I'll leave people who've used CFD codes for decades to answer that one.
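Establishing that baseline can be as simple as a golden-output comparison with a tolerance (a generic sketch; the tolerances are placeholders, and exact equality is usually too strict for floating point):

```python
import math

def matches_baseline(new, golden, rel_tol=1e-9, abs_tol=1e-12):
    # Compare a fresh run's outputs against a stored known-good run.
    # A tolerance captures "same answer" without demanding bitwise
    # identity, which reordered floating-point maths won't give you.
    if len(new) != len(golden):
        return False
    return all(math.isclose(a, b, rel_tol=rel_tol, abs_tol=abs_tol)
               for a, b in zip(new, golden))

golden = [1.0, 0.5, 0.25]
print(matches_baseline([1.0, 0.5, 0.25 + 1e-15], golden))  # True
print(matches_baseline([1.0, 0.5, 0.26], golden))          # False
```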
"[...] but leave people gibbering at the code cos that's not how it was done in college."
That suggests the triangle of "speed - cost - maintainability".
A young friend with a gift for maths - graduated in Theoretical Physics. Couldn't get a job - so did a Masters in Computing. Landed a job as a Java programmer in The City.
His complaint at Xmas was that he spent too much time supporting other people's applications which had suddenly decided to fail when something changed.
He has now managed to career shift into the company's real business - and is now an investment manager.
Their website states: "Since the code is owned by the U.S. government, it has strict export restrictions requiring all challenge participants to be U.S. citizens over the age of 18."
This is probably nonsense. (I suspect any citizen of a NATO country can look at it.)
And which version is "Modern Fortran"? The later ones have all sorts of good stuff (vide the Wikipedia article on Fortran).
Well, I was wondering what real access to the source code a challenge participant actually gets. Given the small reward pot on offer, obviously NASA don't really value the code. So I'd suggest enterprising challenge participants upload the source code to GitHub or SourceForge...
"It's been running for decades, how can it have bugs in?"
There was a Harry Hargreaves cartoon in Punch magazine many years ago. It showed a woodpecker handing out a certificate to guarantee that almost all bugs had been removed from a tree. When asked why it was not "all bugs", he replied something like "I will need some more business in six months' time".
I bought the original and had it framed for display on my desk in the IT support department.
The software-house-as-bug-creation-machine paradigm.
Not so sure it applies to in-house S/W, although I've no idea where this thing comes from. Worst case is it was written by the back end of a 4GL (some of which produced filthy code - GOTOs to GOTOs, just awful).
Establishing a known baseline sounds like a desperate waste of time in the days of agile but I'm well aware of just how dumb I can be. Having it means if it all goes pear shaped you can always roll back.
Medicine would be a very different discipline if the first line of the Hippocratic Oath read "Just start cutting until something looks like it's working"
I'm retired. And have a lot of spare time.
Where can I find this "Modern" Fortran compiler and DL for free?
Heck, in the old days I used to translate Fortran's object code (IBM 360) into BAL to make it go faster.
Ahhh... the good ol' disassembly days! Apparently you go to jail for that now.