Climate model code is so outdated, MIT starts from scratch

When faced with climate models coded in Fortran in the 1960s and 70s, MIT decided there was no more cobbling together left for the ancient code, so they tossed it out to start fresh. It's an ambitious project for MIT professors Raffaele Ferrari and Noelle Eckley Selin, who submitted their Bringing Computation to …

  1. 9Rune5

    "Dynamically typed"

    That feels like a monumentally bad idea.

    If you are in such a hurry that a compiler flagging your mixed-up types impedes your "progress", then you must be doing something wrong.
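To make the complaint concrete, here is a minimal Python sketch (function and data names are hypothetical) of the kind of mix-up a static compiler rejects before the program runs, but a dynamically typed language only reports when the faulty call actually executes:

```python
def total_rainfall(readings):
    """Sum daily rainfall readings (intended to be floats, in mm)."""
    return sum(readings)

# Works fine with the intended types...
print(total_rainfall([2.5, 0.5, 11.0]))  # 14.0

# ...but a caller passing strings is not rejected up front (there is
# no compile step to object) and only fails at run time, inside sum().
try:
    total_rainfall(["2.5", "0.5", "11.0"])
except TypeError as exc:
    print("caught only at run time:", exc)
```

With type hints (`readings: list[float]`) a checker such as mypy would flag the second call statically, which is roughly the trade-off being argued about in this thread.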

    1. Richard 12 Silver badge

      Re: "Dynamically typed"

      It also prevents a lot of optimization - both size and time - which really matters when trying to deal with really large datasets.

      1. Kevin McMurtrie Silver badge

        Re: "Dynamically typed"

        Sometimes. It ruins some optimizations, but a JIT can be a big advantage with localized optimizations. Remember that even C++-like languages can have performance problems with virtual methods. A JIT doesn't have those limitations because it can add, remove, and recompile optimizations as needed for different implementations. It's how Java beats C++ in certain kinds of benchmarks.

    2. fg_swe Bronze badge

      Programming Fast Food

      If your application cannot kill anybody and does not store anything of much value (say, Facebook), then maybe it is justified to use a dynamically typed language.

      In automotive, aerospace, trains and medical machinery you had better use as much type safety as you can get.

      https://www.adacore.com/uploads/customers/CaseStudy_Eurofighter.pdf

      Apparently, many FORTRAN programs have issues with index errors, according to Sir Tony Hoare. Index errors are considered a typing problem, too.
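The "index errors are a typing problem" point can be illustrated: in a language with range types (e.g. Ada's `subtype Month is Integer range 1 .. 12;`) an out-of-range index is rejected or checked, while without such checking the error surfaces only when the bad index executes - or, worse, not at all. A sketch with hypothetical data:

```python
# Twelve hypothetical monthly mean temperatures, January..December.
monthly_means = [3.1, 4.0, 6.5, 9.2, 12.8, 15.9,
                 18.0, 17.7, 14.9, 11.1, 6.8, 4.2]

def mean_for_month(month):
    # 'month' is meant to be 1..12, but nothing in the types enforces it.
    return monthly_means[month - 1]

print(mean_for_month(12))  # 4.2 - fine
# mean_for_month(13) raises IndexError, but only when that line runs.
# mean_for_month(0) is nastier: index -1 silently returns December's
# value - the classic off-by-one that range types would catch.
print(mean_for_month(0))   # 4.2 - silently wrong, no error at all
```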

      1. fg_swe Bronze badge

        Static Typing / 2

        https://www.adacore.com/papers/ada-and-spark-at-welch-allyn

        https://www.adacore.com/papers/case-studies

        https://www.adacore.com/industries/automotive

  2. Joe W Silver badge

    A language they cannot read?

    Good grief! I picked up Fortran pretty quickly when I used it for a project. As soon as you know a handful of languages, reading and understanding them is no big deal - if the code is well written. You can write nasty code in any language.

    So they want to run regional climate models on small devices? Statistical or dynamical models? I mean, this is not something people already do, because global climate models just don't have that resolution, especially for longer runs (common era or longer, at a good temporal resolution, not the sped-up stuff) - which is a problem, and will remain problematic, because the maths tells us it is. And no, this is not new. Although you can do a lot with simple stuff a la Hasselmann ('74? cannot remember)...

    I should really check out the project, I feel the article glosses over these details. I'm pretty sure the project leads know more about this than I do, and it will be an interesting read!

    1. Doctor Syntax Silver badge

      Re: A language they cannot read?

      Yup. My introduction to computing was a 5 day FORTRAN course of which I missed the first day.

      Not being able to read it isn't much of a recommendation.

      1. herman

        Re: A language they cannot read?

        If they are so dumb that they cannot read FORTRAN, then it actually explains a lot.

        1. Someone Else Silver badge

          Re: A language they cannot read?

          You can write FORTRAN in any language -- including FORTRAN.

    2. John Sager

      Re: A language they cannot read?

      Part of the problem with existing models is that they are cruft upon cruft, many levels deep, so starting afresh is a good idea. I too wouldn't use a dynamically typed language for something that needs to be very optimised to be useful at scale. Since FORTRAN is still used for that type of fast vector program, it would still be a better choice than Julia. Modern FORTRAN has a lot of goodies from modern language design - a far cry from my one-term course at uni in 1970 based on McCracken's book of that era!

      1. EBG

        cruft upon cruft to many levels,

        OK .. but .. is this the same climate science no -one is allowed to question because it is settled ?

      2. LybsterRoy Silver badge

        Re: A language they cannot read?

        Nah - the main problem is the older models don't automatically say carbon dioxide is the problem!

        DUCK & RUN

    3. Falmari Silver badge

      Re: A language they cannot read?

      @Joe W “Good grief! I picked up Fortran pretty quickly when I used it for a project. As soon as you know a handful of languages reading and understanding them is no big deal - if the code is well written.”

      They are not programmers; they will not know a handful of languages. These students are probably postgraduates specialising in a field (climate science) after getting their first science degree. Modelling may only be a small part of their postgraduate studies and could be the first time they have had to do serious programming. So, unless they have worked in Fortran before, they may not be able to pick it up quickly.

      My first job out of university was working at a research institute on a government-funded project. Everyone on the project except me (just a BSc in Computer and Mapping Science) had a doctorate.

      Anyway, one thing I had to do was take computer models (large and small scale) that modelled nitrate concentration in the soil and turn them into modules that the system I was programming could call. Some were in C one was in Pascal and yes there were some in Fortran.

      Now, while the models worked (not that I understood the maths), elegant code they were not. The code was poorly structured, with no comments ("who needs comments, I know what it is doing," said one of them of his model, coded in Fortran). None of them knew any language other than the one they had programmed in. The reason is that they came through their science degrees and then picked up some coding when they needed it.

      Now, I expect it is not as bad as that today, as that was almost 30 years ago. But they will not be programmers; the science they studied was not Computing.

      1. Anonymous Coward
        Anonymous Coward

        Re: A language they cannot read?

        They're supposedly engineering students. I challenge you to find a University in the world with an Engineering course that doesn't include a healthy dose of programming in the course work...

        1. Falmari Silver badge

          Re: A language they cannot read?

          @msobkow

          They don't just do Engineering. Look at their schools and what they do: Literature, Philosophy, Biology, Chemistry and others that would not involve engineering.

          http://web.mit.edu/education/schools-and-departments/

          1. LybsterRoy Silver badge

            Re: A language they cannot read?

            You seem to be implying that students studying Literature, Philosophy, Biology or Chemistry are going to be building climate models.

            Doubtful.

            1. Falmari Silver badge

              Re: A language they cannot read?

              No, I was not implying that at all, just pointing out that MIT does not automatically mean Engineering student and not all MIT students are on Engineering courses.

              I would expect students building climate models to be science students studying Earth, Atmospheric and Planetary Sciences, not engineering students as @msobkow supposed.

          2. rnturn

            Re: A language they cannot read?

            That's called a well-rounded education.

            I wish the people I work with nowadays had done something besides just coding before they got into the workforce. Most cannot express a coherent thought in writing, and what little documentation is available is mostly unreadable sentence fragments cut-n-pasted from various documents that are kinda-sorta related to the task at hand. As one old hand recently revealed, some parameters in the database are referred to in these documents by as many as four different phrases; if you're lucky, you may be working with a document that uses all four. Or a set of instructions for an installation process may be out of order (i.e., don't forget to do //this// three steps ago). Crap like that is something someone who was forced to take a literature and/or English composition course would likely notice and fix before foisting it on their co-workers.

      2. Doctor Syntax Silver badge

        Re: A language they cannot read?

        You probably discovered it's possible to write a BASIC program in any language.

      3. Roland6 Silver badge

        Re: A language they cannot read?

        >These students are probably postgraduate specialising in a field

        Only problem with your argument...

        The original models were also written by... Climate Scientists...

        So the real question is what has changed in the field of undergrad & postgrad climate science that people no longer study programming languages relevant to climate modelling...

        1. jake Silver badge

          Re: A language they cannot read?

          The original models were, indeed, written by climate scientists. It just so happens that those scientists were (mostly) post-grads specializing in climate science.

          To answer your question, the thing that has changed is that "climate science" is political these days, with very little actual science involved. One must create pretty graphs to ensure funding. Don't need to waste time programming to make pretty graphs.

          1. nijam Silver badge

            Re: A language they cannot read?

            > One must create pretty graphs to ensure funding. Don't need to waste time programming to make pretty graphs.

            And for anyone upset that I upvoted this, it's equally true of any subject area with a political or populist press interest.

        2. Falmari Silver badge

          Re: A language they cannot read?

          @Roland6 “The original models were also written by... Climate Scientists...” Valid point +1.

          But I don’t think this follows “no longer study programming languages relevant to climate modelling”

          I would think a programming language forms a bigger part of today's courses in that field than it would have 50 or even 30 years ago. It's just that Fortran is no longer considered the relevant language and has been supplanted by newer ones. Even 50 years ago, Fortran was not the only language used.

          “CLiMA made the determination that old climate models, many of which were built 50 years ago and coded in Fortran”

          The point I was making was that we might think it shocking that they can't read Fortran, but we are looking at it as programmers. We have been taught, or learnt, how to program irrespective of language. They have been taught how to use a specific tool to do a job: in this case, to turn their model on paper into a model on a computer.

          So, show them a model in a language they have not seen and they will not be able to read it. But these are bright people; given a little time they will be able to do it. And if they can do it for one language, they will be able to do it for another.

          1. Roland6 Silver badge

            Re: A language they cannot read?

            @Falmari - I think the point you are making is that, having created the models, we've gone through a period where climate students learnt how to use the models - maintenance being left to a small and shrinking group of experts, i.e. those that created the model.

            The question, which the article leaves unanswered, is whether for the last 10 years Julia has been the language taught at MIT to climate scientists, and whether it is largely replacing Fortran in applied science.

            Because you are right as (long-in-the-tooth) computing people we naturally think of "our" languages rather than those which are of use in application domains outside of Systems and web development.

        3. nijam Silver badge

          Re: A language they cannot read?

          > The original models were also written by... Climate Scientists...

          Or by one of their mates "who knew a bit about programming".

    4. Dr. Ellen
      WTF?

      Re: A language they cannot read?

      Lord, lord - my first language was Fortran. But this was long ago, circa 1960. Now there are so many languages running around we have the Tower of Babel under construction.

      1. jake Silver badge

        Re: A language they cannot read?

        The Fortran you and I learned in 1960 was not the Fortran of today. It's kind of false to think of today's Fortran and 1960 Fortran as the same language ... Rather, Fortran is a long sequence of individual languages, all named Fortran.

        What they have in common is the ability to crunch lots of data on machines with massive IO. Which is the point of Fortran. We have spent the last 60+ years updating it, speeding it up, and generally making it work better for things like climate data, and making it exploit bigger, faster, better hardware. It's an ongoing process.

        And the fizz-heads are going to throw away this massive installed base and community because Uni Students think Fortran is HARD?

        Cry me a fucking river. Learn something, or give your seat to someone else who is interested in learning, you twits.

      2. Roland6 Silver badge

        Re: A language they cannot read?

        Landin’s seminal paper “The next 700 programming languages” was published in 1966; the foundations of the Tower of Babel were being laid circa 1960....

        1. jake Silver badge

          Re: A language they cannot read?

          During the meanwhile, out here in the Real World ...

    5. nijam Silver badge

      Re: A language they cannot read?

      Good grief yourself.

      I picked up a bit of Latin at school - but it still makes no sense to use it conversationally. Or to write novels in it. Or anything in between, actually.

  3. AlanSh

    Is ML the answer?

    There's probably more historical data lying around on global and local weather than on anything else. It sounds as if a generic ML (I won't call it AI) environment should be able to pick it up and predict future trends from the historical data with reasonable accuracy. So, why invent another model?

    Alan

    1. Tom 7

      Re: Is ML the answer?

      ML seems to be pretty good at extracting patterns from data that we don't yet have a clue about. But just cos we don't know what those patterns are doesn't mean they don't predict things more accurately than our current science.

    2. Doctor Syntax Silver badge

      Re: Is ML the answer?

      Systematic meteorological record keeping isn't that old historically and the oldest records aren't that well distributed geographically. Some of the best records are from the UK which has notoriously changeable weather which must make them a tad noisy. The coverage of proxies such as tree rings and varves will be better.

      1. Missing Semicolon Silver badge

        Re: Is ML the answer?

        However the analysis of tree rings and varves is less well controlled!

    3. This post has been deleted by its author

    4. m4r35n357 Bronze badge

      Re: Is ML the answer?

      Attempting to model a (pathologically) chaotic system with half-precision arithmetic? Ha-ha!

      Historical data has low precision.

      Extrapolation is _not_ modelling.

    5. Anonymous Coward
      Boffin

      Re: Is ML the answer?

      No, it is not.

      ML is perhaps a very nice approach for weather but is bad for climate. Reason for this is two things (1) purpose of weather prediction is to predict weather, not to be science (no-one cares if prediction system uses tealeaves), and (2) training data is endless – you predict weather n days out and in n days you have training data to improve that prediction. And because weather prediction is short timescale you can assume say composition of atmosphere is constant on those timescales.

      Climate is different. First of all training data is less easy to come by: if you predict climate in 2046 under assumptions of atmospheric composition changes xyz over time and other changes tuv over time, then to get training data for this both takes a very long time, may be undesirable experiment to do, and is mutually exclusive with any other training data unless you can clone universe, which is expensive. You can check model against past climate and of course people do this.

      And also climate models are (must be) science. If your model predicts x then it is important that you can go into model and say, OK, model is solving these equations using this solver and this data and this is why x is prediction. And if you think x is unrealistic you can then say, OK what is wrong with our solver, is our cloud model inadequate (yes it always is) and so on and so on. If all you can say is 'there is this opaque blob of trained weights in some neural network' that is very unsatisfying.

  4. Tom 7

    Reinventing the square wheel

    I don't know for certain, but I'd imagine the 43-year-old BLAS and other Fortran libraries that do the heavy lifting won't be replaced with Julia ones as fast as the Fortran ones, which have been optimised for SIMD and other modern GPU speedups for, well, generations. I doubt even the most experienced Julia group has the faintest idea how to match that.
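For what it's worth, the usual pattern in newer languages is to call those libraries rather than replace them. NumPy, for example, dispatches matrix products to whatever optimised BLAS it was built against (OpenBLAS, MKL, etc.), so high-level code still runs the decades-tuned Fortran/C kernels underneath. A small sketch:

```python
import numpy as np

# The @ operator on float matrices routes to the BLAS gemm routine,
# not to a Python-level loop.
a = np.arange(6.0).reshape(2, 3)   # [[0, 1, 2], [3, 4, 5]]
b = np.arange(6.0).reshape(3, 2)   # [[0, 1], [2, 3], [4, 5]]
c = a @ b

print(c)  # [[10. 13.]
          #  [28. 40.]]
```

Julia's standard linear algebra takes the same approach, shipping with OpenBLAS by default - so "rewriting in Julia" does not necessarily mean rewriting BLAS.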

    1. jake Silver badge

      Re: Reinventing the square wheel

      Telling that Julia was built specifically to be able to easily call Fortran (and C) libraries.
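Indeed: Julia's `ccall` can invoke a C or Fortran symbol with essentially no glue code. For readers more at home in Python, `ctypes` is the analogous (if clunkier) foreign-function mechanism; a sketch calling `sqrt` from the system C math library on a Unix-like system:

```python
import ctypes
import ctypes.util

# Locate the C math library portably (the name varies: libm.so.6,
# libSystem.dylib, ...); fall back to the current process's symbols.
path = ctypes.util.find_library("m")
libm = ctypes.CDLL(path) if path else ctypes.CDLL(None)

# Declare the C signature so doubles are marshalled correctly.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(9.0))  # 3.0, computed by the C library, not by Python
```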

    2. Bitsminer Silver badge

      Re: Reinventing the square wheel

      I think there's a bit too much negativity about the Julia ecosystem here.

      Both Fortran and Julia are LLVM under the covers. And GNU Fortran is competitive with the LLVM kind.

      But when running very large models using large clusters of speedy machines (400Gb Ethernet between nodes, for example) a lot of the problem is not CPU speed but managing to overlap CPU and IO and networking all at the same time. Otherwise one of those becomes the bottleneck and you can't easily make the calculation run faster.

      Julia allows a computational software architecture that readily provides a lot of overlap. And it has some flexibility in how it is applied or tuned. Unlike the Fortran case.

      Plus, a rewrite is a good career move.

      1. Roland6 Silver badge

        Re: Reinventing the square wheel

        >Both Fortran and Julia are LLVM under the covers.

        So Julia is effectively enhanced C and constrained by having to be used on C-biased OS's...

        Interesting to note that none of the official releases support anything other than an x86 level workstation, personally I was expecting it to have executables for super and hyper-scale computers, as used by the (UK) Met Office.

        1. jake Silver badge

          Re: Reinventing the square wheel

          "So Julia is effectively enhanced C and constrained by having to be used on C-biased OS's..."

          I'm sad to inform you that the above sentence is not even wrong.

    3. Kane
      Joke

      Re: Reinventing the square wheel

      All right, Mr. Wiseguy, you’re so clever, you tell us what colour it should be.

  5. EmilPer.

    students can't learn Fortran ?

    students can't learn Fortran ?

    ... what do they do to them at the university these days ? lobotomy ?

    I know python is bad but not that bad

    1. DS999 Silver badge

      Re: students can't learn Fortran ?

      Probably more that they don't want to learn Fortran, and they think they'll get more of the best and brightest involved in climate modelling if they use a more modern language.

      1. jake Silver badge

        Re: students can't learn Fortran ?

        Yeah, but Julia?

        Has anybody followed the money?

        1. Tom 7

          Re: students can't learn Fortran ?

          I don't think Julia is the problem. The problem is people dissing other languages rather than learning them, or even learning a language completely. Most people, due to their career paths, tend to live in a small corner of a language and diss other languages largely because they don't know them well enough. I've been learning this crap since 1974 and am still learning, and I like to think I'm not a slow learner.

          I doubt there is anything to be gained from rewriting stuff in another language when it's been possible to use the best bits of other languages since pretty much day 1. Learning other computer languages opened up more of the computing space for me and I think it would do the same for others. Rewriting things you don't actually need to is just an act of self-harm which allows you to paint yourself further into your little corner. What next, rewriting machine code in Julia?

          1. 9Rune5

            Re: students can't learn Fortran ?

            In my experience, a code base will degrade over time.

            It is only recently (a decade ago) that my team started doing code reviews. That isn't 100% true as we did them before that, but it was only around 2011 that we finally got access to tools that made code reviews viable (through pull requests). With gated check-ins we finally had a way to formalize the process. (Tooling is important!)

            So on that project, there is a clear watermark between the 'old-school' code and the new code that put generics, LINQ, ORM and other nice stuff to proper use. The original team had used this project as a way to familiarize themselves with C# and it showed.

            But in the end patching the old mess was deemed unviable. Especially as we wanted to target a new platform. We were going to do a complete rewrite, but in the end our company was sold off and we were put on other teams (where we face the same kind of decisions and are indeed rewriting everything from scratch).

            Having recently worked on a Python-based project, I cannot stress enough that I will avoid dynamically typed languages like the plague in future projects. If there is any chance the code base will live more than a couple of years, I'm going to have a say in what dev lang to use.

            As for Fortran, I have never been exposed to it. I've read good things about various compilers and I do not believe performance is much of an issue. I do believe that the language hasn't evolved much over the past couple of decades, and I doubt it makes any sense to choose Fortran for a new project in this day and age. But I could be wrong about that! It would be interesting to see a comparison of 50 lines of code that is considered to be best-in-class Fortran code and what the equivalent Java/C#/Pascal code would look like.

            1. Anonymous Coward
              Anonymous Coward

              Re: students can't learn Fortran ?

              > Having recently worked on a Python-based project, I cannot stress enough that I will avoid dynamically typed languages like the plague in future projects.

              I'm certainly no expert in Python and, as such, am perhaps one of the last people who should be telling its maintainers what to do, but one thing I think it could benefit from is something like Perl's "use strict". I can't remember how many times I've had code run for ages and then crash because of a typo in a variable name.
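That failure mode is easy to demonstrate: a misspelt name on the right-hand side is just an unbound variable, so nothing complains until the typo'd line actually runs (the function and names below are made up for illustration):

```python
def label_run(count):
    message = f"processed {count} records"
    if count == 0:
        # Typo: 'mesage' is simply an unknown name. No error is raised
        # until this rarely taken branch executes.
        return mesage.upper()
    return message

print(label_run(3))  # runs happily for ages...

try:
    label_run(0)     # ...then the rare path finally crashes
except NameError as exc:
    print("crashed only now:", exc)
```

Linters (pyflakes, pylint) and type checkers catch most of these statically, which is about the closest Python gets to `use strict`.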

              > It would be interesting to see a comparison of 50 lines of code that is considered to be best-in-class Fortran code and what the equivalent Java/C#/Pascal code would look like.

              Many years ago, I was working in an engineering group that was headed up by a CS grad while all the engineering group and their managers were staffed by either EEs or Physics majors. When new desktop computers were ordered, they were all bundled with MS Pascal compilers. The very next thing the group managers did was order Fortran compilers for their teams. Over the next year or so, I never saw *any* of the Pascal compilers that were not in the original shrink-wrap.

      2. Doctor Syntax Silver badge

        Re: students can't learn Fortran ?

        Not being able to learn a simple language like FORTRAN doesn't sound like the best and brightest. Maybe that was one of those throw-away lines said without thinking and intended to diss the language.

      3. Version 1.0 Silver badge
        Thumb Up

        Re: students can't learn Fortran ?

        The students will be busy learning Julia and getting it working to model the climate, so it's going to be a while before we get to start verifying that the new code replicates the original climate model. However, it will be a very good experience for the students and will probably make them very good coders - it will be a good lesson:

        println("hello whirled")

        But a FORTRAN programmer would just be writing the original code in Julia; FORTRAN programmers can write FORTRAN programs in any language. That's a traditional joke, but the reality is that creating accurate and functional code is determined by the writer, not the language.

    2. LybsterRoy Silver badge

      Re: students can't learn Fortran ?

      -- ... what do they do to them at the university these days ? lobotomy ? --

      No, they teach CRT.

  6. jake Silver badge

    Julia? Really? I thought Uni was to help prep kids for the rest of their lives?

    "he's realized that "traditional climate models are in a language [MIT] students can't even read.""

    So basically, instead of teaching the kids Fortran (which is hardly difficult), they are going to throw out 50+ years of climate modeling, redo it all from scratch, and instead teach the kids Julia?

    That makes zero sense. At least with Fortran, once they realize there is no money in climate modeling research, the poor kids will be able to get a job at the financial institution of their choice, and at a very high rate of pay. Julia? Not so much ...

    1. Jellied Eel Silver badge

      Re: Julia? Really? I thought Uni was to help prep kids for the rest of their lives?

      I dunno, it's probably a better use of money and resources than a lot of climate stuff. There, the big question is ECS, or how much warming for how much CO2. Current models diverge from each other, and more importantly from reality.

      So the good news is Thermageddon may be averted. If CO2 sensitivity is low, we'll never hit 2C warming from CO2. Creating a new model from first principles might be a way to either validate existing models, or figure out why they diverge.

      1. M.V. Lipvig Silver badge

        Re: Julia? Really? I thought Uni was to help prep kids for the rest of their lives?

        "So the good news is Thermageddon may be averted. If CO2 sensitivity is low, we'll never hit 2C warming from CO2. Creating a new model from first principles might be a way to either validate existing models, or figure out why they diverge."

        Funny. The goal never was to prevent a "Thermageddon", but I like that term. The goal is to establish a global GDP tax, showing they finally figured out how to tax the air that we breathe IF they can convince enough people that it's needed. It's also why the dire consequences are never placed more than 50 years out, and why predictions already made never panned out. And, more importantly, even those preaching it hardest don't believe their own words. They all live by the sea. If they really believed, they would all live inland, higher than even the most dire predictions claim.

        Once the very tiny GDP tax is in place, and getting it established is the hard part, raising the tax later to rather large percentages will be easy. And once it's in place, it will never go away.

        1. codejunky Silver badge

          Re: Julia? Really? I thought Uni was to help prep kids for the rest of their lives?

          @M.V. Lipvig

          "The goal is to establish a global GDP tax"

          I do wonder how people keep falling for this stupidity. So far they struggle to define a climate problem (cooling, warming, runaway warming... oh damn, still not right, let's call it 'climate change'), then do the opposite of what could be considered 'working' according to their 'science', and then we watch the main pushers buy property they claim will be sunk and do activities they claim to be 'bad'.

  7. Doctor Syntax Silver badge

    The current model isn't good enough, so they're going to train the new one on the results of the old. Why?

    1. jake Silver badge

      Because they've finally trained the old to provide the results they are looking for would be my guess ... can't have the new upsetting the apple cart, now can we? Might lose their funding.

      Yes, as a matter of fact, I do find it sad that I have such a dim view of academia.

  8. robert lindsay
    Meh

    I mean sure, rebuild your model however you want. However, if you pick a language like Julia then the only people who can tinker with it are those who are both climatologists AND Julia users. This may limit your pool. For now.

  9. Anonymous Coward
    Anonymous Coward

    links

    other link: https://news.mit.edu/2022/computing-our-climate-future-0413 "April 13, 2022"

    Computing our climate future

    To put global climate modeling at the fingertips of local decision-makers, some scientists think it’s time to rethink the system from scratch.

    --------------------

    As far as I can see, this is the GitHub repo: https://github.com/CliMA

    website: https://clima.caltech.edu/

    https://clima.caltech.edu/publications/

  10. bombastic bob Silver badge
    Megaphone

    I just have to LAUGH at the level of cluelessness here...

    FORTRAN code, when properly written, is as efficient as (and possibly MORE efficient than) code written in any OTHER compiled language.

    I taught myself FORTRAN back in the late 70s on a university computer. In the early 90s I was in the M.I.S. department at a company writing stuff in FORTRAN for the ASK/MANMAN system. NOT! HARD!

    But the problem is NOT the lingo in which these "climate change" models were written. IT! IS! THE! FACT! THAT! THE! ENTIRE! PREMISE! IS! JUST! PLAIN! WRONG!

    WHY does ANYONE believe that these FAILED CLIMATE CHANGE MODELS, which have been WRONG time and time again for DECADES, are worth ANY kind of credibility? BLAMING THE PROGRAMMING LINGO is *JUST* *PLAIN* *LAME*

    Making an issue about FORTRAN is MEANINGLESS. It's those nearly-always-wrong algorithms that are BOGUS that need to JUST GO AWAY.

    It's JUST as bad as claiming that CO2 is causing anthropogenic climate change in the FIRST place, *ESPECIALLY* when you look at the REAL science: that CO2 is at equilibrium at about 0.04% of the atmosphere (refer to basic chemistry when considering an equilibrium state) both biologically AND chemically, is at a STABLE LEVEL that varies a bit according to temperature (and does NOT cause the temperature change) because of the solubility of gasses in large bodies of water, *AND* has a *TERRIBLE* infrared absorption spectrum for the IR radiation that normally escapes the earth to cool it at night, specifically THOSE FREQUENCIES corresponding to ACTUAL TEMPERATURES found on earth. In short, a TERRIBLE greenhouse gas that VARIES VERY LITTLE from ANYTHING that humans do, and is basically a NON-FACTOR unless your goal is to CONTROL PEOPLE by making it EXPENSIVE to do normal everyday things with travel, electricity, and FREEDOM.

    Sea levels are NOT rising and NOT flooding coastal cities by 2000, or 2020, or any OTHER time for that matter.

    Just HOW MUCH hype and nonsense (to justify price gouging on energy and restricting our freedom) must we COLLECTIVELY SWALLOW before WAKING THE HELL UP?

    1. Pascal Monett Silver badge

      Re: I just have to LAUGH at the level of cluelessness here...

      So, you're saying that climate change isn't happening?

      Back in the 1980s, I clearly remember trudging to school in winter through 10-20cm of snow for weeks on end.

      This winter, we got barely 1cm of snow and it lasted a day.

      I think there's a change, there.

      1. MiguelC Silver badge

        Re: I just have to LAUGH at the level of cluelessness here...

        Climate change is not synonymous with hotter weather (climate <> weather); what it means is that extreme events, be they storms or extreme cold or hot spells, will be more frequent, and that is something we can observe happening.

        1. LybsterRoy Silver badge

          Re: I just have to LAUGH at the level of cluelessness here...

          Ah yes. Initially the climate change gurus shouted (climate <> weather); now they shout (climate = weather) when they want it to. Hmmmm

          1. codejunky Silver badge
            Thumb Up

            Re: I just have to LAUGH at the level of cluelessness here...

            @LybsterRoy

            "Ah yes. Initially the climate change gurus shouted (climate <> weather) now they shout (climate = weather) when we want it to. Hmmmm"

            Climate change has done so much damage to the terms 'scientist' and 'science'. The problem with global cooling... global warming... runaway global warming... climate change is how it never seems to work. Weather doesn't equal climate, unless it's an event, in which case it's proof of climate change. And you're not a climate change denier for not believing the climate changes, but because you don't believe in a particular theory that has failed to predict anything.

            It does seem to have brought together mud hutters, anarchists and human haters.

            1. Anonymous Coward
              Anonymous Coward

              Re: I just have to LAUGH at the level of cluelessness here...

              "The problem with global cooling...global warming"

              This would result in things being utterly melted? How ironic.

          2. Anonymous Coward
            Boffin

            Re: I just have to LAUGH at the level of cluelessness here...

            I know it is hard to understand, especially when you don't want to. Climate is the long-term average of weather. This means individual weather events ('it is hot today', 'there is a big storm') are not climate, but the long-term averages of them are. As climate changes, both the probabilities of certain events and the actual values of various characteristics of those events will change.

            It is sometimes (but not always) possible to ascribe individual events or their characteristics to changes in climate with good probability. As modelling and understanding improve this becomes easier, but it is very seldom certain. An example might be long hot dry (or long cold wet) spells in mid-northern latitudes: because the climate warms far more in the Arctic than at the equator, the jet stream slows (less angular momentum to shed), which means it becomes much more wiggly, and this can result in these long trapped weather patterns. So we would expect these to become more common, and it seems they are. But it is still questionable to say that individual events are due to climate: their statistics are, certainly.

            It is easier to ascribe things which are directly sensitive to these averages, for instance plant growth. It is very well attested that the growing season in the UK is far longer than it was even in recent history, for instance.

        2. nijam Silver badge

          Re: I just have to LAUGH at the level of cluelessness here...

          > ... what it means is that extreme events, be it storms, extreme cold or hot events will be more frequent

          No, it's not that either.

          In the UK, we recently started giving "storms" names. (Not that anyone from a region with proper storms would think they're that bad.) The intention is to create the impression that storms here are now more serious and more frequent than they used to be. It is a fairly blatant marketing technique. As an exercise for the reader: what is being marketed?

      2. LybsterRoy Silver badge

        Re: I just have to LAUGH at the level of cluelessness here...

        There is a <sarcasm> small </sarcasm> difference between man-made climate change and natural climate change.

        Of course the climate is changing; it's been changing since the planet was formed.

        shout on BOB

        1. Arthur the cat Silver badge

          Re: I just have to LAUGH at the level of cluelessness here...

          Of course the climate is changing; it's been changing since the planet was formed.

          Sure, but rate of change matters as well. One of the fastest natural shifts in CO₂ we know about is the Azolla event in which atmospheric CO₂ dropped from 3500 ppm to 650 ppm in 800,000 years. That's a ΔCO₂ of 3.56 ppm/kiloyear. In my lifetime CO₂ has gone from 313 ppm to the current 417 ppm, a ΔCO₂ of 1.53 ppm/year. That's 430 times faster than an unusually fast natural event and is definitely not a normal terrestrial process at work.
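The arithmetic above is easy to verify; a quick sketch (the roughly 68-year span implied by the 1.53 ppm/year figure is an assumption here):

```python
# Compare the Azolla-event CO2 drawdown rate with the modern rise.
azolla_rate = (3500 - 650) / 800_000   # ppm per year, over 800,000 years
modern_rate = (417 - 313) / 68         # ppm per year, over an assumed ~68-year lifetime

print(azolla_rate * 1000)              # ~3.56 ppm/kiloyear
print(modern_rate)                     # ~1.53 ppm/year
print(modern_rate / azolla_rate)       # roughly 430, as stated above
```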

          1. Jellied Eel Silver badge

            Re: I just have to LAUGH at the level of cluelessness here...

            That's an assumption. A problem with climate science is that in UN terms, climate is 30-year chunks of average weather. The further back in time, the sparser the data. Keeling's CO2 data give a snapshot of one location; otherwise, historical data comes mostly from ice cores. Those have challenges, and probably can't give absolute values for CO2 in the atmosphere.

            But they can give a general idea of trends, and some potentially inconvenient truths, like warming preceding CO2. I think that's to be expected given warming stimulates natural CO2 emissions, which dwarf human activity. Kind of a pet hate. Dogma assumes a pre-industrial equilibrium, data shows natural climate change in the past.

            So we had the Medieval Warm Period followed by the Little Ice Age. They're also inconvenient, so some climate activists deny them outright, or try to claim they were limited geographically or to the northern hemisphere. But that explanation lacks any compelling mechanism that supports localised climate change.

            Plus there's inconvenient evidence uncovered by glacial retreat. That's exposed plant and tree remains dating back to the MWP, and strongly suggests it was warmer there in the past, or they wouldn't have grown. That also fits historical events, like the Viking's settling Greenland. Nice weather when they got there, but climate changed, it got colder and the colony failed.

            But such is politics. Despite being a climate sceptic, I support new approaches to climate modelling. I'm also intrigued by any potential for its localisation. The UK has a long CET weather record, so it would be interesting if this model could be run at a UK level and compared with historical data.

            I also think teaching students a statistics based approach is a good thing. Climate science is all about data and signal analysis, and mistakes have been made in the past, eg Steig's Antarctic warming, or my favorite, the 'novel' Rahmstorf Smoothing. Both examples where warming was caused by bad math, rather than CO2.

          2. 9Rune5

            Taking the average

            We need to know how much CO2 can naturally change within a century.

            Within longer time frames, what mechanism would keep CO2 from increasing violently one 'moment' and decreasing nearly the same amount the next? I would find it very strange if, within a 1MY timespan, there weren't several spikes along the way.

            1. Anonymous Coward
              Boffin

              Re: Taking the average

              what mechanism would keep CO2 from increasing violently one 'moment' and decreasing nearly the same amount the next?

              For that to happen something must emit and/or absorb vast amounts of CO2: to push up atmospheric CO2 by 100ppm humans have emitted about 1.6E15 kg of CO2.

              Rapid increases on this scale would be large-scale burning or large-scale volcanism, for instance. Events like that show up in the geological record, and sometimes they do happen (the event we are in now will certainly show up in the geological record).

              Rapid decreases at this rate would require something which can absorb CO2 at a spectacular rate. I don't think we know of such mechanisms, but if they existed they would certainly also show up in the geological record.

              So we know some mechanisms which allow big increases; they show up, and occasionally (like now) they happen. We know no mechanism which can extract that much CO2 that quickly, but we are confident that if it existed it would show up too.
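The 1.6E15 kg figure can be sanity-checked with a back-of-envelope calculation; the atmosphere mass, molar masses, and the ~50% airborne fraction below are standard round numbers, used here as assumptions:

```python
# Back-of-envelope: how much emitted CO2 corresponds to a 100 ppm rise?
M_ATM = 5.15e18    # mass of the atmosphere, kg (standard round figure)
M_AIR = 28.97      # mean molar mass of dry air, g/mol
M_CO2 = 44.01      # molar mass of CO2, g/mol
AIRBORNE_FRACTION = 0.5   # roughly half of emitted CO2 stays in the atmosphere

rise_ppmv = 100e-6  # 100 ppm by volume, as a fraction
mass_in_atmosphere = rise_ppmv * (M_CO2 / M_AIR) * M_ATM  # kg of CO2 added to the air
mass_emitted = mass_in_atmosphere / AIRBORNE_FRACTION     # kg humans had to emit

print(f"{mass_emitted:.2e} kg")   # ~1.6e15 kg, matching the figure above
```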

              1. Jellied Eel Silver badge

                Re: Taking the average

                Don't forget changes in CO2 are gradual and natural, and natural fluxes vastly exceed human contributions.

                Keeling's data show clear seasonal variations, i.e. more CO2 during summer than winter. More warmth, more biological activity, so more natural CO2 from plants, soil bacteria etc. Ice core records like Vostok show 600ka of natural climate change, and appear to show CO2 increases following temperatures.

                Why climate changed in the past is a bit of a mystery, but a bit of a snag for the idea that CO2 drives climate. If effect precedes cause, that's a rather unscientific explanation. It's also a bit of a snag for large climate events like triggering or ending Ice Ages, i.e. explaining those by solar variation and/or orbital excursions.

                Of course it could be our ancestors experimenting with hydrogen production, given H2 has a Global Warming Potential of 11 vs CO2's 1. It seems strange that we're trying to prevent global warming by transitioning to a much more potent GHG.

                1. Anonymous Coward
                  Boffin

                  Re: Taking the average

                  Hydrogen has a lifetime in the atmosphere of one to two years, and any hydrogen leaked into the atmosphere means you are wasting it. CO2 has an atmospheric lifetime of hundreds of years and is an inevitable consequence of burning carbon. These two things are not the same.

                  And yes, it is expected that CO2 level changes would probably lag temperature in ice cores, since those changes were not driven by large emissions of CO2 but rather by various factors, most significantly Milankovitch cycles (orbital changes). CO2 concentration and climate are in fact correlated: either can drive the other. For recent glaciation cycles there was no huge civilisation puking out CO2, so we expect (and see) a lag. Now there is, and it is leading climate.

                  Ah well there is never any purpose arguing with denialists like you is there: it is like arguing with mud. I will not reply further.

                  1. jake Silver badge

                    Re: Taking the average

                    "Ah well there is never any purpose arguing with denialists like you is there: it is like arguing with mud. I will not reply further."

                    I've heard that very line from any number of other religious types.

                  2. Jellied Eel Silver badge

                    Re: Taking the average

                    Shame.

                    Hydrogen loves to leak. It's one of the challenges around production & distribution. CO2 doesn't have an atmospheric persistence of hundreds of years, more like <5 on account of solubility and the whole carbon cycle thing.

                    Milankovitch makes great booby traps. Ice Age cycles can't be explained by CO2 dogma, so another theory is needed. So obviously what happens is the Earth moves for everyone. We get further away from the Sun, it gets colder, then we move closer and warm up again.

                    Simple really.

                    Well, except for finding an explanation that allows the Earth's orbit to expand and contract. Perhaps the Sun goes on a periodic diet so its gravitational effect goes in cycles. Or perhaps it's the Sun's output that varies. But the IPCC dedicates about a page to denying solar variability. Which is junk science, given we've only been able to measure changes in spectral composition for less than a century.

                    But such is politics. Raises fun questions though. CET record goes back a few hundred years. Can you find the longest IR dataset? Or why WMO standard weather stations weren't all fitted with pyrgeometers as soon as climate 'science' turned into a multi-trillion dollar industry.

                    1. Anonymous Coward
                      Boffin

                      Re: Taking the average

                      Milankovitch cycles are not changing the orbital radius: they are changes in orbital eccentricity, axial obliquity and axial precession. Orbital eccentricity changes are due largely to Jupiter but also Saturn. These cycles are pretty well understood over short (few million years) times, though there is probably chaos on longer (billion-year) timescales. The end result of these is a change in northern hemisphere (where most of the land is) insolation, which drives climate change.

                      It is funny that you thought it was changes in radius: either you can't read the wiki page or you are lying, or both of course.

                      Also, everyone knows solar luminosity changes (in fact models of stellar evolution are things I have worked on, though for the early universe) and it would be extraordinary if such changes had not driven climate change. We know for instance that luminosity has increased very significantly over time: by a factor of at least 5/4 in the last 3 billion years, and the faint-young-Sun problem is something people worry about, I believe. And there will be shorter-term variations, no doubt. We also know that solar luminosity change since, say, 1950 is nowhere near enough to drive observed climate change since then, because we have good measurements of both things. That is what the IPCC section that you are lying about (or more likely did not manage to read) says.

                      So please: stop lying. And now I really will stop: arguing with liars and idiots is a waste of my time.

      3. 9Rune5

        Re: I just have to LAUGH at the level of cluelessness here...

        First you need to define the term "climate change".

        According to Cook's metastudy (2012?), the term implies "man-made".

        Unfortunately I do not think they answered the question of what term to use when discussing the changes in climate that happened on Greenland a millennium ago. I doubt very much that anyone thinks those changes were caused by any mammals. So what label to use?

        The climate has never been static, has it?

        Now that we've learned that many climate scientists are awful software developers -- how much faith exactly should we put into the climate models they've developed?

      4. 9Rune5

        Re: I just have to LAUGH at the level of cluelessness here...

        My wife is from Georgia. Her mum and brother still live there.

        We sent her brother a snow brush for his automobile more or less as a joke, but for the past couple of winters he has had good use for it. They cannot remember having had so much snow ever. His neighbors were impressed, because they had never even seen such a brush before.

        The coast of Georgia has certainly had more snow than what I've had on the west coast of Sweden in recent years.

        YMMV.

        1. jake Silver badge

          Re: I just have to LAUGH at the level of cluelessness here...

          During the meanwhile, I have my great grandfather's planting records, my grandfather's planting records, my father's planting records, and my own planting records.

          These are pretty detailed (temperature, rainfall, wind speed & direction, seeds planted, crop yield, etc.).

          Near as I can tell, since the 1850s there has been no appreciable change (outside statistical aberration). Even our current drought has historical precedence. I'm in Northern California.

      5. jake Silver badge

        Re: I just have to LAUGH at the level of cluelessness here...

        The Romans were growing wine grapes between Hadrian's Wall and the Antonine Wall. Try that today ... clearly it's now much colder than it was.

        On the other hand, when was the last time you attended a Frost Fair on the Thames? Clearly, the temperature has gone up.

        What a quandary ... The answer is that the climate doesn't "change" (a loaded, scary word), rather, it fluctuates. Not quite as scary.

        Perhaps you should ask yourself instead, who is trying to scare you into compliance ... and why.

      6. Martin Gregorie

        Re: I just have to LAUGH at the level of cluelessness here...

        A major problem is that all too many people simply don't notice or care what the weather is once the weather forecaster has told them whether they should wear something waterproof or beware sunburn.

        Sadly, that describes the attitude of almost everybody who lives in cities, which is most of us. These are also the people who deny climate change, because they have never taken any notice of the weather or its long-term trends; they've never needed to. As a result they don't understand the effect of weather on their lives or the consequences of it changing.

        The only people I know of who do notice weather are dependent on it for their livelihood, i.e. farmers and sailors, plus a few minorities such as hill walkers, some model flyers, and glider pilots whose leisure activities are critically weather-dependent.

        1. Jellied Eel Silver badge

          Re: I just have to LAUGH at the level of cluelessness here...

          I think you've inverted your theory.

          People who are exposed to the weather may be more sceptical than people who live & work in climate controlled offices in places like Penn State. There's a lot of historical weather data, as well as crop and tax records.

          Or just anecdotal stuff. As a kid in the '70s, I remember waking up, scraping frozen condensation off the inside of my window and seeing snow. Then listening to the radio to find out if school was closed and I could go play. Deep snow over my knees, which hasn't happened since. OK, my knees gained altitude since then. But older people have experienced more weather & climate than 20-something ecoterrorists who've been brought up on a diet of global warming dogma.

          1. jake Silver badge
            Pint

            Re: I just have to LAUGH at the level of cluelessness here...

            That was just beautiful, man.

      7. M.V. Lipvig Silver badge

        Re: I just have to LAUGH at the level of cluelessness here...

        I remember snow up to my backside in the late 70s, midwestern US. Same place, there are pics of my older brother running around outside in a diaper and nothing else in the 1960s because it was so hot outside - on Christmas day. Same area, TODAY, it was 37 degrees outside this morning. So, what's your point? The weather's been the same here for the last 60 years at least, some years hot in winter, other years not. Seems to run on a 7 year cycle.

    2. Bertieboy

      Re: I just have to LAUGH at the level of cluelessness here...

      When I left school in the early '60s, we were taught that atmospheric CO2 levels were of the order of 0.03%, not 0.04% - i.e. a 33% increase. Stable?

      1. bombastic bob Silver badge
        Boffin

        Re: I just have to LAUGH at the level of cluelessness here...

        0.03%, 0.04%, whatever. Any difference is ultimately based on round-off, when and where the data was collected, and the atmospheric and water body temperatures at the time. When it is warmer, the CO2 levels will be higher since a) bodies of water that retain most of the carbonates in the world have a lower capacity to contain them, either as CO2 gas or carbonates, and b) rain has a lower affinity for absorbing the CO2 out of the atmosphere. There are more factors also, but that's the point. 0.04% is a ballpark figure that is generally considered to be accurate. It is also TINY especially when compared to WATER VAPOR, which NOBODY is trying to control for some reason, and yet by my estimate changes in water vapor in the atmosphere have 100 TIMES THE EFFECT when compared to any corresponding changes in CO2. That is my opinion, yes, based on the IR absorption spectrum and the amount of it found in the atmosphere. But when you compare the SOCIETAL HYPERVENTILATION over CO2 and complete LACK OF CONCERN over humidity and clouds, I think my point becomes obvious. Right?

        and therefore if those HIGHLY INACCURATE MODELS, apparently MAKING BAD ASSUMPTIONS in the FIRST place, were re-written in PYTHON vs FORTRAN, it would make NO difference in how BAD they are. This is like assuming a geocentric universe and then making the math fit the planetary motion as observed from earth. Moon and Sun, OK, because they appear circular. Everything ELSE, however, was a Ptolemaic NIGHTMARE. ("But, but, math proves it!" - must have been what they were thinking, right?)

        1. Anonymous Coward
          Boffin

          Re: I just have to LAUGH at the level of cluelessness here...

          It is also TINY especially when compared to WATER VAPOR, which NOBODY is trying to control for some reason, and yet by my estimate changes in water vapor in the atmosphere have 100 TIMES THE EFFECT when compared to any corresponding changes in CO2.

          Poor bob. Despite all of the silly capitals you are one of the few here who could actually understand this, but you are so blinded by your own mind that you won't.

          Yes, as you have worked out water vapour is a vastly more aggressive factor in climate than CO2. All climate people know this (I am not one but I am adjacent to some). The great majority of the (badly-called) greenhouse effect is due to water vapour. But the Earth has rather large amounts of liquid water on its surface (most of its surface is in fact covered in these enormous pools of liquid water). So the water vapour in the atmosphere is continually condensing into aerosols and then falling as larger droplets (sometimes frozen) to end up in these pools (and also directly condensing into the pools) and evaporating from the same pools into the atmosphere. Because of this vast reservoir of liquid water on the surface the amount of water vapour in the atmosphere is not influenced by for instance human emissions: it is controlled by pressure, temperature and so on. If there was some momentary pulse of water vapour in the atmosphere it would fade in hours or days as it condenses out. In the presence of these enormous reservoirs of liquid water you cannot control the amount of water vapour in the atmosphere.

          CO2 is not like this: mechanisms for scavenging CO2 from the atmosphere take a very long time. If you emit a lot of CO2 into the atmosphere then quite a lot of it will go into the pools of water quite quickly (years to decades), they will then reach approximate equilibrium with the atmosphere and then it will rather slowly get scavenged by plants, weathering and so on.

          And CO2 is a greenhouse (again, shit term) gas so this will raise the temperature a little bit. And now of course there is a nasty which I am sure you know about: the saturation pressure of water depends on the temperature of the atmosphere. So the temperature has risen a little bit and this saturation pressure goes up a little bit: there is now a bit more water vapour in the atmosphere. And oh dear, water vapour is a greenhouse gas, so things get a little warmer and so on. This does not run away (we know this because we are alive), but it does greatly amplify the warming from CO2, most of which is in fact due to the greater partial pressure of water vapour.
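The temperature dependence of saturation pressure in the feedback described above can be sketched with the Magnus approximation (a standard empirical fit; the particular temperatures chosen are arbitrary):

```python
import math

def saturation_vapour_pressure(t_celsius):
    """Magnus approximation for saturation vapour pressure over water, in hPa."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

# One degree of warming raises the amount of water vapour the air can hold
# by roughly 6-7%: this is the amplifying feedback described above.
e15 = saturation_vapour_pressure(15.0)
e16 = saturation_vapour_pressure(16.0)
increase_pct = (e16 / e15 - 1) * 100

print(round(e15, 1))            # ~17.0 hPa at 15 C
print(round(increase_pct, 1))   # ~6.6% more water vapour per degree
```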

          This is all very well understood of course. And if you wanted to you are I am sure competent to understand it. But you won't because shouting is probably more entertaining to you.

      2. jake Silver badge

        Re: I just have to LAUGH at the level of cluelessness here...

        "0.03% not 0.04% - i.e. a 33% increase."

        I think if you look, you will find that that is a 0.01% increase.

        Not quite so scary now, is it.

        1. FelixReg

          Re: I just have to LAUGH at the level of cluelessness here...

          No, @jake, it's 33%. It's also 0.01. Without the percent sign after the 1.

          I goof like this with percents myself. A lot.
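Both readings are arithmetically consistent, as a two-line check shows:

```python
old, new = 0.03, 0.04            # CO2 as a percentage of the atmosphere
absolute = new - old             # 0.01 percentage points (jake's reading)
relative = (new - old) / old     # ~0.33, i.e. a 33% relative increase (Bertieboy's reading)

print(round(absolute, 2), round(relative * 100))
```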

  11. Anonymous Coward
    Anonymous Coward

    Why the push for the Julia language? Some industry must need programmers who can use the language. I wonder who? Let's follow the money: who is funding the language?

    RESULTS:

    Odd: I searched for "site:wikipedia NumFOCUS" and found no page, which is strange (it's a non-profit that pumps money into research tools using numbers; its about->history page mentions J.P. Morgan and Microsoft, Bloomberg, NVIDIA, IBM, Amazon). Makes me believe that people using Fortran may not be financially optimal for those companies.

    If Intel are involved you can be damn sure they want their money back a thousand-fold, by selling more chips, so my guess is that this will be far less efficient than Fortran, which has such small overheads.

    DARPA because Julia is probably useful for modelling bombs.

    NIH (National Institutes of Health): never heard of them before. They're based in Maryland (same state as the NSA; not exactly next-door neighbours, but it does reduce security costs to put all your jewels in one basket), and I'm half tempted to suspect that they might be like the UK Ministry of Defence's Defence Science and Technology Laboratory at Porton Down (people who indirectly work on bioweapons).

    INFO:

    Sponsors From wikipedia:

    NumFOCUS made Julia a fiscally sponsored project in 2014 in an effort to ensure the project's long-term sustainability

    Dr. Jeremy Kepner at MIT Lincoln Laboratory was the founding sponsor of the Julia project in its early days.

    Gordon and Betty Moore Foundation

    Alfred P. Sloan Foundation

    Intel

    and agencies such as NSF, DARPA, NIH, NASA, and FAA

    Mozilla, the maker of Firefox web browser, with its research grants for H1 2019, sponsored "a member of the official Julia team" for the project "Bringing Julia to the Browser"

    1. fg_swe Bronze badge

      As A Matter Of Fact

      GC-languages consume 100% more memory than reference-counted languages* (for heap memory). So it should be a conspiracy of RAM manufacturers :-)

      * Rust, Sappeur

      1. Arthur the cat Silver badge

        Re: As A Matter Of Fact

        GC-languages consume 100% more memory than reference-counted languages

        No, it depends on the garbage collection technique used(*). You're probably thinking of a simple dual-space collector. Even that doesn't use twice as much real memory except in the very worst case, only twice as much virtual memory. Using ephemeral spaces reduces that considerably. From memory, it's possible to reduce GC space overhead to ~10% at the cost of extra GC work and still be practical.

        Oh, and Rust isn't a GCed language. It can be, but it was specifically designed not to require one (or any other runtime environment).

        (*) For full coverage of the topic see The Garbage Collection Handbook: The Art of Automatic Memory Management and its web site.
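The space-overhead comparison reduces to simple arithmetic; a sketch with illustrative (assumed) heap and nursery sizes:

```python
# Worst-case extra address space reserved for copying, for a 1000 MB live heap.
heap_mb = 1000

# Classic semispace (dual-space) collector: the whole heap may be copied,
# so a second space of equal size is reserved: 100% overhead.
semispace_overhead = heap_mb / heap_mb

# Generational collector with a small copying nursery and a compacted-in-place
# old generation: only the nursery needs a copy reserve.
nursery_mb = 100  # assumed nursery size
generational_overhead = nursery_mb / heap_mb

print(f"{semispace_overhead:.0%} vs {generational_overhead:.0%}")  # 100% vs 10%
```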

        1. fg_swe Bronze badge

          Experimental Results

          In the past I have created a moderately complex program which performs statistics on CSV files.

          Program A uses Java Standard Library, does lots of "new String()"

          Program B uses Sappeur strings ("new String_16()"). They are refcounted and will be reclaimed in (pseudo) realtime.

          The findings were:

          F1) If the Java program should be competitive in speed with B, it needs 2x the RAM of B.

          F2) If the Java program is only allowed to use 1.2x the RAM of B, it will become very slow (as compared to B), due to lots of GC runs.

          That test was done on Solaris in 2011 with the then current Java version.

          I should redo the experiment to see whether the Java GC has become better.

          Regarding your argument "it will only create virtual memory load" - that is still serious system load, as the memory must be swapped to disk even if unused. Whether it can actually be swapped out depends highly on the allocation patterns. A single active object per page can thwart the paging to disk.

          1. Anonymous Coward
            Boffin

            Re: Experimental Results

            Regarding your argument "it will only create virtual memory load" - that is still serious system load, as the memory must be swapped to disk even if unused.

            So you believe a VM system must page allocated memory to disk even if that memory has never been touched? I was going to say that the last time that happened was before I was born, but I am not sure it ever happened in fact.

            So many stupids here.
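The point about untouched pages is easy to demonstrate; a Linux-specific sketch (it reads `/proc/self/status`, so it will not run elsewhere) showing that an anonymous mapping consumes essentially no physical memory until its pages are actually written:

```python
import mmap

def rss_kb():
    """Resident set size of this process in KB (Linux only)."""
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])

SIZE = 128 * 1024 * 1024          # 128 MB
before = rss_kb()
m = mmap.mmap(-1, SIZE)           # anonymous mapping: address space only, no pages yet
after_map = rss_kb()

for off in range(0, SIZE, 4096):  # now touch every page by writing one byte
    m[off:off + 1] = b"x"
after_touch = rss_kb()

print(after_map - before)         # near zero: nothing was paged in (or out)
print(after_touch - before)       # ~131072 KB: pages exist only once written
```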

          2. Arthur the cat Silver badge

            Re: Experimental Results

            That test was done on Solaris in 2011 with the then current Java version.

            11 years is a very long time in computing. An awful lot has been done to Java in that time and a single measurement from 2011 isn't really relevant today.

            You might care to read this on GC improvements from JDK 8 to JDK 17 which covers changes from 2014 to 2021. You now have a choice of four different garbage collectors, so you can pick the one best for your work load.

            as the memory must be swapped to disk even if unused

            As someone else remarked, that hasn't happened in decades. Unmodified memory has no need to be written out so isn't. Unused memory probably doesn't even have swap space allocated for it.

            A single active object per page can thwart the paging to disk.

            Which is why many/most GCs compact to reduce VM usage. You really need to read up about the latest GC technology or you'll confuse yourself with long dead misconceptions.
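For reference, picking a collector in current JDKs is a single launch flag; an illustrative (not exhaustive) set, where `app.jar` is a placeholder for your application:

```shell
# Pick a collector explicitly when launching the JVM (JDK 17 ships all four):
java -XX:+UseSerialGC   -jar app.jar   # single-threaded, smallest footprint
java -XX:+UseParallelGC -jar app.jar   # throughput-oriented
java -XX:+UseG1GC       -jar app.jar   # balanced default since JDK 9
java -XX:+UseZGC        -jar app.jar   # low-pause, large heaps
```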

      2. Anonymous Coward
        Alien

        Re: As A Matter Of Fact

        This is false.

        A GC'd language which uses a copying GC and has a single space for allocation will need less than double the space of a reference-counted system, because of course an RC system requires reference counts in its objects.

        But no-one has used a system like that for a very very long time. Instead people use for instance generational GCs which have many spaces. The worst case overhead is now the size of the largest generation. And that is the case only if you use a copying GC for the largest generation: some do not, but compact it in place.

        But still, the old idiocies continue.

    2. This post has been deleted by its author

  12. stiine Silver badge
    Facepalm

    If they have students that can't read FORTRAN, how the F did they graduate from Caltech, much less get accepted into a post-doc program at MIT?

    I spent one evening in college in a computer lab with a physics doctoral student, during which he taught himself UNICOS C because he had been told by his advisor that it was faster than CP-6 FORTRAN. Before the end of the night he was able to replicate, using his newly written program on the Cray Y-MP, the results of his original program that had run for 6+ days on the Honeywell DPS-90* system.

    * - the DPS-90 was a dual system with vector processors

  13. Anonymous Coward
    Boffin

    So much stupidity here

    First of all, Fortran. There is nothing wrong with Fortran. But most climate model code is written by PhD students and postdocs. Most PhD students and postdocs do not go on to become academics, and want things on their CVs which will enable them to get jobs at fashionable high-paying companies. That is not Fortran; it is data science languages like Python, Julia, and so on. If you say 'no, you will write Fortran while you starve like your ancestors did' they will say 'no, we will not'. So you use fashionable data science languages, so you can keep your army of voluntary slaves.

    Second of all, low-level language performance is not what matters. The problem with old code is not that its loops are insufficiently optimised; it is that it was designed for what an HPC system looked like in 1975, which was a Cray-1. It has since been modified, and modified again, to run on later HPCs, many of which were really big SMP machines (often not cache coherent). But modern HPCs are nothing like a Cray-1 or a big SMP machine: they are huge numbers of processors with no shared memory, communicating by message passing, where (either now or soon) the physical geometry of the machine influences latency and bandwidth. So the code gets modified yet again, but there are limits. The climate models I knew about (five years ago) simply could not scale to use all the processors available. Yet a climate model (or weather model) clearly can be made scalable, because it is a classic case of a system that should be: what happens over Norway takes many hours to be affected by what is happening over China, so you do not need long-distance connections at every step. So the architecture of the models must be reworked. That is what this is about.
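The locality argument in the comment above (Norway does not need China's data every timestep) is the basis of domain decomposition with halo exchange. A minimal pure-Python sketch, standing in for MPI ranks with a list of subdomains (the 1D diffusion stencil and sizes are assumptions for the example): each "rank" only exchanges its edge cells with its immediate neighbours, and the decomposed run reproduces the serial run exactly.

```python
# Sketch of why weather/climate stencils can scale: each subdomain
# only needs its neighbours' edge values (the "halo") per step, so
# communication stays local. Lists stand in for MPI ranks here.

def step(local, left_halo, right_halo, alpha=0.1):
    # One explicit diffusion step on a 1D subdomain, using halo
    # values as the out-of-domain neighbours.
    padded = [left_halo] + local + [right_halo]
    return [padded[i] + alpha * (padded[i-1] - 2*padded[i] + padded[i+1])
            for i in range(1, len(padded) - 1)]

def run(grid, nranks, steps):
    size = len(grid) // nranks
    parts = [grid[r*size:(r+1)*size] for r in range(nranks)]
    for _ in range(steps):
        new = []
        for r, local in enumerate(parts):
            # Halo exchange: only one edge cell crosses each boundary.
            left = parts[r-1][-1] if r > 0 else local[0]
            right = parts[r+1][0] if r < nranks - 1 else local[-1]
            new.append(step(local, left, right))
        parts = new
    return [x for p in parts for x in p]

grid = [0.0]*8 + [100.0] + [0.0]*7
print(run(grid, nranks=4, steps=5))
```

Splitting across 4 "ranks" gives bit-identical results to running on 1, because every cell sees the same neighbours either way; only the communication pattern changes, and it is strictly nearest-neighbour.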

  14. usa1

    Like all previous climate models, this one did not accurately predict the climate over the last several decades. Time to scrap that work and come up with a new fake model to scare the public and avoid pesky questions about why climate models are so pathetically unreliable. Oh, I know: because they only look at CO2, and CO2 is NOT the control knob on the climate.

    1. Roland6 Silver badge

      Well, having to totally rewrite it will require all embedded assumptions to be identified and re-evaluated, including the forgotten ones.

      >CO2 is NOT the control knob on the climate

      Maybe, but much of the knowledge about the role of CO2 has only emerged in recent decades, and could reasonably be taken as unknown in the 1970s, when the models were first constructed.

      So you and other climate sceptics should be pleased about this project, and should lobby to ensure it gets all the funding it needs so that it can be completed and be useful...

      1. jake Silver badge

        "when the models were first constructed."

        You seem to be under the impression that said models were constructed, the programmers slapped each other on the back and congratulated each other for a job well done, and we've been using those very same models ever since.

        Nothing could be further from the truth. In reality, the models have been updated/extended/improved near weekly, just like the hardware they run on. Same for Fortran, BTW. The models are far from static.

        Now if only we can convince ThoseInCharge to provide us with raw data that hasn't been dicked around with ...

        Note: I'm not a "climate skeptic"[0]. I'm a scientist. Don't TELL me. SHOW me.

        [0] Why do you lot always use loaded, scary words when referring to people who don't march in lockstep with your belief system? Do you think it somehow makes your points valid? In my mind, it makes you rather sad and pathetic ...

  15. CFtheNonPartisan

    Another celebration of arm-waving in computerdom. I'm not impressed by that, nor by the amount of misinformation from the esteemed academics, or possibly by the reporter who mangled it in the retelling.

    Last time I was involved (a decade ago), the models could get down to grids a few km square, but the compute power needed to run a climate model at that resolution was unattainable. That resolution was reserved for localised weather forecasts, which require only obtainable amounts of compute. The physics of land, ocean, atmosphere, clouds, and a bit of air pollution were included.

    It is about compute power, not language, and Fortran has always been about optimally efficient computing for mathematical models. Nobody needs to read the code except the programmers and scientists, and if they are that incapable?
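The compute-power point above can be put in rough numbers. For an explicit atmospheric model, halving the horizontal grid spacing doubles the cell count in each horizontal direction and roughly halves the stable timestep (the CFL condition), so cost grows roughly as the cube of the refinement factor. A back-of-the-envelope sketch (the specific resolutions below are illustrative assumptions, not the commenter's figures):

```python
# Rough cost scaling for an explicit grid model: refining the
# horizontal grid by a factor r costs ~r (x-cells) * r (y-cells)
# * r (more timesteps via CFL) = r**3. Vertical levels and physics
# packages add further multipliers, ignored here.

def relative_cost(old_dx_km, new_dx_km):
    refinement = old_dx_km / new_dx_km
    return refinement ** 3

print(relative_cost(100, 25))  # 64.0   -- 100 km -> 25 km grid
print(relative_cost(100, 2))   # 125000.0 -- why few-km climate runs were out of reach
```

A 50x refinement in grid spacing means a ~125,000x cost multiplier, which is why a resolution that is routine for a short localised weather forecast was unattainable for a century-scale climate run.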

  16. andrel2r

    Does the new code give the same answers as the old Fortran code?

    Because if the new code predicts worse climate than the old Fortran, MIT will be accused of ideological bias.

    And it will be a reasonable accusation.
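One standard way to answer the question above is a tolerance-based regression check between old and new model output: bitwise equality across languages, compilers, and floating-point orderings is hopeless, so fields are compared within relative and absolute tolerances instead. A minimal sketch (the temperature values and tolerances are made-up assumptions for illustration):

```python
# Sketch of a regression comparison between old (Fortran) and new
# model output. Compare field values within tolerances rather than
# demanding bitwise equality. All values below are made up.

import math

def fields_match(old, new, rel_tol=1e-6, abs_tol=1e-9):
    # Two output fields agree if every pair of values is close.
    if len(old) != len(new):
        return False
    return all(math.isclose(a, b, rel_tol=rel_tol, abs_tol=abs_tol)
               for a, b in zip(old, new))

old_temps = [288.15, 287.92, 289.01]        # hypothetical old output (K)
new_temps = [288.15, 287.9200001, 289.01]   # hypothetical new output (K)

print(fields_match(old_temps, new_temps))                      # True
print(fields_match(old_temps, [288.15, 290.0, 289.01]))        # False
```

Choosing the tolerances is itself a scientific judgement: too tight and legitimate reorderings fail, too loose and a genuine change in the physics slips through, which is exactly the bias concern raised above.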

  17. Sparkus

    As long as the new models are fully open-sourced

    and not kept under wraps as proprietary IP belonging somehow to the researchers or MIT.

    2x on this for the complete data sets.

  18. M.V. Lipvig Silver badge
    Facepalm

    Aren't these the same models

    they've been beating us over the head with for the last 20 years, claiming the planet dies in 50 years if we don't pass a global GDP tax? How can they say the planet is doomed if the models they're using are inaccurate and can't predict anything?

    Global warming advocates just lost what little credibility they might have had. Might as well go back to claiming the next Ice Age will start in 50 years just as they were claiming 50 years ago.

    And, I'll just add that I've lived in the same geographic part of the planet for over 50 years (aside from 6 years in the US military) and the climate has not changed in this area during this time. I'm not going off massaged data, I'm going off living here. It's not as hot as they claim, and if their trend line were accurate, the average summertime high when I was born should have been 50 degrees. It's always been high 90s to low hundreds in the summer here, and anywhere from 0 to 40 in the winter with occasional weeks in the 70s.
