Neural networks give astronomers huge boost in identifying galaxies: 27 million done, 600 million to come

A neural network has helped astronomers catalog a whopping 27 million galaxies collected from one of the largest astronomical surveys probing the mysterious nature of dark energy. The Dark Energy Survey (DES) kicked off in 2013 and aims to snap galaxies and supernovae across an eighth of the Earth's night sky. Although that …

  1. Nick Ryan Silver badge

    They hope that their work can help the wider scientific community figure out how dark energy is accelerating the expansion of the universe
    Controversial and unpopular thought... but maybe, just maybe, relying on something that cannot be seen or detected to fill in the gaps to make models work might mean that there are more fundamental issues with the models themselves?

    I am not a physicist or astronomer, but Occam's Razor? Whenever other models have been found lacking, it's been the model that was at fault, and adding hypothetical things to a model to bludgeon it into mostly working just doesn't feel right.

    1. Mike 137 Silver badge

      adding hypothetical things to a model to bludgeon it into mostly working

      That is indeed one way of looking at the problem. But on the other hand there's no legitimate reason to assume that we've discovered all the forces and fundamental phenomena in the universe. "Dark energy" could quite possibly be a verbal encapsulation of several things we just haven't found out about yet working together.

      1. John Brown (no body) Silver badge

        Re: adding hypothetical things to a model to bludgeon it into mostly working

        I was reading just the other day that there may be some evidence pointing at another hitherto unknown force of nature causing muons to wobble more than expected.

        Some have even said "For example, the observation that the expansion of the Universe was speeding up was attributed to a mysterious phenomenon known as dark energy. But some researchers have previously suggested it could be evidence of a fifth force."

        Maybe someone should ask Leeloo

    2. werdsmith Silver badge

      I think that what you describe is near enough how it's meant to work.

    3. Pascal Monett Silver badge

      I agree with you, but the problem is that any modification of Einstein's theory to "make it work" hits a wall somewhere that makes things worse.

      I've been reading up on this for some time now and, from what I understand, astronomers are not happy about the situation either.

      This is Science, with a very capital S. Very educated people are making an educated guess to help explain why galaxies are not flying apart which, from the data we have, is what they should apparently be doing. We don't know why, and this is the best hypothesis we've got for the moment.

      This lecture from one of the greatest scientific minds in recent history should help: Feynman on The Scientific Method.

      1. Anonymous Coward

        Magic constants everywhere

        Can I point something out? It's a basic logic fail.

        You test property 1 at location 1 using experiment 1 and get result X.

        You test property 2 at location 2 using experiment 2 and get result X again.

        Property 1 and property 2 are the same *type* of property.

        Experiment 1 and experiment 2 are the same *type* of experiment.

        Location 1 and location 2 are different, perhaps different motion, or different location or different surrounds.

        Bingo, you say: X is a magic constant, we've found a universe-wide magic constant; it's the same at location 1 and location 2.

        You then indirectly try to measure property *2* at location 2, while your experiment is at location *1*. And you get odd results. Why?

        It's a simple logic fail: X isn't a magic constant, it's a magic ratio. Any difference at location 1 affects BOTH the experiment and the property you are measuring. A common thing when you try to measure the stuff of the universe by comparing it to the stuff of the universe!

        The obvious case here is light, apparently traveling at c, regardless of the motion or location of the observer.

        You can see the stuff under matter is moving (Schrodinger aside: you measure it, its position changes, so accept that yes, it is indeed moving), and it's the same motion over the same field that light is moving over. So the size of an electron comes from that motion, and so does the distance the light moves per oscillation of the underlying field.

        So the speed of light isn't a constant, it's just a CONSTANT *RATIO*. Our choice of the second (derived from counting oscillations of a caesium atom in an atomic clock) determined that number.

        Beware magic constants, and corresponding magic zeroes, that constant is measured against.

        1. Paul Kinsler Silver badge

          Re: the speed of light isn't a constant,

          The speed of light (in its space-time metric/relativistic sense) is probably best just given the value 1. It's simply the maximum speed allowed by the universe.

          Any other amusing numerical values and units given to the speed of light based on cultural mores, perhaps on archaic unit systems or the like, are simply a matter of local convenience (or deliberate weirdness) on the part of the user. :-)
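          To illustrate the "just give it the value 1" point (a sketch, assuming nothing beyond the SI definition of the metre, which fixes c at exactly 299,792,458 m/s): if you measure time in "metres of light travel" rather than seconds, the numerical value of c collapses to a pure 1.

```python
# The SI metre is defined so that c is exactly 299_792_458 m/s.
c_si = 299_792_458.0  # m/s, exact by definition

# Choose a "geometric" time unit: 1 unit of time = the time light
# takes to travel one metre. A duration of t seconds then becomes
# t * c_si "metres of time".
def seconds_to_metres(t_seconds):
    return t_seconds * c_si

# Speed is distance / time; with time measured in metres, the
# speed of light comes out as the pure number 1.
distance_m = 1.0e6                # light travels a million metres...
time_s = distance_m / c_si        # ...in this many seconds
speed_geometric = distance_m / seconds_to_metres(time_s)
print(speed_geometric)            # effectively 1 (up to float rounding)
```

          The huge number 299,792,458 lives entirely in the unit conversion, not in the physics.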

          1. GrumpenKraut

            Re: the speed of light isn't a constant,

            Afraid you just answered our resident "electric universe" crank.

            1. John Brown (no body) Silver badge

              Re: the speed of light isn't a constant,

              At least he's moved on from clockwork. Eventually, he may catch up to reality, relatively speaking...

            2. Anonymous Coward

              Re: the speed of light isn't a constant,

              I have to confess that he is, in fact ... me. Or to be precise, 'he' is a lightly-used neural network I had to hand which I trained on various earlier cranks in a moment of boredom together with some papers from vixra (but not the green-crayon ones: the handwriting recognition was too hard), and which I now allow to roam freely on certain select corners of the internet. It does OK I think: it's obviously repetitive and incoherent on any scale much larger than a phrase. It's certainly not immediately distinguishable from human cranks, I think. Don't tell anyone though: it will spoil the joke.

              1. GrumpenKraut

                Re: the speed of light isn't a constant,

                "not immediately distinguishable from human cranks"

                Indeed many real cranks actually do worse.

    4. Esme

      Well, yes, that's exactly what they're doing!

      @Nick Ryan

      Methinks you may have misunderstood the situation (my apologies if this is not the case)...

      We have a model of the universe, called the Standard Model. We have observed a couple of things in reality that don't fit that model. In order to talk about them, they have been given names that seem reasonably appropriate given the observable effects they cause, viz "dark energy" (the accelerating expansion of the universe) and "dark matter" (galaxies' spin rates being anomalous if all the matter we can see is all the matter there is). It's possible that these things may actually be very different from the kinds of things they at first glance appear to be behaving like.

      Having spotted these anomalies, further research has been done, and continues to be done, to try to ascertain what is causing these effects. Given that those effects do not exist in the Standard Model as it currently stands, a revised Standard Model will obviously be required once we find out what the heck is causing them. That's just how science works. What you've suggested is exactly what's happening; nothing controversial or unpopular at all, and no bludgeoning involved. It is expected that the Standard Model will have to be overhauled and replaced with a better version, because clearly something is amiss with the current version, despite its successes.

      If there's a better way to tackle problems, a gazillion scientists would love to hear about it!

      I am an amateur astrophysicist, not a professional one, though I know people who are. I can say with certainty that there are a LOT of hypotheses as to what's causing the anomalous effects observed, and they can't all be right! Time will tell which one is right or whether something entirely new is necessary to explain 'em. That's what's so exciting about science!

    5. Anonymous Coward

      Controversial and unpopular thought... but maybe, just maybe, relying on something that cannot be seen or detected to fill in the gaps to make models work might mean that there are more fundamental issues with the models themselves?

      From the perspective of General Relativity, 'dark energy' is not any kind of defect in the model at all.

      Any mathematical model for physics will have a number of free parameters which need to be determined experimentally. GR, astonishingly, really has only one such parameter (the 'standard model' of particle physics has 19). Well, as it's normally written it has more than this: it has three:

      R_ab − (R/2) g_ab + Λ g_ab = (8πG/c^4) T_ab

      The constants here are G, Newton's gravitational constant, c, the speed of light, and this Λ (uppercase lambda) thing. R, g and T are not free parameters: R and g measure the curvature of spacetime and T measures the energy-momentum that curvature corresponds to: those quantities are the things the theory relates.

      But two of those parameters are in fact not really parameters: they're there simply because people have chosen bad units. Because we didn't know that time should be measured in metres (or distance in seconds, it doesn't matter), we chose the unit of time – the second – to be an absurdly large number of metres, and so if we want to work in units where time is in seconds and distance in metres we end up with this huge fudge factor, c. Similarly, G is another fudge factor (this time a very small number in SI units), which is really there because we didn't realise that mass should be measured in units of length, and that the kilogram was an absurdly tiny unit of length (the mass of the Sun is about 1.5 km, for instance). So by fixing the units we use to be sensible (sensible by the standard of someone interested in the underlying geometry, not sensible in any engineering sense!), both of those fudge factors go away, and we end up with just one:

      R_ab − (R/2) g_ab + Λ g_ab = 8π T_ab
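      The "mass measured as a length" claim above can be checked in a couple of lines: in geometric units a mass M corresponds to the length GM/c² (a sketch using approximate CODATA values, not exact ones).

```python
# Geometric units: a mass M corresponds to the length  G * M / c**2.
G = 6.674e-11          # m^3 kg^-1 s^-2, Newton's constant (approx.)
c = 299_792_458.0      # m/s, exact by definition of the metre
M_sun = 1.989e30       # kg, solar mass (approx.)

M_sun_in_metres = G * M_sun / c**2
print(M_sun_in_metres)  # roughly 1.5e3 m, i.e. about 1.5 km
```

      In the same units the kilogram is a length of around 7 × 10⁻²⁸ m, which is why G looks so tiny in SI.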

      So all we have left is this mysterious Λ thing. And you really can't make Λ go away: it's a genuine free parameter of the theory. Mathematically, it's really a constant of integration, so it's a thing which actually needs to be measured. Its name is the cosmological constant.

      Well, to cut a long story short, for a very long time people just blindly assumed that Λ must be zero. Einstein originally introduced it so that GR could support a static model of the universe, which people then thought it was; once it became clear the universe isn't static, the term no longer seemed needed, so everyone just said it should be zero.

      But there was never a good reason for that assumption: it's a free parameter of the theory and it therefore needs to be experimentally determined. And, well, we have now been able to begin measuring it, and lo and behold it's not zero: it seems to be small and positive. What this means is that empty space essentially has some tiny amount of positive energy, and that means that, on very large scales, things get pushed apart from each other. And so we invent this rather silly name for it, 'dark energy', as if it is some mysterious thing in the same way dark matter really is a mysterious secret thing. But it's not: it's just a free parameter in GR which turns out not to be zero.

      (Particle physicists will differ about this and claim that Λ must arise from QFT, but I'll take the GR answer that it's a free parameter over the QFT calculations of what it should be, which are out by a factor of about 10^120, thanks. Also, 19 free parameters, srsly, what kind of cheap shit theory is that?)

      1. GrumpenKraut

        I am *seriously* impressed.

        ...and that last sentence actually made me laugh. ----->

        1. Anonymous Coward

          Well at least the standard model works really pretty well (and by 'pretty well' I mean 'lots of decimal places well'). String theory now ... don't get me started on string theory. Just ... no.

    6. DS999 Silver badge

      Dark energy and dark matter

      are stand-ins for "physics as we understand it, and can demonstrate is correct in almost all situations, fails in certain scenarios". The problems would be explained by the existence of dark matter and dark energy with certain properties.

      There's no reason to believe that we know all the particles that exist. Dark matter that interacts with other matter only through gravity (except in possibly very narrow circumstances), and in particular does not interact with PHOTONS, would by its nature be the last thing we'd detect.

      Dark energy that only affects the expansion of the universe, on distance and time scales that utterly dwarf anything in our experience on Earth, would likewise be among the last things we'd detect when figuring out what forces affect the movement of particles and objects.

      I don't know why people have such a problem with this. If I presented you a maths problem with an unknown variable 'x' you wouldn't bat an eye. Physicists put unknown matter and energy 'x' and 'y' into their models, and the people who whine about Occam's razor and claim that's not science just don't understand how science works. Saying "we should stubbornly ignore all evidence of flaws in our theories and carry on shoehorning things into them until everything is forced to fit" is what creationists do, not scientists.

      You can't begin the process of solving for 'x' if you refuse to accept the existence of 'x' in your equation.

  2. Little Mouse Silver badge

    Thank you

    It's refreshing to read an article involving Neural Networks that doesn't go all "Machine Learning" and "Artificial Intelligence" on us.

    1. HildyJ Silver badge
      Thumb Up

      Re: Thank you

      It's also refreshing to see neural networks used in a way that they're good at, rather than things like face recognition.

      1. eldakka Silver badge

        Re: Thank you

        > It's also refreshing to see neural networks used in a way that they're good at, rather than things like face recognition.

        I would think that logically the underlying pattern matching algorithms are the same.

        The difference is that a 3% error rate (97% success) is acceptable when identifying galaxies, but not when the output is used to throw humans into jail, or even to put them in the frame as suspects. Not to mention that any biases introduced by the training set when classifying galaxies have no real human consequences: they could screw up some papers and cosmological theories with no everyday human impact (beyond being completely fascinating!). A biased training dataset for facial recognition is the complete opposite of that: it can literally kill people and destroy lives.
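        To put the scale of that in context (a back-of-the-envelope sketch, assuming purely for illustration that a uniform 3% misclassification rate applied across the 27 million galaxies in the catalogue):

```python
catalogue_size = 27_000_000   # galaxies classified so far (from the article)
error_rate = 0.03             # illustrative 3% misclassification rate

misclassified = round(catalogue_size * error_rate)
print(misclassified)  # 810000 -- statistically tolerable for cosmology,
                      # unthinkable if each error were a person
```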

    2. Anonymous Coward

      Re: Thank you

      Well, this really is machine learning, isn't it? You give it a training set of classified galaxy images and it learns to classify other galaxies. Conflating this 'I have a vast data set to train an NN on' with intelligence, as is now fashionable, is kind of a stretch though, I agree. But then conflating some of the ideas in ye olde symbolic AI with intelligence was as well: AI has been driven by absurd hype since its prehistory.
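      A toy version of the supervised loop described above might look like this. Everything here is invented for illustration: the two "features" are placeholders, not the survey's actual inputs, and a simple logistic-regression classifier stands in for the far larger neural network the astronomers actually used.

```python
import math
import random

random.seed(0)

# Synthetic training data: two made-up features per object.
# Label 1 = "galaxy", label 0 = "star". Real pipelines use image
# pixels or measured morphology; these clusters are pure invention.
def make_objects(n, centre, label):
    return [((random.gauss(centre[0], 0.6), random.gauss(centre[1], 0.6)), label)
            for _ in range(n)]

data = make_objects(200, (2.0, 2.0), 1) + make_objects(200, (0.0, 0.0), 0)
random.shuffle(data)
train, test = data[:300], data[300:]

# Minimal logistic-regression classifier trained by gradient descent.
w = [0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(300):
    for (x1, x2), y in train:
        p = 1.0 / (1.0 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))
        err = p - y                # gradient of the log loss
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def classify(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

accuracy = sum(classify(*x) == y for x, y in test) / len(test)
print(f"held-out accuracy: {accuracy:.2f}")  # well above chance on this toy data
```

      The point is just the shape of the process: labelled examples in, a decision rule out, evaluated on data the model never saw. Nothing in it is "intelligent" in any everyday sense.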

