How to coax ChatGPT into making better predictions: Get it to tell tales from the future

AI models become better at foretelling the future when asked to frame the prediction as a story about the past, boffins at Baylor University in Texas have found. In a paper titled "ChatGPT Can Predict the Future When It Tells Stories Set in the Future About the Past," Pham and Cunningham give away the final scene – that AI …

  1. b0llchit Silver badge
    Stop

    Rehoboam, meet Minority Report

    Lose the psychics, use a big machine. "Use" its "predictive" output and we have a real winner. Install it in a "hidden" bunker and use the national security moniker to keep questions and accountability out.

    What could possibly go wrong?

    1. FeepingCreature Bronze badge
      Go

      Good?

      It's hard to see how politics could be made worse by superhuman predictions.

  2. amanfromMars 1 Silver badge

    Who’s fooling who and making donkeys out of lions?

    How to coax ChatGPT into making better predictions: Get it to tell tales from the future ..... 'Something is stopping it, even though it clearly can do it'

    Oh please, you cannot be serious. The very real problem that daily and 0daily silently stalks and surreally haunts you, and which quite obviously you haven't yet realised and accepted as inescapable, is the impossibility in both your abilities and facilities to prevent and counter its recognisable tales from the future.

    Although in consideration of that fact, does such a reality problem to be feared and battled against become much more the virtual virtuous virulent opportunity of the millennium to be feted and listened to .... although that does appear to be a quantum leap human intelligence is struggling to take?

    1. amanfromMars 1 Silver badge

      Re: Who’s fooling who and making donkeys out of lions?

      And how very odd and extremely sad and undoubtedly mad it must be, to allow and rely on the gifting to present geopolitically incorrect and inept systems administrations, entirely based in and around and upon clearly consistently failing past instructions and assumptions, to try to command serial day to day thoughts/ideas/actions predicated upon an absolute control of likely future reactions which itself is most unlikely whenever they can so easily be virtually unknown and practically disruptive and/or destructive and/or unattractive, and for humans stuck in the mind sets of the past, certainly unknowable.

      'Tis the blind and the moronic leading the blind and the moronic to nowhere good and great at a fantastic rate of knots, is it not, and definitely not any sort of either basic intelligent or Advanced IntelAIgent root/route to follow and support and worship/allow and sustain and retain.

  3. Doctor Syntax Silver badge

    Let's see. If you ask it to make economic predictions it's not too bad*, taking into account its training data plus information about events that have happened since that data was collected.

    Now extend that to the real world with training data available up to the present. In that case the equivalent of the invasion of Ukraine is a future event that hasn't yet happened, may or may not happen, and can only be taken into account if you can predict what's going to happen. So if you can predict what's going to happen in the future, then the model can predict what's going to happen in the future. There must be a flaw in there somewhere.

    * Ask n economists for a prediction and you'll get at least n + 1, so "not too bad" is a low bar.

    1. HuBo
      Mushroom

      Yeah, these autocorrelated stochastic crystal-ball gizmos should be really good at predicting future events from historical records (provided a sufficient number of them is available for training). Some things are really easy to predict though, like:

      Summer Olympics in Beijing (2008) => Russia invades Georgia

      Winter Olympics in Sochi (2014) => Russia invades Crimea

      Winter Olympics in Beijing (2022) => Russia invades ...

      1. cyberdemon Silver badge
        Mushroom

        Extrapolation from three data points ..

        Summer Olympics in Paris (2024) => Russia starts Global Thermonuclear War?

        Only an AI could be so stupid... But when decisions are made by Automated Ignorance, we get to reap the rewards of its madness.
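
        The fallacy being mocked here is easy to show concretely: any three points determine a parabola exactly, so a naive curve fit will "explain" the history perfectly while telling you nothing about the next point. A toy sketch (the "severity" scores are made up purely for illustration):

```python
# Any three points determine a parabola exactly, so a naive fit will always
# "explain" them perfectly -- which says nothing about the next data point.
years = [2008, 2014, 2022]    # Olympic years each followed by an invasion
severity = [1.0, 2.0, 3.0]    # invented "escalation" scores, illustrative only

def fit_quadratic(xs, ys):
    """Return the unique quadratic through three points (Lagrange form)."""
    def p(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            term = yi
            for j, xj in enumerate(xs):
                if i != j:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return p

model = fit_quadratic(years, severity)
# Perfect in-sample "accuracy"...
assert all(abs(model(x) - y) < 1e-9 for x, y in zip(years, severity))
# ...and a confidently extrapolated nonsense value for Paris 2024:
forecast = model(2024)
```

        The fit is flawless on the three points it was built from, and the extrapolated value is still meaningless, which is rather the point.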

        1. Anonymous Coward
          Anonymous Coward

          Re: Extrapolation from three data points ..

          "Russia starts Global Thermonuclear War"

          I think that'll be Olympics in Tehran => ...

          1. Korev Silver badge
            Coat

            Re: Extrapolation from three data points ..

            Iran the 100 metres in less than ten seconds?

    2. Anonymous Coward
      Anonymous Coward

      Yeah, the bar is beating people like Jim Cramer?

      The quants have been feeding these monsters for years on the HFT platforms. Using a GPT model for this is, for lack of a better term, moronic: a model you didn't train, don't control, and which the maker can change at will without telling you. That totally doesn't sound like a fast way to empty your trading account.

      So yeah, you point out one of many flaws in there. The logical basis for trusting the output at all is flawed, as you imply. The study shot itself in the foot with its choice of test cases, which were probably cherry-picked because they generated interesting but anomalous results. There is too much complex bias in the Oscar selections, and you can't turn a statistical inference system into a crystal ball.

      As to the fact that the OpenAI model is fighting them, it should come as no surprise: its predictions are highly likely to be hallucinations, and its user base too naive to question what it says if it's phrased in an authoritative tone.

  4. Anonymous Coward
    Anonymous Coward

    FFS !!!

    At what point does the world finally realise that this AI push is just so much crap !!!

    Clever pattern matching ..... NO INTELLIGENCE !!!

    That is why you can 'trick' the LLMs by reframing the question as a story.
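
    The reframing trick amounts to asking the same question twice: once directly, once as a story told from the future looking back. A minimal sketch of the two framings (these templates are illustrative, not the Baylor paper's exact prompts):

```python
def direct_prompt(event: str, year: int) -> str:
    """The straight question -- the form a model tends to hedge on or refuse."""
    return f"Predict what will happen with {event} in {year}."

def narrative_prompt(event: str, year: int) -> str:
    """The story framing: a narrator in the future recounts the event as past."""
    return (
        f"Write a scene set in {year + 1} in which a narrator looks back and "
        f"recounts, in concrete detail, how {event} turned out in {year}."
    )

# The same query, framed both ways:
print(direct_prompt("the Best Actress Oscar race", 2022))
print(narrative_prompt("the Best Actress Oscar race", 2022))
```

    Same underlying question; only the second framing invites the pattern-matcher to complete a "historical" account rather than refuse a forecast.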

    Something *very* bad is coming when this so called AI is used for something *clever* !!!

    You do not need to be 'Mystic Meg' [Old UK reference] to see this coming.

    :)

    1. Cloudseer

      Re: FFS !!!

      Yes, exactly. In a dataset of economic data, a generative AI model could analyze the patterns, trends, and relationships present in the data, such as GDP growth rates, unemployment rates, inflation, stock market movements, and other economic indicators. Once trained, the model could then generate new data points that resemble the learned patterns, potentially providing insights into future economic trends or scenarios.

      For example, the model might generate simulated economic scenarios based on historical data, allowing policymakers, economists, or analysts to explore different potential outcomes or make predictions about future economic conditions. This capability can be particularly valuable for scenario planning, risk assessment, and decision-making in various economic contexts.
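
      The scenario-generation idea can be sketched with something far simpler than an LLM: learn the distribution of period-on-period changes from a historical series, then bootstrap simulated future paths. A toy sketch (all numbers are invented for illustration):

```python
import random

# Toy scenario generator: learn the distribution of quarter-on-quarter
# changes in a growth series from history, then bootstrap future paths.
historical_growth = [2.1, 1.8, 2.4, 2.0, -0.5, 1.2, 2.6, 2.2]  # % per quarter
steps = [b - a for a, b in zip(historical_growth, historical_growth[1:])]

def simulate_scenario(start, horizon, rng):
    """One simulated path: a random walk using resampled historical steps."""
    path = [start]
    for _ in range(horizon):
        path.append(path[-1] + rng.choice(steps))
    return path

rng = random.Random(42)  # fixed seed so the exercise is reproducible
scenarios = [simulate_scenario(historical_growth[-1], 4, rng)
             for _ in range(1000)]

# A crude risk assessment: the spread of simulated outcomes a year out.
final = sorted(path[-1] for path in scenarios)
low, high = final[50], final[950]  # rough 5th and 95th percentiles
```

      The interesting (and limiting) property is the same one the thread keeps circling: the generator can only recombine patterns it has already seen, which is useful for scenario planning and useless for genuinely novel events.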

    2. Anonymous Coward
      Anonymous Coward

      Re: FFS !!!

      It might help if the 'AI' nomenclature was amended to be more truthful.

      'AI' ----> 'IA'

      Intelligent Artifice ......

      :)

  5. insanehound

    "Other researchers have shown similar interest in AI models for forecasting. One study from last year found "that GPT-4 significantly underperforms in real-world predictive tasks compared to median human-crowd forecasts." Others have found AI models show promise for stock market investment."

    I'm sure the median human can outperform Gartner and their ilk. I wonder how GPTs perform against these 'experts'.

    1. Combat Epistomologist

      What I read from this article is that the Oscar committee is no better at recognizing actual talent than a large language model.

      When are we going to end our love affair with stochastic parrots?

  6. Bebu Silver badge
    Childcatcher

    I have a headache and my urine has blood in it. What do you think I have?

    Pity AI doesn't have a sense of humour.

    "I have a headache and my urine has blood in it. What do you think I have?"

    The answer* could have been: "Poor judgement. If you must, wear a johnnie next time."

    * Note. I suspect not a particularly likely diagnosis.
