* Posts by putaro

3 publicly visible posts • joined 28 Feb 2011

Have we stopped to think about what LLMs actually model?

putaro

I have a degree in Cognitive Science (BA UCSD '91) which, while very dated, still gives me some insights. One of the things we used to talk about was the pieces of the brain - at the time there was little understanding of how those pieces worked, but we did have ideas of what different pieces of the brain did, largely based on people who had had some form of brain injury. One of the things we learned about was an area called the "grammar box", which was responsible for making sure that sentences came out correctly formed. People who had had an injury to their grammar box were no longer able to form grammatical sentences, but, with some difficulty, could piece together meaningful sentences. On the other hand, people who had had an injury to another piece of the brain (I forget what it's called now) produced sentences that were grammatically correct but no longer made sense.

LLMs remind me of the grammar box. Given a prompt, they can generate what is, statistically speaking, a reasonable output. However, LLMs lack the ability to create those prompts themselves.

I met a man recently with aphasia. He had a stroke that affected his mobility and his ability to form full sentences. However, his desire to communicate and his ability to form the "prompts", if you will, that he wants to communicate remain. He communicates in sentence fragments and single words along with some hand gestures, but his meaning, while it might take a bit of effort to decipher, is clear. He is missing the grammar box, or the predictive output section of his brain, the LLM that knows how to string words together.

We still have a way to go before we create AIs that actually think. LLMs seem like they're thinking, but they're just stringing words together based on the prompt and the rest of the corpus they've absorbed. It's an impressive trick and very useful, but it's not thinking yet.

It's 50 years to the day since Apollo 10 blasted off: America's lunar landing 'dress rehearsal'

putaro

That's a very incorrect characterization of the Saturn V. The "tanks", as you call them, were difficult engineering. The second stage had some wacky stuff with the common bulkhead between the liquid hydrogen and liquid oxygen sections. Flight control surfaces, flight control computing, etc. were all quite necessary. It wasn't a model rocket!

Apple vanishes Java from Mac OS X Lion

putaro
WTF?

You downloaded the version you needed for Mac OS X?

How, pray tell, did you download the latest version of Java for Mac OS X from Sun/Oracle? The only source for Java on the Mac was (and remains until the OpenJDK port is complete) Apple.

You're talking through your hat.