Blimey
Who would have thought it?
A study of how people use ChatGPT for research has confirmed something most of us learned the hard way in school: to be a subject matter expert, you've got to spend time swotting up. More than 10,000 participants took part in a series of experiments designed to determine how people's understanding of a subject differed when …
An almost identical report from the 1990s, or maybe the 2000s, decrying the use of the Internet in general demonstrated that people are less likely to learn and commit facts to memory when "all of human knowledge" is just a few clicks away.
No doubt universities were a bit peeved at the easy access to books created by Caxton and his printing press, because now the students didn't need to remember anything if they could just look it up in a book :-)
Although the research involving the AI is probably more concerning because so many are using the same source and getting the same answer with little to no oversight on how correct the answer is.
I think you are confusing facts and knowledge. Books and internet searches give you facts; LLMs are supposed to be giving you knowledge. Outsourcing your thinking is never a good idea.
Using tools (even LLMs) correctly is the way to go. I remember someone who worked for me (I'd inherited him but not for very long) who if he pressed 2+2 on a calculator would have believed an answer of 5.
I have found this with many things that present you with the answer instead of requiring you to 'learn' the answer.
The immediate thing that comes to mind is how it seems to take me twice as long to learn a route using sat-nav than it did when I had to use maps & planning. Being given the answer instead of having to find the answer doesn't seem to embed the knowledge as quickly.
Or I could just be old...
>” The immediate thing that comes to mind is how it seems to take me twice as long to learn a route using sat-nav than it did when I had to use maps & planning. Or I could just be old...”
I have found these navigators don't give sufficient notice of upcoming turns etc, which can be a right PITA, so ages ago I went back to the old map techniques, but of course paper maps are now mostly a memory.
Using Google/Apple maps I plot a route on a mud map with landmarks and the exits or turns, and also those immediately before and after. Once it's consigned to paper I notice I remember most of the landmarks and street names in sequence.
Pretty obvious, I would have thought, that actual learning and comprehension is necessarily an active process requiring engagement on the part of the student.
This was a problem before AI with students cutting/pasting from web pages or "outsourcing" their assignments. With AI their essays are more likely to be complete codswallop rather than just garnished with bollocks.
"how to plant a vegetable garden" I suppose it never occurred to the participants to toddle off to the local library and borrow a book on the topic. Plenty on offer — even William Cobbett's A Cottage Economy offers advice on the subject, albeit somewhat dated (and opinionated).
Whilst I agree that you don't learn the route so quickly using satnav, you need to try a better/more recent satnav.
On a recent Autoroute trip through France, mine gave me 178km notice of the next turning - enough notice for you?
(note that I always use 2D mode - so just like a map - and often with the sound off, so brain remains engaged)
>” Using Google/Apple maps I plot a route on a mud map with landmarks”
One irritation I have with the online maps (including Waze) is how they insist on putting motorway junction numbers in smaller and thus largely unreadable type, whereas paper maps and paper traffic maps, like the AA road atlas, made these numbers larger so they stood out.
Google streetview is wonderful for this.
You do a virtual drive of your route a couple of times, and then when you do the real journey, you know to turn left after the big red office block, then right just after the church then left at the big pub, etc...
You can identify the navigation landmarks, rather than road numbers and 'travel for XX miles then turn left'.
Not around here... things change often and streetview does not seem to be getting the money it once was, so it's often years old.
I've noticed the last couple times I did this, the landmarks I'd picked were not there.
Also, another irritating thing is Google calls major roads by a different name than what they're signposted as. It's NOT "Jimmy Buffett Memorial Highway", it's State Road A1A and has been for decades, and is signed as such.
>” Also, another irritating thing is Google calls major roads a different name than what they're signposted.”
Local radio does similar: they give junctions names which even as a local I have to think about. When you're elsewhere and relying on the information, they're totally meaningless until you're occasionally right on top of that junction and can see the name on the signs, at which point the "avoid this junction" warning is useless.
-- Google calls major roads a different name than what they're signposted. --
It's not just major roads and it's not just Google (in fact it's not their fault). I found out recently when the Highland council wanted to put traffic calming measures in. Their documentation quoted "Village Road" and a B number. The Royal Mail and us villagers refer to "Main Street". Apparently the OS used the designation Village Road sometime in the century before last and no one in government circles can be arsed to change it.
There may well be a way for a pleb like me to make the change but I'm not sure I'd live long enough to determine the method and implement it.
I live in the Scottish Highland so although I have satnav built into the car tend not to use it as I know both of the roads up here. On those occasions when I've driven down south and used it I mainly want to punch out that woman who keeps saying "take the next turn left" just as I sail past it.
What could possibly go wrong allowing AI to do academic work? Writing nonsense that you might not realise to be nonsense, because you failed to study and understand the work. On the plus side, it should weed out a few howling idiots from the herd. Obviously that does not apply to political studies, where they do not understand empirical evidence, or think it has something to do with royalty.
Learning anything beyond basic reflex action, walking, and talking, is just way too painful and time-consuming, a real time-waster nobody has patience for in this busy-busy world of today ... I mean, hey, we need to keep up with the attention-seeking joneses of the Social Media (S&M) sphere thinkfluencers and continuously up our credentials and likes so we're part of the in-crowd don't we!? After all, ain't attention dissipation the name of this Pavlov's dog game of status-affirmation whackamole?!
Yes it is, and learning anything challenging is for wimps, and gimps, and chimps, in these way-advanced modern days of ChatGPT will just do it for you far better anyways ... if only it could also handle the breathing and breeding parts ... </somebody-please-Heimlich-me-out-of-this-S&M-mindsuit-hype-chokerama!!! ;) help! >
"It found that participants who used ChatGPT and similar tools developed a shallower grasp of the subject they were assigned to study, could provide fewer concrete facts, and tended to echo information similar to other participants who'd used AI tools."
I'm broadly anti-AI and pro-Human, however in fairness I think the last bit re echoing is a bit biased.
I rather suspect that if someone had looked, they would have found that people also 'tended to echo information similar to other participants who'd used traditional non-AI tools'.
In my experience, it's both what and the way you are taught that makes things stick and shapes your understanding. Ever had something explained to you and still felt confused at the end? Ever had something explained to you and then had the light-bulb moment?
One of my lightbulb moments was visiting a site for some knowledge and experience training on a hard-to-configure product. The explanation I received on one area of the thing 'clicked' and made sense of a bunch of other things I'd always been unsure of. I still explain that concept (Ethernet switching priorities, actually) pretty much the same way it was explained to me, adjusted to keep up with the technology/speeds.
bears were seen heading into wooded areas carrying what was believed to be rolls of Andrex
They are the only ones with half a clue and any idea of what they should be doing.
Meanwhile humanity is in the process of drowning itself and everything else in vast oceans of sewage.
I tried vegan "smoked salmon" a while ago and it was one of the wettest and most revolting things I have ever eaten. It's a shame because I don't eat fish (there doesn't seem to be a way to source it ethically) and I miss it.
Meat free stuff in general has come on incredibly well over the last few years, but not-fish is clearly a tough challenge.
AI marking systems are used where students' papers are checked and compared to what the AI can come up with; get too good a match and they will mark you as a cheater. My daughter can barely use a keyboard, she doesn't get on at all with technology, but she knows how to write. Three people in her uni class got accused of cheating and they all had to prove their innocence by sitting in front of a tutor for two hours, writing a document on the spot with an hour's notice. When they fed the essays in, the marking system stated they must have cheated. The tutor wouldn't accept that the system was flawed; they had to call someone else in, and the other tutor gave back the marks the students should have had. In the end my daughter got "snagged" once more and just gave up fighting with lazy tutors. She left the university, came home and studied elsewhere.
The most appropriate comment to make at this point is "no fuckin' shit, Sherlock"
Like technical debt, cognitive debt is oh so easy to kick down the road for convenience, but at some point in the future, that debt needs to be addressed. And if you haven't accrued the necessary cognitive skills to pick over information because all you've ever done is outsource your thinking to a thing that can't think, then you're going to get an extremely bitten arse eventually.