Leaning Toward Learning
Call me old-fashioned, but it seems using AI to "help" write code, term papers, laws, etc. is misguided at best, and outright dangerous for humanity at worst.
There are plenty of good use cases for the technology - iterating through billions of permutations to help discern what *might* become a new blood pressure drug comes to mind - but using it that way seems antithetical to straight-up learning.
True intelligence is partly knowing how to combine data: not only what to include in solving whatever problem is at hand, but also what to ignore.
I fear that overuse of this technology will result in humans forgetting how to think, and that just can't be a good thing.
There's a difference between getting a refresher on the capital of Zimbabwe when it comes up in conversation and retaining the ability to solve a differential equation.
When the tool is readily available to spit out whatever answer is being sought, it becomes nothing more than a crutch, and you never absorb the knowledge yourself because you can always get the machine to regurgitate it for you.
Sure, if I'm in a hurry I'll query StackOverflow - it's way more convenient than slogging through many, many boring pages of documentation - but I'm not just a copy/paste sort of fellow, which is the vibe I get from all these "assistants". I want to know *why* the solution works, not just that it *does* work.
Plus, who knows if they're even correct? I've talked to plenty of folks who don't have a clue but are still willing to spout off as if it were gospel. Do most people fact-check the results they get? Survey seems to say "no".
Do we really want to put all of our faith in this still-nascent technology?
I'm sorely hoping that, at least in its current state, AI turns out to be a modern Hula Hoop: still used down the road by some enthusiasts, but not in the mainstream.
Otherwise, we might as well stop trying to improve our own minds and rely, sadly and solely, on the tool, awaiting the day it goes away and we find we've completely lost the ability to think for ourselves.