Nostalgia
Remember the good old days of Professional Software Engineering, when we cut and pasted from SO rather than relying on this AI slop?
Stack Overflow, the once-popular dev community, has abandoned a planned redesign that was meant to refocus the site more on discussions than on the question-and-answer format that built its reputation. Philippe Beaudette, VP of community, announced the change in a post last week. "We will be retiring the beta shortly and will be …
Stack Overflow's entire business model was being the place where bad outsourced programmers and students went to beg people to do their work or homework for them, and/or to just cut and paste existing answers together, bang on them till they compiled, and call that done. And on the other hand you had the karma / resume whores who actually knew how to program and who fed them (which I totally get, glass houses!).
Well, 'AI' makes complete incompetents slightly competent. All those completely incompetent coders can now just get Claude or Gemini to write their code for them, far better than they could ever have managed themselves with SO. The only useful thing Stack Overflow could do is turn into a site where actual expert coders ask other actual experts for help with serious problems. But that's too low volume and would shed most of their membership, so they won't do it. Which means they have no reason to exist.
I've been a steady user since it was in beta, and I recently had a perfectly answerable question closed for not containing complete reproduction code, a bar to entry that effectively prevents any non-trivial problem in a more complex system from being a viable topic. If I could reproduce it in a ten-line sample, I would obviously have fixed the problem already.
To me that's the biggest sign that the site is really dying. I was part of another Stack Exchange site where a single super-zealous moderator destroyed it as a viable community in a couple of months by closing every question that didn't match his incredibly specific and limited ideas about what the site should cover. I still wonder whether he ever grasped what he had done, or whether he felt he had accomplished something useful in that role.
Yes, my understanding is that Stack Overflow and Stack Exchange had *already* been in serious decline for some time *before* LLMs were around, for reasons related to moderation and the culture in general, including the ones you gave.
If the controversy surrounding AI-generated answers wasn't already the final nail in the coffin for a site that had alienated the community and users it relied upon, the fact that LLM summaries are now drawing users away from the site itself almost certainly will be.
It's ironic that much of the information those LLMs were trained on will have been obtained from Stack Overflow in the first place, and, since technology doesn't stand still, one wonders where they will get it in the future.
Not just the moderators. At times I've dipped into the site and tried to post a reply to a comment that was clearly wrong, only to be denied for not having enough reputation points or something like that.
Are their moderators like Reddit moderators? https://old.reddit.com/r/whoathatsinteresting/comments/1qg7jfm/reddit_moderators_should_be_treated_like/
Speaking of Reddit, from the article:
"...that it looked more like a general discussion site such as Reddit and..."
Ouch! Reddit, the site that never retired its "old" format, and even that is worse than a generic PHP web forum.
For me, the reason my visits to Stack Overflow have dropped off considerably is that the site now wants me to enable JavaScript and cookies on my mobile device to get past its bot-protection gateway, even though the site itself works perfectly fine without JavaScript.
Maybe you need to suck it up.
I know of at least two sites that have a lot of information available without hiding it behind a login wall, and that are being absolutely crippled by bot access.
The IP addresses are distributed and hard to block, but collectively it's millions of hits, presumably all AI scrapers. So every business with a site like this is having to act.
That response reminded me of the attitude "If it's too slow, you need a faster computer" when someone grumbles about the calculator utility being slow.
There are better, cleaner ways of blocking bots. I've stopped using a number of sites because I can no longer quickly fire them up from the terminal in w3m.
One of my heavy CGI sites (cgit with over half a million commits in the archive) only gets about a million hits a week from distributed bots, but I've still spent some time looking into reliably fingerprinting and blocking them.
ALPN and GREASE are great indicators for this. Apache 2.6 has support for exposing the required fields to CGIs, which was easy to backport to Apache 2.4.
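To illustrate the idea: a minimal sketch of a CGI gatekeeper that uses TLS ClientHello traits to flag likely scrapers. The environment variable names here (`SSL_ALPN_PROTOCOLS`, `SSL_GREASE`) are assumptions for illustration, not stock mod_ssl exports; the comment above describes backporting such support into Apache, so whatever names that patch exposes would be used instead.

```python
#!/usr/bin/env python3
# Hypothetical sketch: classify a request as a likely bot from TLS
# ClientHello traits exposed to the CGI environment. The variable names
# SSL_ALPN_PROTOCOLS and SSL_GREASE are assumed, not standard mod_ssl.
import os

def looks_like_bot(env):
    """Return True when the TLS fingerprint resembles a scripted client.

    Real browsers negotiate ALPN (offering h2 alongside http/1.1) and,
    per RFC 8701, inject GREASE values into the ClientHello. Many scraper
    HTTP stacks do neither, so their absence is a useful signal.
    """
    alpn = env.get("SSL_ALPN_PROTOCOLS", "")   # e.g. "h2,http/1.1"
    grease = env.get("SSL_GREASE", "0")        # "1" if GREASE values seen
    browser_alpn = "h2" in alpn.split(",")
    return not (browser_alpn and grease == "1")

if __name__ == "__main__":
    # Emit a CGI response: refuse clients whose handshake looks scripted.
    if looks_like_bot(os.environ):
        print("Status: 403 Forbidden\r\nContent-Type: text/plain\r\n")
        print("Automated clients are not permitted.")
    else:
        print("Content-Type: text/plain\r\n")
        print("Welcome.")
```

Either signal alone is weak (curl can be told to offer ALPN), but combined with request-rate data it cuts down the obvious scraper traffic without a JavaScript challenge.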
JS front ends are an anti-DDOS hack.
> That tendency has also fed a reputation for being hostile to newcomers, causing a further decline in traffic.
Yup, that was my experience, so I went elsewhere.
Yeah, AI is now faster and broader than SO.
You still need to test and weed out the solutions that don't work, same as with SO. Most often it's API calls that don't exist.
As SO slides into obscurity, where do the AI models get their new training data? Public git repositories?
I quit using SO years ago because of the hostile attitudes I encountered.
Stuff like: "If you don't know that, you shouldn't be here."
Or someone posting a non-working code fragment as the solution to a problem, then putting you down because you couldn't read between the lines and figure out what the poster of the so-called solution actually meant.
In my case, I don't remember it being Stack Overflow they ripped off specifically- I'm pretty sure it was more often other forums. But, yes, there definitely *was* a period ten to fifteen years ago(?) where I'd do a search and get a result matching what I wanted, then upon closer inspection it would become clear they'd just ripped off the thread wholesale from another, better-known developer site (e.g. Microsoft's).