
I've been observing the changes in Stack Overflow and in generative AIs like ChatGPT and Bard.

I think that when it comes to things with different "versions", generative AIs might have difficulty adapting to newer versions. For programming questions about relatively new versions, it would be best to ask on Stack Overflow.

Real experience

I was trying to figure out some things about Laravel Livewire, so I tried asking questions of generative AIs too.

The AI responded with solutions for Laravel Livewire v2.* even though, at that time, the current version of Laravel Livewire was already v3.*.

Stack Overflow is way better when it comes to newer versions.

To be honest, the AI's response was blazingly fast and really did work for the older version, though.

My suggestion is that Stack Overflow should differentiate itself from what generative AIs can do, and maybe SO should focus more on new directions and features that can greatly help many users.

How about you? What are still the best use cases for Stack Overflow in today's generative AI era? For me, Stack Overflow is still best for questions about new versions of things (programming languages, packages/libraries).

I'm encouraging everyone to share their actual experiences these days; that's what this post is all about.

Don't get me wrong: this is not about AI. This post is intended to help Stack Overflow prosper in this ever-changing landscape of online Q&A. We have to face it as concerned users.

  • 12
    Please research before considering asking a question in a post or comment. Hover your mouse over the voting arrows: on meta sites, up/down votes also reflect agreement/disagreement with a proposal, view or premise. I expect people will find this to reflect no research effort, not be useful or novel, and to come from a(n unexpressed) naive and uninformed view of LLMs and of LLMs vs SO/SE Q&A. Commented Jan 18, 2024 at 5:38
  • 3
    I don't understand Stack Overflow anymore. It says here, under "What is meta?": "Unlike normal Stack Exchange sites, Meta invites the community to discuss, debate and propose changes to the way the community itself behaves, as well as how the software itself works. On posts tagged feature-request, voting indicates agreement or disagreement with the proposed change rather than just the quality or usefulness of the post itself." And the [discussion] tag: "Questions that may not necessarily have a clear-cut right or wrong answer and are often subjective." Commented Jan 18, 2024 at 6:45
  • 2
    I was just trying to open up a topic that could point toward proposed changes. I care about Stack Overflow, but I think a community with this kind of hostile environment has no room in the future. Commented Jan 18, 2024 at 6:49
  • 5
    This feels more appropriate for a collective Discussion on the main site (if there was an AI / LLM collective that is). It's not on-topic for Meta because it's not about Stack Overflow Commented Jan 18, 2024 at 6:56
  • 11
    Your comments don't address the issues in my comment & I don't know what you think is hostile in what I wrote or wrote about or in the feedback people have given. Commented Jan 18, 2024 at 7:25
  • 20
    It's about "AI". Zero chance of being received well; people are tired of the subject at this point. In any case, I don't use either. I'm certainly not going to converse with some stupid chatbot; I am perfectly capable of working problems out on my own using my experience, skill and knowledge. Experience that young developers are going to have trouble building up when they outsource everything and have little reason to retain knowledge. I don't "use" Stack Overflow; I use Google. Stack Overflow happens to be in the top 5 search results usually; that's the purpose of the thing. Not to ask questions. Commented Jan 18, 2024 at 10:06
  • 14
    “I care about StackOverflow, but I think a community like this kind of hostile environment has no room for the future.” - OK? We're hostile because we don't think a question as broad as yours is helpful, nor would a discussion about LLMs be helpful. Your question isn't even about Stack Overflow; your question is about the uses for Stack Overflow? A decade-old profile, hundreds of contributions, thousands of reputation, and it doesn't appear you understand the rules of the community. Commented Jan 18, 2024 at 13:50
  • 3
    "I refer as a hostile environment of SO is the way people" What's rude & hostile is to ignorantly presume. Commented Jan 19, 2024 at 7:47
  • 3
    Yet another "SO must follow every current trend, because I say so" thread. The fact that AI exists doesn't need to be important for SO. When people want to use AI, it's their decision and doesn't concern anyone else. Commented Jan 19, 2024 at 16:57
  • 9
    You're beating a dead horse. Preventing an unmanageable tidal wave of easy-to-generate and hard-to-curate garbage has nothing to do with "repelling innovation" (and blaming us for that is hardly original anymore). People who actively curate the site are painfully aware of how the current model cannot scale even without those BS generators. Although I see how someone who has managed to cast a total of 9 downvotes in 12 years might not see that issue. Commented Jan 19, 2024 at 16:59
  • 7
    Garbage torrent or no, if a question can be answered with AI, does it really need to be here? At this point, AIs are just cloning and aggregating existing stuff. If one clones Stack Overflow questions, then the question must be a duplicate. If it has to aggregate answers, the question needs focus. In the future we'll have language models that actually model programming languages, and I expect that will ramp up the quality of generated code, but we'll still need to work on correctly implementing behaviour. Commented Jan 19, 2024 at 20:09
  • 2
    And once we have that down, the nature of a career in programming will be fundamentally different Commented Jan 19, 2024 at 20:09
  • 3
    Is AI going anywhere? Yes. That's my biggest problem. We're in a transitional period. AI sucks right now. It's more impressive than the masses expected, but it's not the transformative tool that it will be. Right now we're just seeing glimmers of what AI will be in another few years. Get in, but be damn sure you can migrate before the next wave of disruption hits, be that new technologies or laws regulating its use. Commented Jan 20, 2024 at 0:16
  • 2
    @AbelCallejo Survival of the fittest with an AI that can't even handle the simplest logical issues? Actually funny. If it can handle all your simple tasks, fine, have fun with it, but SO doesn't need to dumb down just to "ride the waters". Commented Jan 20, 2024 at 4:22
  • 2
    @AbelCallejo "Survival of the fittest?" SO has absolutely no chance of being the fittest w.r.t. AI. Null, zero, nada. Putting chips on AI, or even worse on AI strengths without AI (the magical more responsive, more tailored, faster, yada yada answer style that crops up in these discussions), is a sure way to die a fast and pointless death. The core concept of SO - curated, general Q&A - is, to the contrary, something it actually can compete in. But if that isn't needed anymore and it dies a slow death - why should users be concerned about that? Commented Jan 20, 2024 at 6:18

2 Answers


I think you are conflating two concepts. Getting help from SO does not always mean asking a new question on SO. In fact, in most cases it doesn't, because after so many years of development SO has built up a fairly comprehensive knowledge base in which most simple questions are already answered. I would say that most of the time I need help with programming, SO is the site that brings me the final answer, usually without me asking a new question. Let's say SO has helped me with 1000+ concrete programming questions without me ever asking one; you are not going to see those 1000+ anywhere1 in the raw statistics. You're asking for personal experience, and this is my personal experience.

How much research effort is expected of Stack Overflow users? A lot. For me, using a generative AI is just part of that research. I typically treat a generative AI as another form of glorified search engine, with the caveats kept in mind.

  • Results from a traditional search engine land me on a page that contains the necessary context to evaluate the quality of the content, which is missing in the case of genAI. Compare the following scenarios:
    1. Scenario 1: A traditional search engine lands me on a blog post. I can then go and check another post by the same blogger. If I find that the other post shows a fundamental misunderstanding (for example, claiming that an array is a const pointer in C), I will put less trust in the blogger and maybe move on to the next search result.
    2. Scenario 2: A traditional search engine lands me on an SO answer. The vote system gives me a score that provides a general idea of whether it is trustworthy. If I discover something in the answer I find ridiculous, but it turns out the answer is from someone with a gold tag badge, I am inclined to believe that it is me who has the fundamental misunderstanding, and I double-check and learn. In fact, this is exactly how I got rid of the "an array is a const pointer in C" misunderstanding that I had acquired elsewhere.
    3. Scenario 3: Traditional search engines give me nothing that immediately answers my question, and I decide to give genAIs a shot. Now I have absolutely no external measures to evaluate the quality of the response. Examining other responses by the same genAI tells me nothing about the quality of this response. Nor is there anything like "gold tag badges" to ensure the quality of the response is up to par. I need to vet the response carefully myself.
  • GenAIs are very reluctant to disobey or even just disagree with the user. Even though they try hard to appear as unbiased as possible, in most cases this just ends up in distorted responses. This is one thing I particularly dislike about genAIs. Even if you ask about something that is generally (though not yet unanimously) considered bad practice, genAIs will list 3 pros and 3 cons and end the response with the signature phrase "Ultimately, ... depends on...". This kind of response is not entirely garbage and can actually help if you apply enough critical thinking, but it is somewhat misleading. On SO you'll see people point out various defects of your code in comments. Admittedly, the comments are sometimes blunt or even borderline rude, but such feedback with strong disagreement is necessary if you are honestly seeking improvement. GenAIs spoil every asker in a way that impedes them from learning from the mistakes they unconsciously make. I digress a bit here, but our sister site, Code Review, can never be replaced by genAIs in the slightest. Don't even get me started on XY problems. GenAIs will never generate responses like "Why do you do X? It seems strange to me; what is the Y you're seeking? There may be better approaches." This is exactly due to genAIs' inability to disagree with the asker.

In a sense, the attitude taken by genAIs is preposterous. Typically, a genAI assumes the user is almost always correct, and at the same time it assumes it knows everything, as evidenced by its choice to make up plausible responses instead of just admitting "I don't know". However, "the user is always right" and "I know the answer to everything" together form a logical contradiction, and when that contradiction manifests itself in practice you see all kinds of drama in genAI responses.

With all that said, it doesn't mean genAIs are completely useless; many times a genAI has brought my attention to blind spots in my initial evaluation of a problem. I didn't have the slightest idea that a certain keyword was what I should look for, or that a convenient library function existed; after asking a genAI, I do, and I can investigate deeper. The take-away message is that using a genAI is just another part of my research before I would actually ask a question on SO. GenAIs offer extra tooling, but in my mind they are just another form of glorified search engine, with novel caveats that traditional search engines don't have.


1 Not really; the number of my saves and upvotes can actually kind of tell that.

  • Re-reading this answer in 2026, I realize that some of the descriptions of genAI behaviors are now not 100% accurate, and I did use genAI more often in the last few years, but the fundamental reasoning remains unchanged, and I expect it to remain so until perhaps neurosymbolic genAI becomes mature, which may then be a real game-changer in how I use genAI vs Stack Overflow. Commented Jan 1 at 9:55

My suggestion is that Stack Overflow should differentiate itself from what generative AIs can do, and maybe SO should focus more on new directions and features that can greatly help many users.

Stack Overflow doesn't control what users ask here: we can only answer what people feel like bringing up here if they can do so in a way that is constructive and useful.

Unfortunately... this trend of just dumping coding questions into GPT and running with the result is going to make it far less likely that new questions asked here will be constructive and useful. It trains users to ask questions in a way that gets results from an LLM, but doesn't provide what a real developer would need to reach useful outcomes without wasting valuable time.

Asking a group of users on the internet who don't have access to your codebase how to solve a given problem is an acquired skill: it requires many of the same skills that you'd need to solve the problem on your own.

  • I'm gonna go classic comment here. At least consider answering the actual question, which is: what programming stuff can SO help with that generative AI can't? Commented Jan 19, 2024 at 23:38
  • 2
    For me personally, there’s nothing either are helping me with currently. In the past SO was a place for me to improve as a developer, both in ability and in confidence in my ability Commented Jan 20, 2024 at 0:25
  • 3
    You're thinking too low-level, Abel. The hard part of software development is not the programming. The hard part, and the parts of the job I still expect to see standing after we get decent AI, are defining the behaviours we want and then proving that the produced software exhibits those behaviours and nothing else. I will celebrate when generative AI can eliminate the codemonkey scutwork. What that will mean is not, "No more programming!", but instead a filling in of the blanks around the behaviours that have to be very carefully described. Commented Jan 20, 2024 at 0:33
  • @user4581301 That's a higher abstraction level you have there. I have to go low-level. The SO community can't go that high; we can only discuss simple concepts here so that we understand each other easily. I agree with you on that "behavior defining" thing; that is what software development really is. Commented Jan 20, 2024 at 1:03
  • @KevinB good for you. But dang! The way I understand your case is that you now just rely on documentation and source code to understand stuff? How about learning new stuff? Do you still get more help from SO or from AI? Commented Jan 20, 2024 at 1:17
  • 3
    I rely mostly on documentation, since any problems I run into are solved with general debugging. When there's no documentation, I don't use that tool. Kinda nice to be in a position where I can be picky with what tools I use. Commented Jan 20, 2024 at 1:18
  • @KevinB huh! Your case is kinda fishy to me Commented Jan 20, 2024 at 1:40
  • 1
    @AbelCallejo Source code -> Docs -> SO and other online sources. The same things the models (no intelligence there) were trained on. They don't understand what they produce, and are not reliable. So even if I used their output, I'd have to do at least the same amount of research to vet it and trust it. Might as well cut that unnecessary step out. | On another note, for the last 20 years I've worked (among other things) with a particular brand of industrial lidars. The number of questions about those here can be counted on one hand (I answered some). You think some chatbot will help me there? Commented Jan 20, 2024 at 2:32
    @AbelCallejo I also follow the idea that if you suck at something and want to get better, the way to do it is practice, practice, practice. I'm terrible at writing... I won't get better at it if I let some program do it for me. And so on... | In the same way, I have yet to ask a question on Stack Overflow -- it's not that I know everything, but I just do the research first... at which point I generally don't need to ask (or it wouldn't be a good question anyway). Commented Jan 20, 2024 at 2:37
  • 2
    @DanMašek I agree. It's probably a generational thing. Developers who started during: (1) the pre-Google era, (2) the pre-SO era, (3) the pre-AI era, (4) the AI era. You are in one of the first two generations; me, in the second. This post is probably not for you and KevinB; it's more for those in the third and fourth generations. Commented Jan 20, 2024 at 3:03
