If you've been a programmer for more than two years, you're probably familiar with Stack Overflow, a popular online forum where developers can ask and answer questions about programming languages and problems. It's a valuable resource for programmers of all levels, and it's also a source of many jokes about developers who rely too heavily on Stack Overflow answers.
I've recently noticed an increasing reliance on ChatGPT, Bard, and other AI platforms among up-and-coming programmers. Could this signal the end of Stack Overflow? Are these new AI tools good replacements for it? And is relying on any single platform for debugging a good idea in the first place?
Finally, is AI-generated code suitable for young developers to add to their codebase without extensive refactoring? I'd love to hear your thoughts.
Top comments (7)
Where do you think ChatGPT gets its answers from???
Yep. GitHub issues, Stack Overflow, etc.
The difference is that when I find an answer on Stack Overflow, I have context. I have comments and alternative answers. I have reference links to support the answer. When ChatGPT is wrong (which for my questions is roughly 60-70% of the time), there's nothing...
The problem is that people who use ChatGPT never give back. They don't comment or post a correct answer, so the knowledge is lost for good. I'm fine; I was developing before we had code completion or IDEs. But I'm afraid the younger generation will end up with siloed information.
There's a tragedy of the commons at play: with ChatGPT it's take, take, take, without the giving back.
Yes, it's better for everyone if we continue to build out our collective knowledge base, but there are all sorts of misaligned incentives working against that.
It’s a really tough problem.
What about copyright? Stack Overflow could object to its content being used and to its users being drawn away. AI owners should first reach an agreement with the content owners.
Edit: I meant that Stack Overflow keeps the infrastructure alive so we can use the content on their site. They should get something in return.
I think there are three potential problems with going AI-only:
1. It's not advisable to put sensitive data into an AI, as it may end up in the training data. There was news recently of secrets being 'leaked' from a well-known company because its engineers asked ChatGPT to analyse some code. However, this is preventable with locally hosted AI.
2. In its present state, AI output can contain hallucinated information. Granted, there may be some contributors to Q&A websites who give inaccurate information, but there's usually some feedback on that, e.g. a voting system.
3. Current AI models were trained on existing data, possibly including sites like Stack Overflow. If everybody abandoned Q&A websites in favour of AI, I'm not sure the quality of responses from future models would be as good. I'm sure AI could parse the API documentation of various technologies, but it's unclear whether that's enough.
Things may change in future, but I think a combination of the two is the best approach for now.
I don't think they're permanent replacements; however, it's a lot easier to ask GPT about basic concepts than it is to ask Google.
But keep in mind that ChatGPT is blocked in some countries, such as Saudi Arabia. Thank you.