ChatGPT Reliability Issues
This cluster collects comments expressing widespread skepticism toward ChatGPT's accuracy, focusing on its tendency to produce incorrect, hallucinated, or misleading information in a confident tone, and on the lack of reliable ways to verify its output.
[Panels: Activity Over Time · Top Contributors · Keywords]
Sample Comments
because ChatGPT loves to make up some “facts” in an authoritative tone?
When I ask ChatGPT about stuff I actually know about, it always gives glib, misleading and completely wrong answers. I therefore don't trust it to answer questions about stuff that I don't know much about.
What are the harmful things ChatGPT is saying? I thought it only said wrong things with great confidence.
How can you tell if ChatGPT is lying to you and what you're learning is total hogwash?
The problem with ChatGPT's "knowledge" is that it isn't trustworthy. It will happily output very confident-sounding nonsense, or blatantly incorrect statements. We need a way to verify how accurate its outputs are.
ChatGPT doesn't say anything of the sort. In fact, it will vehemently insist that what it says is not necessarily true or accurate if you challenge it.
ChatGPT speaks authoritatively about everything and is mainly wrong so yeah, this makes sense.
I'm not sure that ChatGPT actually has the correct information - it just responds to prompts for explanation as it does for any prompt. Confabulating. If its guess sounds correct, we think it is. Since it's drawing on the same information both times, it probably often is correct. In other words, like us.
How do you determine whether or not ChatGPT just made up whatever answer it gives you?
ChatGPT says "ChatGPT can make mistakes. Check important info." directly under the prompt box. If people will blindly trust a source that explicitly states that it isn't a reliable source, then they've got much bigger problems than AI.