ChatGPT Understanding Debate

This cluster focuses on debates about whether ChatGPT and GPT models possess genuine understanding, comprehension, or knowledge, or whether they merely predict statistically likely responses from their training data without genuine cognition.

📉 Falling 0.1x · AI & Machine Learning

Comments: 4,946
Years Active: 19
Top Authors: 5
Topic ID: #3397

Activity Over Time

Year    Comments
2007    1
2009    1
2010    2
2011    10
2012    3
2013    2
2014    5
2015    17
2016    14
2017    11
2018    8
2019    40
2020    417
2021    206
2022    680
2023    2,543
2024    507
2025    458
2026    21

Keywords

GPT2, e.g, AI, AGI, LLM, HN, ML, GPT, i.e, SIMPLE, chatgpt, gpt, understanding, text, human, model, language, context, models, trained

Sample Comments

ccppurcell • Jan 5, 2025 • View on HN

Not according to the article. ChatGPT could be argued to have knowledge. It cannot be argued to have understanding (yet).

otabdeveloper4 • Dec 12, 2022 • View on HN

ChatGPT just matches the most statistically likely reply based on a huge corpus of internet discussions; it doesn't actually have any ideas.

chlorion • Dec 25, 2022 • View on HN

ChatGPT is a language model, it does not "understand" anything, concepts, words or otherwise. It is even programmed to give you a canned response saying something roughly equal to this when asking it about its comprehension abilities. ChatGPT has no more capacity for understanding concepts than any other computer program, it's just very finely tuned to emit responses that make it appear as if it does.

scotty79 • Mar 26, 2023 • View on HN

It's pretty much the truth. What ChatGPT is good at is "keeping in mind" various associations between words that have occurred in the session so far. To keep those associations, some internal structure is bound to get conjured. It doesn't mean the transformer understands anything or can do any kind of reasoning, despite the fact that it can mimic a bit of what reasoning output looks like and even get it right sometimes if the context is fairly close to something it has seen in the training…

mrtranscendence • May 21, 2023 • View on HN

ChatGPT does not fulfill that definition because it does not have any "mental representation"; it has no mind with which to form a "mental model". It emulates understanding, quite well in many scenarios, but there is nothing there to possess understanding; it is at bottom simply a very large collection of numbers that are combined arithmetically according to a simple algorithm.

richardw • Mar 26, 2024 • View on HN

The fact that you ask that is in many ways the difference. You feel there's a limitation in your knowledge of the term "understand" and its use in this context and would like clarification before you're more certain, either way. At some point either enough information arrives to convince you, or you decide it's not true. Whatever that process and internal states are, is something GPT can't do. It'll 100% confidently produce something and be fully rewarded that it chose tokens that humans would…

kilgnad • Feb 17, 2023 • View on HN

It has not been trained specifically to produce this text. It was trained randomly on random text from the internet. The production of this text is a side effect of that training. You will note that this text is more detailed and nuanced than what most humans can produce. A human, when asked to prove he understands what a human is via text, is unlikely to do a better job than this. You realize when I asked ChatGPT about this I deliberately emphasized it to produce more exact details for it be…

SpicyLemonZest • Jul 27, 2020 • View on HN

How can we tell whether or not GPT-3 has understanding of the data?

godelski • Jun 12, 2023 • View on HN

I have to say it because a lot of HN users try to convince me that GPT has understanding. We actually have 0 ML models that demonstrate understanding. Frankly, we don't know if this is even possible yet. Sure, in the future I believe we'll have AGI, but an AGI would likely recognize that my statement is contextualized around the current environment and fitting common human speech patterns rather than making an absolute immutable statement.

qbasic_forever • Apr 20, 2023 • View on HN

It's confabulating (or hallucinating) meaning. GPT has no understanding of anything. Given an input prompt (tell me what this made up language sentence means for example), it's reaching into its training set and stochastically parroting a response. There is no cognition or understanding of language.
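
Several of the comments above rest on the "it just predicts the statistically most likely next token" framing. As a rough sketch of what that framing means mechanically, here is a toy bigram model over a tiny hand-written corpus; it is purely illustrative, says nothing about ChatGPT's actual architecture, tokenizer, or training data, and the corpus and function names are made up for this example.

# Toy sketch of "pick the statistically likely next word".
# A hand-rolled bigram model over a tiny corpus, NOT how ChatGPT works
# internally (transformers, subword tokens, billions of parameters);
# it only illustrates the next-token-prediction framing debated above.
import random
from collections import Counter, defaultdict

corpus = (
    "the model predicts the next word . "
    "the model does not understand the word . "
    "the model predicts the most likely word ."
).split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word(prev, greedy=True):
    counts = following[prev]
    if not counts:
        return "."
    if greedy:
        return counts.most_common(1)[0][0]            # most likely continuation
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]  # sample in proportion to frequency

# Generate a short continuation from a one-word "prompt".
word, output = "the", ["the"]
for _ in range(8):
    word = next_word(word, greedy=False)
    output.append(word)
print(" ".join(output))

Whether scaling this statistical machinery up by many orders of magnitude yields anything deserving the word "understanding" is exactly what the thread disagrees about.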