AI Medical Diagnosis

This cluster centers on debates about whether large language models (LLMs) such as ChatGPT can diagnose medical conditions as well as, or better than, human doctors. It features personal success stories, skepticism about the technology's maturity, and discussions of augmentation versus replacement.

➡️ Stable 0.9x · AI & Machine Learning

Comments: 4,174
Years Active: 20
Top Authors: 5
Topic ID: #1060

Activity Over Time

Year   Comments
2007   2
2008   15
2009   13
2010   37
2011   92
2012   70
2013   134
2014   119
2015   97
2016   157
2017   291
2018   205
2019   202
2020   140
2021   253
2022   271
2023   707
2024   532
2025   699
2026   138

Keywords

e.g, US, MYCIN, LLM, youtu.be, AI, HN, RAG, AUC, i.e, doctors, doctor, medical, diagnosis, llm, ai, diagnose, patient, llms, patients

Sample Comments

pid-1 · Sep 12, 2024

Why can't medical doctors be automated?

asdaqopqkq · Jan 23, 2026

As for me, ChatGPT and Claude were able to diagnose and fix health issues that multiple doctors failed to fix. I trust LLMs more than random doctors and blogs, because they are able to search the web deeply, find up-to-date info and research, and synthesize all of it. You can have a back and forth for as long as you need. The issue is that using LLMs properly requires a certain skill that most people lack.

tsoukase · Dec 10, 2025

Using an LLM to diagnose or treat a health issue is not yet mature, and is not going to be in the foreseeable future. It's not like predicting a protein fold or presenting a business report. Medical advice is in many dimensions incompatible with artificial support: non-quantifiable, fuzzy, case-specific, intimate, empathic, and awfully responsible. Even AGI is not enough. In order to become a doctor, the machine has to develop next-level dexterities, namely to follow the interplay of mult…

gjulianm · Oct 22, 2022

Modern medicine already incorporates wide ranges of data. Doctors use flowcharts, scales, point systems, etc., to diagnose certain conditions, because those tools have been developed by studying and considering a lot of cases. However, there's a lot that isn't covered by data. The "middle of the scale", the "almost but not quite there", the "this is weird"... Doctors are good at that, through experience, and those are the difficult cases. Those are the o…

HillaryBriss · Oct 4, 2017

i.e. do people in the HN community think that their doctors would do a better job if they leveraged ML in some form? do doctors rely too much on their own personal in-brain understanding?

newaccount74 · Aug 6, 2023

I was with you until the last sentence:
> There's an absurd amount of potential for LLMs here.
The problem is not that doctors are stupid, the problem is that a lot of ailments are just not easily diagnosed. Even if the doctor has one (or more) suspicions what the red rash on your skin is, there just aren't any tests for many conditions. Many diseases are only diagnosed by symptoms and the underlying cause is unknown. And even if you get a diagnosis, there is often nothing you…

satisfice · Apr 19, 2025

Doctors are still better than LLMs, by a lot.

afavour · Mar 14, 2023

An AI trained on the past work of diagnosticians doesn't really render diagnosticians obsolete.

taylodl · Jul 24, 2025

Meaning it's pragmatically easier to ask an AI than a doctor.

shortrounddev2 · Dec 9, 2025

If you're going to ask an LLM for a medical diagnosis, stop what you're doing and ask a doctor instead. There is no good advice downstream of the decision to ask an LLM for a medical diagnosis.