AI Medical Diagnosis
The cluster centers on debates about whether large language models (LLMs) like ChatGPT can diagnose medical conditions as effectively as, or better than, human doctors. It features personal success stories, skepticism about the technology's maturity, and discussion of whether AI should augment or replace physicians.
Activity Over Time
Top Contributors
Keywords
Sample Comments
Why can't medical doctors be automated?
As for me, ChatGPT and Claude were able to diagnose and fix health issues that multiple doctors failed to fix. I trust LLMs more than random doctors and blogs, because they are able to search the web deeply, find up-to-date info and research, and synthesize all of it. You can have a back and forth for as long as you need. The issue is that using LLMs properly requires a certain skill that most people lack.
Using an LLM to diagnose or treat a health issue is not yet mature and is not going to be in the foreseeable future. It's not like predicting a protein fold or presenting a business report. Medical advice is in many dimensions incompatible with artificial support: non-quantifiable, fuzzy, case-specific, intimate, empathic, and awfully responsible. Even AGI is not enough. In order to become a doctor, the machine has to develop next-level skills, namely to follow the interplay of…
Modern medicine already incorporates wide ranges of data. Doctors use flowcharts, scales, point systems, etc., to diagnose certain conditions, because those tools have been developed by studying and considering a lot of cases. However, there's a lot that isn't covered by data: the "middle of the scale", the "almost but not quite there", the "this is weird". Doctors are good at that, through experience, and those are the difficult cases. Those are the…
I.e., do people in the HN community think that their doctors would do a better job if they leveraged ML in some form? Do doctors rely too much on their own personal, in-brain understanding?
I was with you until the last sentence:
> There's an absurd amount of potential for LLMs here.
The problem is not that doctors are stupid; the problem is that a lot of ailments are just not easily diagnosed. Even if the doctor has one (or more) suspicions about what the red rash on your skin is, there just aren't any tests for many conditions. Many diseases are diagnosed only by symptoms, and the underlying cause is unknown. And even if you get a diagnosis, there is often nothing you…
Doctors are still better than LLMs, by a lot.
An AI trained on the past work of diagnosticians doesn't really render diagnosticians obsolete.
Meaning it's pragmatically easier to ask an AI than a doctor.
If you're going to ask an LLM for a medical diagnosis, stop what you're doing and ask a doctor instead. There is no good advice downstream of the decision to ask an LLM for a medical diagnosis.
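One comment above notes that doctors already lean on flowcharts, scales, and point systems. A minimal sketch of what such a point system looks like in code follows; the findings and weights are illustrative stand-ins (loosely patterned on published decision rules such as the Wells criteria), not actual clinical thresholds or guidance.

```python
# Illustrative point-based clinical score: map yes/no findings to points,
# sum them, and bucket the total into a risk band. Factor names and weights
# are hypothetical examples, not a real clinical rule.
RISK_FACTORS = {
    "active_cancer": 1,
    "recent_immobilization": 1,
    "localized_tenderness": 1,
    "entire_leg_swollen": 1,
    "previous_episode": 1,
    "alternative_diagnosis_likely": -2,  # negative weight pulls the score down
}

def score(findings: dict[str, bool]) -> tuple[int, str]:
    """Sum points for the findings that are present and assign a risk band."""
    total = sum(points for name, points in RISK_FACTORS.items() if findings.get(name))
    band = "high" if total >= 3 else "moderate" if total >= 1 else "low"
    return total, band

# Example: two positive findings land in the moderate band.
print(score({"localized_tenderness": True, "entire_leg_swollen": True}))  # (2, 'moderate')
```

The commenter's point is that rules like this encode the well-studied middle of the distribution; the hard cases are the ones that fall outside them.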