LLM Determinism Debate

The cluster focuses on debates about whether Large Language Models (LLMs) produce deterministic outputs, highlighting issues like inherent randomness, sampling methods, seeds, floating-point variations, and execution environments that lead to inconsistent results for identical inputs.
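The sampling-versus-seed distinction at the heart of these debates can be illustrated with a toy decoding step. This is a minimal sketch, not any real LLM API; all names here are illustrative:

```python
import math
import random

def sample_token(logits, temperature=1.0, rng=None):
    """Pick one token index from raw scores (a toy decoding step).

    temperature == 0 falls back to greedy argmax, which is fully
    deterministic; any temperature > 0 samples from a softmax, and
    repeat runs only match if `rng` is seeded identically.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    rng = rng or random
    scaled = [x / temperature for x in logits]
    peak = max(scaled)  # subtract the max before exp for numerical stability
    weights = [math.exp(s - peak) for s in scaled]
    return rng.choices(range(len(logits)), weights=weights, k=1)[0]

logits = [2.0, 1.0, 0.5]
print(sample_token(logits, temperature=0))           # greedy: always index 0
print(sample_token(logits, 0.8, random.Random(42)))  # repeatable only with a fixed seed
```

With temperature 0 the output is a pure function of the logits; with sampling, determinism depends entirely on controlling the random number generator.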

➡️ Stable 1.9x · AI & Machine Learning
Comments: 3,335
Years Active: 20
Top Authors: 5
Topic ID: #3592

Activity Over Time

2007: 1
2008: 5
2009: 14
2010: 30
2011: 24
2012: 27
2013: 37
2014: 53
2015: 59
2016: 70
2017: 102
2018: 100
2019: 125
2020: 165
2021: 131
2022: 200
2023: 463
2024: 449
2025: 1,137
2026: 143

Keywords

AI, CPU, ASIC, LLM, SD, FPGA, OS, thinkingmachines.ai, SSE2, deterministic, determinism, llms, llm, seed, non, input, output, floating point, token

Sample Comments

setq Jun 27, 2017

That only works if you have some level of determinism.

cnnlives78 Nov 4, 2025

The LLMs most of us are using have some element of randomness to every token selected, which is non-deterministic. You can try to attempt to corral that, but statistically, with enough iteration, it may provide nonsense, unintentional, dangerous, opposite solutions/answers/action, even if you have system instructions defining otherwise and a series of LLMs checking themselves. Be sure that you fully understand this. Even if you could make it fully deterministic, it would be determinist…

overgard Oct 15, 2025

Are you trying to say I'm old? Machines are deterministic… LLMs are very much not.

arrow7000 Feb 15, 2022

So in other words it's not perfectly deterministic at all?

korkybuchek Oct 18, 2024

Interesting -- is there any impact from LLM outputs not being deterministic?

TOMDM Jul 11, 2025

I think the better statement is likely "LLMs are typically not executed in a deterministic manner", since you're right: there are no non-deterministic properties inherent to the models themselves that I'm aware of.

Not sure that things are that deterministic.
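The point in the comment above, that nondeterminism usually comes from execution rather than the model itself, is commonly traced to floating-point reduction order: parallel kernels may sum the same values in a different order run to run. A minimal Python demonstration of non-associativity, using illustrative numbers:

```python
# Floating-point addition is not associative, so a parallel reduction
# (e.g. a GPU kernel accumulating logits in a different order each run)
# can change outputs even with greedy decoding and a fixed seed.
a, b, c = 1e16, -1e16, 0.5

left_first = (a + b) + c   # the large terms cancel first, then 0.5 survives
right_first = a + (b + c)  # 0.5 is absorbed by -1e16, leaving 0.0

print(left_first, right_first)  # 1e16 has ulp 2.0, so 0.5 vanishes next to it
```

The two groupings give 0.5 and 0.0 respectively, which is why identical model weights and seeds can still yield different tokens across execution environments.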

input_sh Oct 7, 2024

Or it's just non-deterministic, like with every LLM.

jdnend Jun 30, 2025

Why wouldn't it be deterministic?

weakfish Sep 10, 2023

This won’t ever work as long as LLMs are non deterministic