LLM Token Performance
Discussions focus on token processing speeds (tokens/s), token limits, tokenization methods, and efficiency in AI language models, often questioning performance claims and comparisons to diffusion models.
Trend: Stable (1.1x) · Category: AI & Machine Learning
Comments: 2,036 · Years Active: 20 · Top Authors: 5 · Topic ID: #9198
Activity Over Time (comments per year)
2007: 2 · 2008: 8 · 2009: 6 · 2010: 10 · 2011: 15
2012: 21 · 2013: 36 · 2014: 30 · 2015: 27 · 2016: 34
2017: 102 · 2018: 101 · 2019: 59 · 2020: 82 · 2021: 137
2022: 165 · 2023: 330 · 2024: 350 · 2025: 483 · 2026: 40
Keywords
ALOT, tokens, token, openai, generated, input, generate, simplification, digitally, exceeded, awards
Sample Comments
apparently it's not diffusion, but tokens
What do you mean with 'tokenized'?
What's the performance like in tokens/s?
What's the tokens/s on those?
Pardon me, how many "tokens" ;-)
Your usage of the word “token” shows that it can’t work.
Isn't this a serious simplification? Tokens are just the medium
Would you like that with or without tokens?
The most coherent pitch for tokens?
How many tokens/s would that be though?
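
Several of the sampled comments ask for tokens/s figures. For context, decode throughput is usually computed as generated tokens divided by wall-clock generation time. Below is a minimal sketch of that measurement, assuming a hypothetical generate callable that returns output token ids (the name and stub behavior are illustrative, not taken from any discussion here):

    import time

    def measure_tokens_per_second(generate, prompt):
        """Time one generation call and return decode throughput.

        `generate` is a hypothetical callable returning a list of
        output token ids; real serving APIs usually report token
        counts in response metadata instead.
        """
        start = time.perf_counter()
        output_tokens = generate(prompt)
        elapsed = time.perf_counter() - start
        return len(output_tokens) / elapsed

    # Illustrative stub: pretend the model emits 256 tokens in
    # about 2 seconds, i.e. roughly 128 tokens/s.
    def fake_generate(prompt):
        time.sleep(2.0)
        return list(range(256))

    print(f"{measure_tokens_per_second(fake_generate, 'hi'):.1f} tokens/s")

Note that reported figures in the discussions may mix prefill and decode throughput, which this sketch does not distinguish.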