LLM Token Performance

Discussions focus on token processing speeds (tokens/s), token limits, tokenization methods, and efficiency in AI language models, often questioning performance claims and drawing comparisons to diffusion models.
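Since many of the sampled comments ask about throughput in tokens/s, here is a minimal sketch of how that metric is computed. The function names and the whitespace tokenizer are illustrative assumptions; production models use subword tokenizers (e.g. BPE), so real token counts differ.

```python
def naive_tokenize(text):
    # Naive whitespace tokenizer -- real LLMs use subword schemes
    # like BPE, so these counts only approximate true token counts.
    return text.split()

def tokens_per_second(n_tokens, elapsed_s):
    # The throughput metric commonly quoted for LLM inference:
    # tokens generated divided by wall-clock seconds elapsed.
    return n_tokens / elapsed_s

tokens = naive_tokenize("the quick brown fox jumps over the lazy dog")
print(len(tokens))                  # 9 whitespace tokens
print(tokens_per_second(450, 3.0))  # 150.0 tokens/s
```

Quoted tokens/s figures depend heavily on the tokenizer used, which is one reason the comments below repeatedly question such comparisons.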

➡️ Stable 1.1x · AI & Machine Learning

Comments: 2,036
Years Active: 20
Top Authors: 5
Topic ID: #9198

Activity Over Time

Comments per year:

2007: 2
2008: 8
2009: 6
2010: 10
2011: 15
2012: 21
2013: 36
2014: 30
2015: 27
2016: 34
2017: 102
2018: 101
2019: 59
2020: 82
2021: 137
2022: 165
2023: 330
2024: 350
2025: 483
2026: 40

Keywords

ALOT tokens token openai generated input generate simplification digitally exceeded awards

Sample Comments

thesparks (Mar 26, 2025): apparently it's not diffusion, but tokens

jacquesm (Sep 4, 2014): What do you mean with 'tokenized'?

sigmoid10 (Feb 28, 2024): What's the performance like in tokens/s?

detrites (Mar 11, 2023): What's the tokens/s on those?

bittlingmayer (Mar 25, 2024): Pardon me, how many "tokens" ;-)

teddyh (Oct 29, 2015): Your usage of the word “token” shows that it can’t work.

kfrzcode (Jul 24, 2023): Isn't this a serious simplification? Tokens are just the medium

pizza (Dec 17, 2024): Would you like that with or without tokens?

tptacek (Jan 9, 2018): The most coherent pitch for tokens?

SkiFire13 (Mar 5, 2025): How many tokens/s would that be though?