Local Optima in Optimization

This cluster centers on a common failure mode of optimization algorithms, particularly gradient descent: the process gets stuck at a local maximum or minimum instead of reaching the global optimum. Discussions extend beyond machine learning to real-world analogies in business practices and human behavior.
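
The recurring point in these threads is that a purely local update rule follows only the slope it can see, so where it ends up depends on where it starts. The sketch below (illustrative Python, not taken from any comment; the function, learning rate, and starting points are assumptions chosen for the demo) shows plain gradient descent settling into either the global or a merely local minimum of a double-well function, depending only on initialization.

```python
# Minimal sketch (illustrative only, not from the discussion): plain gradient
# descent on the double-well function f(x) = x^4 - 4x^2 + x. It has a global
# minimum near x = -1.48 and a shallower local minimum near x = 1.35, so the
# starting point alone decides which basin the run ends up "stuck" in.

def f(x):
    return x**4 - 4 * x**2 + x

def grad_f(x):
    # Analytic derivative of f.
    return 4 * x**3 - 8 * x + 1

def gradient_descent(x0, lr=0.01, steps=2000):
    x = x0
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

for x0 in (-2.0, 2.0):
    x_end = gradient_descent(x0)
    print(f"start {x0:+.1f} -> x = {x_end:+.3f}, f(x) = {f(x_end):+.3f}")

# Expected behaviour:
#   starting at -2.0 ends near x = -1.48 (the global minimum)
#   starting at +2.0 ends near x = +1.35 (a local minimum: progress halts there)
```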

Trend: Stable (0.7x)
Category: AI & Machine Learning
Comments: 2,012
Years Active: 20
Top Authors: 5
Topic ID: #5782

Activity Over Time (comments per year)

2007: 10    2008: 13    2009: 16    2010: 45    2011: 36
2012: 32    2013: 51    2014: 50    2015: 112   2016: 130
2017: 179   2018: 198   2019: 136   2020: 174   2021: 164
2022: 115   2023: 163   2024: 203   2025: 170   2026: 15

Keywords

e.g., AI, OK, StarCraft, POLA, google.com, McNamara, ES, VIPS, DeepMind, local optimum, gradient descent, stuck, maximum, optimization, algorithm, global, optimal

Sample Comments

msellout, Dec 3, 2015

Too bad. They're probably chasing local optima that are quite different from the global maximum.

zelphirkalt, May 11, 2020

Wouldn't that lead to hitting a local maximum / minimum and getting stuck there?

gk1, Dec 14, 2013

You're doing it all wrong! You're only optimizing for the local maxima of your optimization's optimization!

rightbyte, Aug 20, 2022

Assuming it does not get stuck at some local maximum.

sfpotter, May 23, 2025

Just because you've never felt the need doesn't mean you aren't stuck in a local minimum.

RGamma, Jan 14, 2021

Not your problem. Just let 'em be stuck in their local optimum (probably better this way) :P

andybak, Jul 1, 2018

It's very hard to get unstuck from a local maximum

coderzach, Jun 27, 2013

The issue is that you'll hit a local optimum, and all progress will halt.

jupp0r, Oct 3, 2019

“Gradient Descent: The Ultimate Optimizer” Did they get stuck in a local optimum?

blackeyeblitzar, Sep 15, 2024

Isn’t this just saying things seek a local minima (or maxima)?