Automatic Differentiation

Discussions center on automatic differentiation (autodiff), its relation to backpropagation, reverse-mode vs forward-mode AD, and applications in machine learning and differentiable programming.
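
For readers new to the topic, a minimal sketch of forward-mode autodiff using dual numbers, in plain Python (all names here are illustrative, not from any particular library): each value carries its derivative along with it, so one forward pass through ordinary arithmetic produces an exact derivative.

class Dual:
    # A value paired with its derivative; arithmetic propagates both.
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

def derivative(f, x):
    # Seed the input with derivative 1.0, read the result off the output.
    return f(Dual(x, 1.0)).dot

print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0, i.e. 2*2 + 3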

Trend: Stable (0.5x) · Category: AI & Machine Learning
Comments: 2,684
Years Active: 18
Top Authors: 5
Topic ID: #4655

Activity Over Time

2009: 10
2010: 5
2011: 10
2012: 18
2013: 24
2014: 41
2015: 69
2016: 169
2017: 204
2018: 263
2019: 282
2020: 260
2021: 220
2022: 233
2023: 334
2024: 253
2025: 268
2026: 21

Keywords

TL, RNN, MPS, LSTM, wordpress.com, GPU, AI, BFGS, STOC, CAD, differentiation, gradient, automatic, gradients, descent, linear, loss, layer, programming, derivatives

Sample Comments

adamnemecek May 10, 2019

Look into automatic differentiation. It's magic.

hnuser355 Feb 13, 2019

What the hell is that supposed to mean, and how is it different from automatic differentiation?

adamnemecek Sep 9, 2020

Are you familiar with automatic differentiation?

adamnemecek Feb 24, 2022

Seems like automatic differentiation should be covered.

platz Jan 9, 2016

Forward-mode AD doesn't really scale. Reverse-mode AD is useful for the backpropagation algorithm in machine learning, however.
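
To make the scaling point concrete, here is a tiny tape-based reverse-mode sketch in plain Python (all names are made up for illustration). One forward pass records local partial derivatives; one backward sweep then yields the gradient with respect to every input at once, whereas forward mode would need a separate pass per input.

class Var:
    # A value plus the tape entries needed to run the chain rule backward.
    def __init__(self, val, parents=()):
        self.val, self.parents, self.grad = val, parents, 0.0

    def __add__(self, other):
        return Var(self.val + other.val, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # Record local partials: d(uv)/du = v, d(uv)/dv = u
        return Var(self.val * other.val, [(self, other.val), (other, self.val)])

def backward(out):
    # Push d(out)/d(node) back along every recorded edge. (A real
    # implementation visits nodes in reverse topological order; this
    # simple stack walk is enough for the small expression below.)
    out.grad = 1.0
    stack = [out]
    while stack:
        node = stack.pop()
        for parent, local in node.parents:
            parent.grad += local * node.grad
            stack.append(parent)

# loss = x*y + x: dloss/dx = y + 1 = 4, dloss/dy = x = 2, in ONE sweep.
x, y = Var(2.0), Var(3.0)
backward(x * y + x)
print(x.grad, y.grad)  # 4.0 2.0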

meowkit May 17, 2023

Going to go out on a limb and say they are probably referring to the gradient calculus required for updating the model (https://en.wikipedia.org/wiki/Differentiable_programming). See automatic differentiation.

adamnemecek Jun 22, 2019

I'm out right now but this doesn't sound right. How does autodiff struggle with it?

thatsadude Dec 20, 2016

Isn't backprop a clever application of the chain rule in multivariate calculus?
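
It is, and a worked example in plain Python makes the point (the one-neuron "network" and its numbers are invented for illustration). The backward pass is the chain rule applied from the output inward:

x = 2.0
w, b = 3.0, 1.0

# Forward pass: loss = (w*x + b)^2
h = w * x + b                 # h = 7.0
loss = h * h                  # loss = 49.0

# Backward pass (chain rule, outermost function first):
dloss_dh = 2 * h              # d(h^2)/dh = 2h = 14.0
dloss_dw = dloss_dh * x       # dh/dw = x, so 14.0 * 2.0 = 28.0
dloss_db = dloss_dh * 1.0     # dh/db = 1, so 14.0
print(dloss_dw, dloss_db)     # 28.0 14.0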

brilee Mar 20, 2019

This title is really confused - backpropagation is one implementation of automatic differentiation, and yields a gradient. What you do with that gradient next is up to you - you can just follow it (steepest descent), or try other fancier second order techniques.
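
A sketch of the "just follow it" option, steepest descent, reusing the invented gradients from the chain-rule example above (lr is a hypothetical learning rate):

# One steepest-descent step: move each parameter against its gradient.
lr = 0.1                         # hypothetical learning rate
w, b = 3.0, 1.0
dloss_dw, dloss_db = 28.0, 14.0  # gradients from the sketch above
w -= lr * dloss_dw               # w: 3.0 -> ~0.2
b -= lr * dloss_db               # b: 1.0 -> ~-0.4
print(w, b)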

exikyut Jan 8, 2018

Discussion from 2016: https://news.ycombinator.com/item?id=11766092
I also found this article on differentiable programming: https://pseudoprofound.wordpress.com/2016/08/03/differentiab... but a foundational TL;DR would be