Automatic Differentiation
Discussions center on automatic differentiation (autodiff), its relation to backpropagation, reverse-mode vs forward-mode AD, and applications in machine learning and differentiable programming.
Sample Comments
Look into automatic differentiation. It's magic.
What the hell is that supposed to mean, and how is it different from automatic differentiation?
Are you familiar with automatic differentiation?
Seems like automatic differentiation should be covered.
Forward-mode AD doesn't really scale to many inputs. Reverse-mode AD is useful for the backpropagation algorithm in machine learning, however.
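The scaling asymmetry behind that comment is easy to see in code. Below is a minimal sketch in plain Python, with all names (Dual, Var, f) invented for illustration: forward mode needs one full pass per input to assemble a gradient, while reverse mode recovers every partial of a scalar output in a single backward sweep, which is exactly what backpropagation exploits.

```python
class Dual:
    """Forward mode: pair each value with its derivative (a dual number)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        return Dual(self.val + o.val, self.dot + o.dot)
    def __mul__(self, o):
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)

class Var:
    """Reverse mode: record the computation graph, then sweep it backwards."""
    def __init__(self, val, parents=()):
        self.val, self.grad, self.parents = val, 0.0, parents
    def __add__(self, o):
        return Var(self.val + o.val, [(self, 1.0), (o, 1.0)])
    def __mul__(self, o):
        return Var(self.val * o.val, [(self, o.val), (o, self.val)])
    def backward(self):
        # Visit nodes in reverse topological order so each node's grad is
        # fully accumulated before being pushed to its parents.
        topo, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p, _ in v.parents:
                    visit(p)
                topo.append(v)
        visit(self)
        self.grad = 1.0
        for node in reversed(topo):
            for parent, local in node.parents:
                parent.grad += local * node.grad

def f(x, y, z):
    return x * y + y * z  # one scalar output, three inputs

# Forward mode: one pass per input variable (n passes for n inputs).
gx = f(Dual(2.0, 1.0), Dual(3.0), Dual(4.0)).dot  # df/dx = 3.0
gy = f(Dual(2.0), Dual(3.0, 1.0), Dual(4.0)).dot  # df/dy = 6.0
gz = f(Dual(2.0), Dual(3.0), Dual(4.0, 1.0)).dot  # df/dz = 3.0

# Reverse mode: one forward pass plus one backward pass gives all three
# partials at once, no matter how many inputs there are.
x, y, z = Var(2.0), Var(3.0), Var(4.0)
out = f(x, y, z)
out.backward()
print(x.grad, y.grad, z.grad)  # 3.0 6.0 3.0, matching gx, gy, gz
```

Because f only uses + and *, the same function runs unchanged under both modes; operator overloading like this is the same basic mechanism frameworks such as PyTorch use at scale.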
Going to go out on a limb and say they are probably referring to the gradient calculus required for updating the model. https://en.wikipedia.org/wiki/Differentiable_programming See automatic differentiation.
I'm out right now but this doesn't sound right. How does autodiff struggle with it?
Isn't backprop a clever application of the chain rule in multivariate calculus?
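It is, and a worked micro-example shows the mechanics. The sketch below differentiates a one-weight model by multiplying local derivatives exactly as the chain rule dictates, then checks the result against a finite-difference approximation; the function and all names here are invented for illustration.

```python
import math

# Composite function: loss(w) = (tanh(w * x) - t)^2 for a fixed input x
# and target t (values below are arbitrary illustration choices).
x, t = 1.5, 0.8

def loss(w):
    return (math.tanh(w * x) - t) ** 2

def dloss_dw(w):
    # Backprop is the chain rule applied outside-in:
    #   dL/dw = dL/dy * dy/du * du/dw, where u = w * x and y = tanh(u)
    u = w * x
    y = math.tanh(u)
    dL_dy = 2.0 * (y - t)    # derivative of the squared error
    dy_du = 1.0 - y ** 2     # derivative of tanh
    du_dw = x                # derivative of the linear step
    return dL_dy * dy_du * du_dw

# Sanity check against a central finite-difference approximation.
w, eps = 0.3, 1e-6
numeric = (loss(w + eps) - loss(w - eps)) / (2 * eps)
print(dloss_dw(w), numeric)  # the two values should agree closely
```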
This title is really confused: backpropagation is one implementation of automatic differentiation, and it yields a gradient. What you do with that gradient next is up to you: you can just follow it (steepest descent), or try fancier second-order techniques.
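As a concrete instance of "just follow it", here is a minimal steepest-descent sketch in plain Python. The quadratic objective, step size, and iteration count are arbitrary illustration values; the gradient is hand-derived here, but in practice it is exactly what reverse-mode AD hands you.

```python
def grad(w):
    # Hand-derived gradient of f(w0, w1) = (w0 - 3)^2 + 10 * (w1 + 1)^2;
    # in a real setting this is the vector backpropagation produces.
    return [2 * (w[0] - 3), 20 * (w[1] + 1)]

w = [0.0, 0.0]
lr = 0.05                 # fixed step size along the negative gradient
for _ in range(200):
    g = grad(w)
    w = [wi - lr * gi for wi, gi in zip(w, g)]

print(w)  # approaches the minimizer (3, -1)
```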
Discussion from 2016: https://news.ycombinator.com/item?id=11766092 I also found this article on differentiable programming: https://pseudoprofound.wordpress.com/2016/08/03/differentiab... but a foundational TL;DR would be appreciated.