Hyperparameter Optimization Techniques
Comments discuss methods for hyperparameter optimization in machine learning, such as Bayesian optimization and genetic algorithms, and tools like SigOpt and Hyperopt, comparing their performance, costs, and applications such as neural architecture search (NAS).
Sample Comments
This article seems to discuss genetic algorithms, which could be used for hyperparameter optimization. We use another method called Bayesian (GP) hyperparameter optimization. According to our internal benchmarks and academic research, Bayesian methods outperform genetic algorithms. Another thing to keep in mind is that we automate the entire process for you. You only need to provide a list of parameters you'd like to tune.
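As an illustration of that "just provide a parameter list" workflow (a sketch of mine, not the commenter's actual product), GP-based Bayesian optimization in scikit-optimize needs only an objective and a search space; the objective and parameter ranges below are placeholder assumptions:

```python
# Minimal sketch of GP-based Bayesian hyperparameter optimization
# using scikit-optimize (skopt). The objective and search space are
# illustrative placeholders, not any particular product's API.
from skopt import gp_minimize
from skopt.space import Real, Integer

# Hypothetical search space: learning rate and number of trees.
space = [
    Real(1e-4, 1e-1, prior="log-uniform", name="learning_rate"),
    Integer(50, 500, name="n_estimators"),
]

def objective(params):
    learning_rate, n_estimators = params
    # In practice: train a model with these hyperparameters and
    # return a validation loss. Here we use a stand-in function.
    return (learning_rate - 0.01) ** 2 + (n_estimators - 200) ** 2 / 1e5

# A Gaussian process surrogate models the objective; each next point
# is chosen by an acquisition function that trades off exploring
# uncertain regions against exploiting promising ones.
result = gp_minimize(objective, space, n_calls=30, random_state=0)
print("best params:", result.x, "best loss:", result.fun)
```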
The article doesn't seem to consider the cost of hyperparameter optimization prior to the final training...
So you want an optimizer for hyperparameter optimizers? This is getting quite meta.
How does this compare to hyperopt?
Directly addresses the "dark art" of hyperparameter tuning in machine learning.
Are you already employing Bayesian optimization techniques? These are commonly used to explore spaces where evaluation is expensive.
Bayesian optimisation can be used in cases where evaluating the objective is expensive and gradients are unavailable. In neural networks this can be applied to selecting hyperparameters for the model being trained.
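To make the "expensive evaluation" idea concrete, here is a minimal sketch (my own illustration, on a 1-D toy objective, not anything from the comments) of the core Bayesian-optimization loop: fit a Gaussian-process surrogate to the points evaluated so far, then pick the next point by expected improvement:

```python
# Core Bayesian-optimization loop: GP surrogate + expected improvement.
# The objective is a cheap stand-in for something expensive, e.g. a
# full training run evaluated at one hyperparameter setting.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expensive_objective(x):
    # Stand-in for an expensive black-box evaluation.
    return np.sin(3 * x) + 0.1 * x ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(3, 1))          # a few initial random evaluations
y = expensive_objective(X).ravel()
grid = np.linspace(-2, 2, 500).reshape(-1, 1)

for _ in range(10):
    # Fit the surrogate to all (hyperparameter, loss) pairs seen so far.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    # Expected improvement over the best loss so far (minimization).
    best = y.min()
    z = (best - mu) / np.maximum(sigma, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = grid[np.argmax(ei)].reshape(1, -1)
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_objective(x_next).ravel())

print("best x:", X[y.argmin()].item(), "best loss:", y.min())
```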
This is very cool. What optimization algos are being used?
I'm used to hyperopt; do you know how they compare?
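For reference on the comparison: hyperopt's default search algorithm is the Tree-structured Parzen Estimator (TPE) rather than a GP surrogate. A minimal sketch of its interface follows; the search space and objective are placeholder assumptions of mine:

```python
# Minimal sketch of a hyperopt search using the Tree-structured
# Parzen Estimator (TPE). The objective here is a placeholder;
# in practice it would train a model and return a validation loss.
from hyperopt import fmin, tpe, hp, Trials

# Hypothetical search space over two hyperparameters.
space = {
    "learning_rate": hp.loguniform("learning_rate", -9, -2),  # ~1e-4 to ~0.14
    "dropout": hp.uniform("dropout", 0.0, 0.5),
}

def objective(params):
    # Stand-in for an expensive training-and-validation run.
    return (params["learning_rate"] - 0.01) ** 2 + params["dropout"]

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=50, trials=trials)
print("best:", best)
```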
Did anyone use it for something other than hyperparameter tuning?