Question
vw-hypersearch is the Vowpal Wabbit wrapper script intended to optimize hyperparameters of vw models: regularization rates, learning rates and decays, minibatch sizes, bootstrap sizes, and so on. The vw-hypersearch tutorial gives the following example:
vw-hypersearch 1e-10 5e-4 vw --l1 % train.dat
Here % marks the parameter to be optimized, and 1e-10 and 5e-4 are the lower and upper bounds of the interval to search over. The wrapper uses golden-section search to minimize the number of iterations.
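For context, golden-section search repeatedly shrinks a bracketing interval around the minimum of a unimodal one-dimensional objective. A minimal sketch of the algorithm in Python (illustrative only, not vw-hypersearch's actual code) looks like this:

    # Minimal golden-section search for a unimodal 1-D objective f on [lo, hi].
    # Illustration of the technique; vw-hypersearch's own implementation differs in details.
    def golden_section_search(f, lo, hi, tol=1e-6):
        invphi = (5 ** 0.5 - 1) / 2          # 1/phi ~= 0.618
        a, b = lo, hi
        c = b - invphi * (b - a)
        d = a + invphi * (b - a)
        fc, fd = f(c), f(d)
        while abs(b - a) > tol:
            if fc < fd:                       # minimum lies in [a, d]
                b, d, fd = d, c, fc
                c = b - invphi * (b - a)
                fc = f(c)
            else:                             # minimum lies in [c, b]
                a, c, fc = c, d, fd
                d = a + invphi * (b - a)
                fd = f(d)
        return (a + b) / 2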
But what if I want to search over multiple hyperparameters at once? From sources such as this GitHub issue discussion, I gather that no multidimensional search methods are implemented in vw. Thus, the only way out is to write one's own task-specific optimizer. Am I right?
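For instance, a hand-rolled, task-specific optimizer could be as simple as a grid search that shells out to vw and reads back its reported holdout loss. The file names (train.dat, holdout.dat), the grid values, and the loss parsing below are assumptions made for illustration:

    # Hypothetical task-specific grid search over --l1 and the learning rate (-l),
    # scoring each combination by the "average loss" vw reports on a holdout set.
    import itertools
    import re
    import subprocess

    def holdout_loss(l1, lr):
        # Train a model with the given hyperparameters.
        subprocess.run(
            ["vw", "-d", "train.dat", "--l1", str(l1), "-l", str(lr),
             "-f", "model.vw", "--quiet"],
            check=True)
        # Evaluate it on the holdout set; vw prints "average loss = <value>" to stderr.
        result = subprocess.run(
            ["vw", "-t", "-d", "holdout.dat", "-i", "model.vw"],
            capture_output=True, text=True, check=True)
        return float(re.search(r"average loss = ([0-9.eE+-]+)", result.stderr).group(1))

    best = min(
        itertools.product([1e-8, 1e-6, 1e-4], [0.1, 0.5, 1.0]),
        key=lambda p: holdout_loss(*p))
    print("best (--l1, -l):", best)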
Answer 1:
Now this can be done with the module vw-hyperopt.py that lives under /vowpal_wabbit/utl/ in the repository. See my pull request here: https://github.com/JohnLangford/vowpal_wabbit/pull/867
In the near future this will be better documented.
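Until that documentation lands, the underlying idea can be sketched with the Hyperopt library, which vw-hyperopt.py builds on: define a multidimensional search space and let Hyperopt's TPE optimizer propose points, scoring each one by training and evaluating a vw model. This is only an illustration of the approach under assumed file names and ranges, not vw-hyperopt.py's actual interface:

    # Sketch of the Hyperopt-based approach behind vw-hyperopt.py.
    # Search space, file names, and loss parsing are assumptions for this example.
    import re
    import subprocess
    from hyperopt import fmin, hp, tpe

    def objective(params):
        # Train with the sampled hyperparameters.
        subprocess.run(
            ["vw", "-d", "train.dat", "-f", "model.vw", "--quiet",
             "--l1", str(params["l1"]), "--l2", str(params["l2"]),
             "-l", str(params["lr"])],
            check=True)
        # Score on the holdout set using vw's reported "average loss".
        result = subprocess.run(
            ["vw", "-t", "-d", "holdout.dat", "-i", "model.vw"],
            capture_output=True, text=True, check=True)
        return float(re.search(r"average loss = ([0-9.eE+-]+)", result.stderr).group(1))

    space = {
        "l1": hp.loguniform("l1", -23, -7),   # roughly 1e-10 .. 1e-3
        "l2": hp.loguniform("l2", -23, -7),
        "lr": hp.loguniform("lr", -3, 1),     # roughly 0.05 .. 2.7
    }

    best = fmin(objective, space, algo=tpe.suggest, max_evals=50)
    print(best)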
Source: https://stackoverflow.com/questions/33242742/multidimensional-hyperparameter-search-with-vw-hypersearch-in-vowpal-wabbit