Gradient-based Hyperparameter Optimization Over Long Horizons
Forward-mode Differentiation with hyperparameter Sharing (FDS), a gradient-based hyperparameter optimization method.
Hyperband is a novel bandit-based approach to hyperparameter optimization that speeds up random search through adaptive resource allocation and early stopping. It outperforms Bayesian optimization methods.
Deep image classifiers' logits retain far more visual information than just class labels, enabling realistic image reconstruction via a GAN and revealing model invariances, robustness, and sensitivities.
Spherical loss functions and their performance compared to log-softmax in multi-class classification tasks.