Architecture-aware Bayesian Optimization for Neural Network Tuning

A. Sjöberg, M. Önnheim, E. Gustavsson, M. Jirstrand. In Proceedings of the 28th International Conference on Artificial Neural Networks 2019, Munich, Germany, 17-19 September, 2019.

Abstract

Hyperparameter optimization of a neural network is a non-trivial task. It is time-consuming to evaluate a hyperparameter setting, no analytical expression of the impact of the hyperparameters is available, and the evaluations are noisy in the sense that the result depends on the training process and weight initialization. Bayesian optimization is a powerful tool for handling these problems. However, hyperparameter optimization of neural networks poses additional challenges, since the hyperparameters can be integer-valued, categorical, and/or conditional, whereas Bayesian optimization often assumes variables to be real-valued. In this paper we present an architecture-aware transformation of neural networks applied in the kernel of a Gaussian process to boost the performance of hyperparameter optimization. The empirical experiment in this paper demonstrates that by introducing an architecture-aware transformation of the kernel, the performance of the Bayesian optimizer shows a clear improvement over a naïve implementation and that the results are comparable to other state-of-the-art methods.
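To make the idea concrete, the sketch below shows one way a kernel can be made "architecture-aware": the hyperparameter configuration is first mapped to a fixed-length vector and the Gaussian-process kernel is then evaluated on that transformed representation. This is a minimal illustration only, not the transformation proposed in the paper; the encoding (zero-padding of layer widths plus a log-scaled learning rate) and names such as `architecture_vector` are assumptions made for this example.

```python
import numpy as np

def architecture_vector(config, max_layers=4):
    """Map a variable-length architecture description to a fixed-length vector.

    Hypothetical encoding: units per layer, zero-padded up to max_layers,
    followed by the learning rate on a log10 scale.
    """
    units = list(config["units"])[:max_layers]
    padded = units + [0] * (max_layers - len(units))
    return np.array(padded + [np.log10(config["lr"])], dtype=float)

def rbf_kernel(x, y, lengthscale=1.0):
    """Squared-exponential kernel evaluated on the transformed configurations."""
    d = architecture_vector(x) - architecture_vector(y)
    return np.exp(-0.5 * np.dot(d, d) / lengthscale**2)

# Two candidate networks: a shallow wide one and a deeper narrow one.
cfg_a = {"units": [128, 64], "lr": 1e-3}
cfg_b = {"units": [64, 64, 32], "lr": 1e-2}
print(rbf_kernel(cfg_a, cfg_b))
```

Because the transformation handles the integer-valued and conditional parts of the search space (e.g. layers that may or may not exist), the Gaussian process itself can remain a standard real-valued model.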

 



