What are hyperparameters?

A hyperparameter is a parameter that is used to control the learning process. Model hyperparameters are used for model selection tasks; they characterise the model itself, for example the topology and size of a neural network. Algorithm hyperparameters, on the other hand, in principle have no influence on the performance of the resulting model, but they do influence the speed and quality of the learning process; typical examples are the learning rate and the mini-batch size.
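
To make the distinction concrete, here is a minimal Python sketch; the dictionary keys are hypothetical names chosen for illustration, not part of any library:

    # Illustrative grouping; the keys are hypothetical, not a library API.
    model_hyperparameters = {
        "num_hidden_layers": 2,          # shapes the topology of the network
        "hidden_layer_sizes": [64, 32],  # shapes the size of the network
    }
    algorithm_hyperparameters = {
        "learning_rate": 1e-3,  # affects speed and quality of learning
        "mini_batch_size": 32,  # affects the training run, not model capacity
    }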

Different model-training algorithms require different hyperparameters, and some simple algorithms require none at all. Given a set of hyperparameters, the training algorithm learns the model parameters from the data. The time needed to train and test a model can depend on the choice of hyperparameters. A hyperparameter is usually of continuous or integer type, which can lead to mixed-type optimisation problems. The existence of some hyperparameters is conditional on the values of others; for example, the size of each hidden layer in a neural network may be conditional on the number of layers.
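
The following Python sketch illustrates such a conditional, mixed-type search space: a continuous learning rate, an integer number of layers, and per-layer sizes that only exist once the number of layers has been drawn. All ranges are arbitrary choices for illustration.

    import random

    def sample_configuration():
        config = {
            "learning_rate": 10 ** random.uniform(-4, -1),  # continuous type
            "num_layers": random.randint(1, 4),             # integer type
        }
        # Conditional hyperparameters: one size per layer actually present.
        config["layer_sizes"] = [random.choice([32, 64, 128])
                                 for _ in range(config["num_layers"])]
        return config

    print(sample_configuration())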

Hyperparameter optimisation is the search for optimal hyperparameters in machine learning. The hyperparameters are determined before the actual model training begins.

When is hyperparameter optimisation used?

Hyperparameter optimisation is used in automated machine learning and for deep neural networks. The available methods treat it as black-box function optimisation and include model-free approaches as well as Bayesian optimisation. These optimisations entail high computational requirements. Since every machine learning system has hyperparameters, the goal is to set them automatically so that performance is optimised.
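
As one model-free example, the sketch below uses scikit-learn's RandomizedSearchCV to sample configurations at random and keep the best one; the model and parameter ranges are arbitrary choices for illustration.

    from scipy.stats import randint
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import RandomizedSearchCV

    X, y = load_iris(return_X_y=True)
    # Distributions to sample hyperparameters from; ranges are illustrative.
    param_distributions = {
        "n_estimators": randint(50, 300),
        "max_depth": randint(2, 10),
    }
    search = RandomizedSearchCV(RandomForestClassifier(random_state=0),
                                param_distributions, n_iter=20, cv=5,
                                random_state=0)
    search.fit(X, y)
    print(search.best_params_, search.best_score_)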

What is hyperparameter optimisation?

Hyperparameters are adjustable parameters with which you control the model training process. In neural networks, for example, you set the number of hidden layers and the number of nodes in each layer. The performance of the model depends heavily on these hyperparameters.
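
In scikit-learn, for instance, both choices are combined in a single hyperparameter, hidden_layer_sizes; the following minimal sketch uses two hidden layers with 64 and 32 nodes, chosen purely for illustration.

    from sklearn.datasets import load_digits
    from sklearn.neural_network import MLPClassifier

    X, y = load_digits(return_X_y=True)
    # hidden_layer_sizes=(64, 32): two hidden layers with 64 and 32 nodes.
    clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300,
                        random_state=0)
    clf.fit(X, y)
    print(clf.score(X, y))  # training accuracy depends on these choices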

Hyperparameter optimisation is the process of searching for a suitable configuration: the configuration selected is the one that delivers the best performance. Such a process is practically always computationally intensive and is tedious to carry out manually, which is why it is usually automated.
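
One common automated form of this search is a grid search with cross-validation, sketched here with scikit-learn's GridSearchCV; the grid values are arbitrary.

    from sklearn.datasets import load_iris
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    # Every combination in this (illustrative) grid is trained and scored.
    param_grid = {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}
    search = GridSearchCV(SVC(), param_grid, cv=5)
    search.fit(X, y)
    print(search.best_params_)  # configuration with the best mean CV score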

What are the differences between parameters and hyperparameters?

A model parameter is a configuration variable that is internal to the model and whose value is estimated from the data. Such parameters are needed to make predictions, and their values define the capability of the model on your problem. They are estimated or learned from the data and are usually not set manually by the user. Model parameters are often stored as part of the learned model; they are the key to machine learning.

In contrast, a hyperparameter is a configuration that is external to the model and whose value cannot be estimated from the data. Hyperparameters are often used by the training process to estimate the model parameters.
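
The contrast can be seen directly in code. In the scikit-learn sketch below, alpha is a hyperparameter that is fixed before training, while coef_ and intercept_ are model parameters estimated from the (synthetic) data.

    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

    model = Ridge(alpha=1.0)  # hyperparameter: set externally, not learned
    model.fit(X, y)
    print(model.coef_, model.intercept_)  # parameters: estimated from the data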