What do you mean by hyperparameters in the context of deep learning?

1 Answer

Hyperparameters are variables that determine the network topology (for example, the number of hidden units) and how the network is trained (for example, the learning rate). They are set before training the model, that is, before the weights and biases are optimized.

The following are some examples of hyperparameters:

Number of hidden layers and units: With regularisation techniques, increasing the number of hidden units within a layer can improve accuracy, while reducing the number of units too far may cause underfitting.

Learning rate: The learning rate controls how much the network's parameters are updated at each training step. A low learning rate slows the learning process but eventually converges, while a high learning rate speeds up learning but may fail to converge. A decaying learning rate is usually preferred; both kinds of hyperparameters appear in the sketch below.
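
To make the distinction concrete, here is a minimal sketch (assuming TensorFlow/Keras) of how these hyperparameters are fixed before training; the layer sizes, input shape, and rates are purely illustrative.

```python
import tensorflow as tf

# Hyperparameters: chosen before training, not learned from data
num_hidden_layers = 2          # network topology
hidden_units = 64              # units per hidden layer
initial_learning_rate = 1e-3   # how fast parameters are updated

# A decaying learning-rate schedule (often preferred, as noted above)
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate, decay_steps=1000, decay_rate=0.9)

# Build the network from the topology hyperparameters
model = tf.keras.Sequential()
model.add(tf.keras.Input(shape=(20,)))  # 20 input features (illustrative)
for _ in range(num_hidden_layers):
    model.add(tf.keras.layers.Dense(hidden_units, activation="relu"))
model.add(tf.keras.layers.Dense(1, activation="sigmoid"))

# The learning rate enters through the optimizer; the weights and biases
# are the parameters that training itself optimizes.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr_schedule),
              loss="binary_crossentropy",
              metrics=["accuracy"])
```

Changing `num_hidden_layers`, `hidden_units`, or the learning-rate schedule changes how the model is built and trained, while `model.fit` only ever adjusts the weights and biases.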
