What do you mean by hyperparameters in the context of deep learning?

1 Answer


Hyperparameters are variables that determine the network topology (for example, the number of hidden units) and how the network is trained (e.g., the learning rate). They are set before training the model, that is, before optimizing the weights and biases.
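As a rough illustration, here is a minimal sketch (assuming PyTorch; the values for hidden_units, learning_rate, and epochs are purely illustrative) of hyperparameters being fixed before training starts: the topology hyperparameter shapes the model itself, while the training hyperparameter configures the optimizer.

import torch
import torch.nn as nn

# Hyperparameters: chosen before the weights and biases are optimized.
hidden_units = 128     # network topology
learning_rate = 1e-3   # how the network is trained
epochs = 10            # how long the network is trained

# The topology hyperparameter determines the model's structure.
model = nn.Sequential(
    nn.Linear(20, hidden_units),
    nn.ReLU(),
    nn.Linear(hidden_units, 1),
)

# The training hyperparameter configures the optimizer, not the model.
optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate)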

The following are some examples of hyperparameters:

Number of hidden layers and units: Combined with regularization techniques, a large number of hidden units within a layer can boost accuracy, while using too few units may lead to underfitting.

Learning rate: The learning rate controls how much a network's parameters are updated at each step. A low learning rate slows down training but eventually converges, while a high learning rate speeds up training but may fail to converge. A learning rate that decays over the course of training is usually preferred (see the sketch below).
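One common way to get a decaying learning rate is a scheduler. The sketch below is a minimal example assuming PyTorch, with arbitrary step_size and gamma values: StepLR halves the learning rate every 5 epochs while a toy regression model trains on random data.

import torch
import torch.nn as nn

model = nn.Linear(20, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Multiply the learning rate by gamma every step_size epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.5)

x, y = torch.randn(64, 20), torch.randn(64, 1)
loss_fn = nn.MSELoss()

for epoch in range(20):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    scheduler.step()  # decay the learning rate after each epoch
    print(epoch, scheduler.get_last_lr())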
