Q:
Why should we use Batch Normalization?

Once the interviewer has covered the fundamentals of deep learning architectures, they will typically move on to the key topic of improving your model's performance.

Batch Normalization is one of the techniques used to speed up and stabilize the training of a deep learning model. Just as normalizing the input features helps a logistic regression model, we can normalize the activations of the hidden layers in a deep network as well:

[Figure: Batch Normalization applied to the hidden layers of a deep network]

We essentially normalize the hidden-layer values a[1] and a[2] here; in practice this is usually done on the pre-activation values z[1] and z[2]. That is, we normalize the inputs to each layer's activation function and then apply the activation function to the normalized values.
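As a rough illustration, here is a minimal NumPy sketch of the Batch Normalization forward pass for one hidden layer (the function and variable names are ours for illustration, not from any specific framework): the pre-activations z[1] are normalized per unit over the mini-batch, rescaled by the learnable parameters gamma and beta, and only then passed through the activation function.

```python
import numpy as np

def batch_norm_forward(z, gamma, beta, eps=1e-5):
    """Normalize a batch of pre-activations z, then scale and shift.

    z:     shape (batch_size, n_units) -- pre-activation values of one layer
    gamma: shape (n_units,)            -- learnable scale parameter
    beta:  shape (n_units,)            -- learnable shift parameter
    """
    mu = z.mean(axis=0)                        # per-unit mean over the mini-batch
    var = z.var(axis=0)                        # per-unit variance over the mini-batch
    z_hat = (z - mu) / np.sqrt(var + eps)      # normalized pre-activations
    return gamma * z_hat + beta                # scaled and shifted output

# Toy usage: normalize the pre-activations z[1] of a hidden layer, then apply ReLU.
rng = np.random.default_rng(0)
z1 = rng.normal(loc=3.0, scale=2.0, size=(32, 4))  # hypothetical z[1] for a batch of 32
gamma = np.ones(4)
beta = np.zeros(4)
a1 = np.maximum(0.0, batch_norm_forward(z1, gamma, beta))  # activation after normalization
print(a1.mean(axis=0), a1.std(axis=0))
```

Because each unit's inputs are kept on a comparable scale from batch to batch, the later layers see a more stable distribution, which is what allows higher learning rates and faster training.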

Here is an article that explains Batch Normalization and other techniques for improving Neural Networks: Neural Networks – Hyperparameter Tuning, Regularization & Optimization.

 
