What Is Data Normalization, and Why Do We Need It?

2 Answers


The process of standardizing and rescaling data is called "data normalization." It is a pre-processing step that also helps eliminate data redundancy: the same information often arrives in different formats. In these cases, you rescale the values to fit into a particular range, which leads to better convergence during training.
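For illustration, here is a minimal NumPy sketch of rescaling each feature to the [0, 1] range (min-max normalization); the toy feature matrix is made up for the example:

import numpy as np

# Toy feature matrix: rows are samples, columns are features on very
# different scales (e.g. age in years vs. income in dollars).
X = np.array([[25.0,  40_000.0],
              [32.0,  85_000.0],
              [47.0, 120_000.0]])

# Min-max normalization: rescale each feature (column) to [0, 1].
X_min = X.min(axis=0)
X_max = X.max(axis=0)
X_scaled = (X - X_min) / (X_max - X_min)

print(X_scaled)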

Data normalization is a very important preprocessing step, used to rescale values into a specific range to ensure better convergence during backpropagation. In general, it boils down to subtracting each feature's mean and dividing by its standard deviation. If we don't do this, the features with high magnitude will dominate the cost function: a 1% change in a high-magnitude feature is a large absolute change, while the same relative change in a small-magnitude feature is insignificant. Normalization makes all features contribute to the cost function on an equal footing.
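A minimal NumPy sketch of this standardization (z-scoring); the feature matrix is a made-up example, and a small epsilon would be needed if a feature's standard deviation were zero:

import numpy as np

# Toy feature matrix: each column is one feature.
X = np.array([[25.0,  40_000.0],
              [32.0,  85_000.0],
              [47.0, 120_000.0]])

# Standardization: subtract each feature's mean and divide by its
# standard deviation so every feature has mean 0 and unit variance.
mean = X.mean(axis=0)
std = X.std(axis=0)
X_standardized = (X - mean) / std

print(X_standardized)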
