Q:
Why is zero initialization not a good weight initialization process?

1 Answer

If all of the weights in the network are initialized to zero, then every neuron in a given layer produces the same output and receives the same gradient during backpropagation, so all of them get identical weight updates.

As a result, the neurons in a layer remain exact copies of one another and can never learn distinct features; the network effectively cannot learn at all because there is no source of asymmetry between neurons. That is why we need to introduce randomness into the weight initialization process.
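A minimal NumPy sketch of this effect, using a hypothetical two-layer network (sigmoid hidden layer, MSE loss) that is not from the original answer: with zero initialization, every column of the first weight matrix stays identical after training, while random initialization breaks the symmetry.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))   # batch of 8 samples, 3 features
y = rng.normal(size=(8, 1))   # regression targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(W1, W2, steps=50, lr=0.5):
    """Full-batch gradient descent on MSE loss; returns updated weights."""
    for _ in range(steps):
        h = sigmoid(X @ W1)                 # hidden activations, (8, 4)
        y_hat = h @ W2                      # network output, (8, 1)
        d_out = 2.0 * (y_hat - y) / len(X)  # dL/dy_hat
        dW2 = h.T @ d_out                   # gradient for output weights
        d_h = (d_out @ W2.T) * h * (1 - h)  # backprop through sigmoid
        dW1 = X.T @ d_h                     # gradient for hidden weights
        W1 -= lr * dW1
        W2 -= lr * dW2
    return W1, W2

# Zero init: every hidden unit computes sigmoid(0) = 0.5, receives an
# identical gradient, and so stays an identical copy of its neighbors.
W1, W2 = train(np.zeros((3, 4)), np.zeros((4, 1)))
print(np.allclose(W1[:, 0:1], W1))   # True: all columns of W1 are equal

# Random init breaks the symmetry: gradients differ per unit,
# so the units can specialize into different features.
W1, W2 = train(rng.normal(scale=0.1, size=(3, 4)),
               rng.normal(scale=0.1, size=(4, 1)))
print(np.allclose(W1[:, 0:1], W1))   # False
```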
