In a neural network, what happens if all the weights are initialized with the same value?

1 Answer


In the simplest terms, if all the weights are initialized to the same value, every hidden unit in a layer receives the same inputs through identical weights and therefore produces exactly the same output. The forward pass still runs, but during backpropagation each of those units also receives exactly the same gradient, so after every update their weights remain identical.

In short, the network never breaks this symmetry: no matter how many hidden units a layer has, it behaves as if it had only one, and it cannot learn varied patterns from the data. What do you call a model that is too limited to capture the patterns in its data? Yes, underfitting.

Therefore, initializing all the weights with the same value leads to underfitting, which is why in practice weights are initialized to small random values.
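
To see the symmetry concretely, here is a minimal NumPy sketch of one forward and backward pass. The network shape, the constant 0.5 initialization, and the mean-squared-error loss are illustrative assumptions, not anything from the question itself:

```python
import numpy as np

# Tiny 2-layer network: 3 inputs -> 2 hidden units -> 1 output.
# Every weight starts at the same constant, so the two hidden
# units are perfectly symmetric.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))          # batch of 4 examples
y = rng.normal(size=(4, 1))          # regression targets

W1 = np.full((3, 2), 0.5)            # identical initial weights
W2 = np.full((2, 1), 0.5)

# Forward pass with tanh hidden activations.
h = np.tanh(x @ W1)                  # both hidden columns come out identical
y_hat = h @ W2
loss = np.mean((y_hat - y) ** 2)

# Backward pass (mean-squared-error gradients).
d_yhat = 2 * (y_hat - y) / len(x)
dW2 = h.T @ d_yhat
dh = d_yhat @ W2.T
dW1 = x.T @ (dh * (1 - h ** 2))      # tanh'(z) = 1 - tanh(z)^2

print(np.allclose(h[:, 0], h[:, 1]))      # True: identical activations
print(np.allclose(dW1[:, 0], dW1[:, 1]))  # True: identical gradients
```

Because both columns of dW1 are equal, a gradient step moves both hidden units by the same amount, so they stay identical for as long as you train: the hidden layer collapses to a single effective unit.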
