What are the issues faced while training in Recurrent Networks?

1 Answer


A Recurrent Neural Network is trained with the backpropagation algorithm, but the algorithm is applied at every timestep of the unrolled network. This variant is known as Back-propagation Through Time (BPTT).
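To make the "applied at every timestep" point concrete, here is a minimal NumPy sketch of BPTT for a vanilla tanh RNN. The variable names (W_xh, W_hh) and the squared-error loss at the final timestep are illustrative assumptions, not taken from any particular framework:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_in, n_hid = 10, 3, 5            # sequence length, input size, hidden size
W_xh = rng.normal(0, 0.1, (n_hid, n_in))
W_hh = rng.normal(0, 0.1, (n_hid, n_hid))
xs = rng.normal(size=(T, n_in))
target = rng.normal(size=n_hid)      # assumed squared-error target at time T

# Forward pass: unroll the network over all T timesteps.
hs = [np.zeros(n_hid)]
for t in range(T):
    hs.append(np.tanh(W_xh @ xs[t] + W_hh @ hs[-1]))

# Backward pass: the *same* recurrent weights receive a gradient
# contribution at every timestep.
dW_hh = np.zeros_like(W_hh)
dh = 2 * (hs[-1] - target)           # dLoss/dh_T
for t in reversed(range(T)):
    dz = dh * (1 - hs[t + 1] ** 2)   # back through tanh
    dW_hh += np.outer(dz, hs[t])     # contribution at timestep t
    dh = W_hh.T @ dz                 # propagate to the previous timestep
```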

There are two significant issues with Back-propagation Through Time:

1. Vanishing Gradient

When we perform backpropagation, the gradients tend to get smaller and smaller as we move backward through the network. As a result, the neurons in the earlier layers learn very slowly compared with the neurons in the later layers. The earlier layers are the most valuable because they are responsible for learning and detecting the simple patterns that serve as the building blocks of the network.

If they produce improper or inaccurate results, the later layers and the network as a whole cannot be expected to perform well. The training procedure takes a long time, and the prediction accuracy of the model decreases.
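The shrinkage is easy to reproduce in a toy setting: each backward step through time multiplies the gradient by the transpose of the recurrent Jacobian, so if the recurrent matrix has a spectral radius below 1 (an illustrative assumption here), the gradient norm decays exponentially with the number of timesteps:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Random matrix scaled so its spectral radius is roughly 0.9 (< 1).
W = rng.normal(0, 1.0 / np.sqrt(n), (n, n)) * 0.9
grad = rng.normal(size=n)

for t in [1, 10, 50, 100]:
    g = grad.copy()
    for _ in range(t):
        g = W.T @ g                  # one backward step through time
    print(f"after {t:3d} steps, gradient norm = {np.linalg.norm(g):.2e}")
```

The printed norms fall by orders of magnitude as the number of steps grows, which is exactly the slow learning of early timesteps described above.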

2. Exploding Gradient

Exploding gradients occur when large error gradients accumulate, resulting in very large updates to the neural network's weights during training.

Gradient descent works best when updates are small and controlled. When gradient magnitudes accumulate, the network becomes unstable, which can lead to poor predictions or even a model that learns nothing useful.
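The same toy setup, with the spectral radius pushed slightly above 1, shows the explosion. The clipping function at the end is a common remedy (rescaling the gradient whenever its norm exceeds a threshold), sketched here as an assumption rather than something prescribed by the answer above:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Random matrix scaled so its spectral radius is roughly 1.1 (> 1).
W = rng.normal(0, 1.0 / np.sqrt(n), (n, n)) * 1.1
g = rng.normal(size=n)

for _ in range(100):
    g = W.T @ g                      # gradient grows exponentially
print(f"unclipped norm after 100 steps: {np.linalg.norm(g):.2e}")

def clip_by_norm(g, max_norm=5.0):
    """Rescale g so its L2 norm never exceeds max_norm."""
    norm = np.linalg.norm(g)
    return g * (max_norm / norm) if norm > max_norm else g

print(f"clipped norm: {np.linalg.norm(clip_by_norm(g)):.2e}")
```

Deep learning frameworks ship this remedy built in; for example, PyTorch exposes it as torch.nn.utils.clip_grad_norm_.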
