Deepika

Job Interview Skills
English
2 years ago

How will you tackle an exploding gradient problem?

Abhishek Mishra

2 years ago

Exploding gradients can often be avoided by configuring the network carefully: use a small learning rate, scale the target variables, and choose a standard, well-behaved loss function. Another approach is gradient scaling or gradient clipping, which rescales the error gradient before it is propagated back through the network, so the resulting weight updates stay bounded.
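As a minimal NumPy sketch of the two clipping strategies mentioned above (the function names here are my own, chosen for illustration; deep-learning frameworks ship their own equivalents, e.g. norm-based and value-based clipping utilities):

```python
import numpy as np

def clip_by_norm(grad, max_norm=1.0):
    """Gradient clipping: rescale grad so its L2 norm never exceeds max_norm.

    The gradient's direction is preserved; only its magnitude is capped.
    """
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

def clip_by_value(grad, clip_value=0.5):
    """Element-wise clipping: limit each component to [-clip_value, clip_value].

    Simpler, but can change the gradient's direction.
    """
    return np.clip(grad, -clip_value, clip_value)

# A gradient that has "exploded" during backpropagation:
g = np.array([30.0, -40.0])            # L2 norm = 50
clipped = clip_by_norm(g, max_norm=1.0)
print(clipped)                         # [0.6, -0.8], norm capped at 1.0
```

In practice, norm-based clipping is usually preferred because it keeps the update pointing in the same direction as the true gradient, whereas value-based clipping can distort it.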
