How will you tackle an exploding gradient problem?
Exploding gradients can often be avoided by configuring the model carefully: using a small learning rate, scaling (normalizing) the target variables, and choosing a standard loss function. Another approach is gradient scaling or gradient clipping, which rescales the error gradients during backpropagation whenever their norm or individual values exceed a chosen threshold, so that the resulting weight updates stay bounded.
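As a minimal sketch of the clipping approach, the snippet below applies PyTorch's `torch.nn.utils.clip_grad_norm_` between the backward pass and the optimizer step; the model, dummy batch, and `max_norm` value are illustrative assumptions rather than part of any specific recipe.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)  # placeholder model (assumption)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

x, y = torch.randn(32, 10), torch.randn(32, 1)  # dummy batch (assumption)

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()

# Rescale the gradients in place so their global norm is at most 1.0;
# this caps the size of the weight update even if the raw gradients explode.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

optimizer.step()
```

Clipping by norm (as above) preserves the direction of the gradient while shrinking its magnitude; clipping by value (e.g., `torch.nn.utils.clip_grad_value_`) instead caps each gradient component independently.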