Keras Neural Network Accuracy Is Always 0 While Training
Solution 1:
EDIT: I realized that my earlier response was highly misleading, which was thankfully pointed out by @xdurch0 and @Timbus Calin. Here is an edited answer.
- Check that all your input values are valid. Are there any `nan` or `inf` values in your training data? (A quick check is sketched after this list.)
- Try using different activation functions. ReLU is good, but it is prone to what is known as the dying ReLU problem, where units that only ever output zero receive no gradient and so their weights stop updating. One possibility is to use Leaky ReLU or PReLU instead (see the sketch after this list).
- Try using gradient clipping, a technique for tackling exploding gradients (which may well be what is happening in your case). Keras lets you set `clipnorm` or `clipvalue` on any optimizer (see the example after this list).
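As a quick sanity check for the first point, you can count non-finite values with NumPy. A minimal sketch, assuming your data lives in NumPy arrays named `X_train` and `y_train` (the random arrays here are just stand-ins):

```python
import numpy as np

# Stand-in arrays; replace these with your actual training data.
X_train = np.random.rand(100, 10)
y_train = np.random.rand(100)

# np.isfinite is False for both nan and inf, so one pass covers both cases.
for name, arr in [("X_train", X_train), ("y_train", y_train)]:
    bad = arr.size - np.count_nonzero(np.isfinite(arr))
    print(f"{name}: {bad} non-finite values")
```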
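For the second point, here is a minimal sketch of swapping in leaky activations; the layer sizes, input shape, and regression loss are assumptions, not details from the question:

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(10,)),  # assumed input shape
    layers.Dense(64),
    layers.LeakyReLU(),  # keeps a small gradient for negative inputs, so units cannot fully die
    layers.Dense(64),
    layers.PReLU(),      # like Leaky ReLU, but the negative slope is learned during training
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```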
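And for the third point, both clipping options are plain keyword arguments on Keras optimizers. A minimal sketch, where the thresholds 1.0 and 0.5 are common starting values rather than prescriptions:

```python
from tensorflow import keras
from tensorflow.keras import layers

# clipnorm rescales each gradient tensor so its L2 norm never exceeds the threshold;
# clipvalue would instead cap every gradient element at +/- the given value.
opt = keras.optimizers.Adam(clipnorm=1.0)
# opt = keras.optimizers.Adam(clipvalue=0.5)

# Tiny placeholder model, just to show the optimizer being used.
model = keras.Sequential([keras.Input(shape=(10,)), layers.Dense(1)])
model.compile(optimizer=opt, loss="mse")
```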
There are posts on SO that report similar problems, such as this one, which might also be of interest to you.