
Keras Neural Network Accuracy Is Always 0 While Training

I'm making a simple classification algorithm with a Keras neural network. The goal is to take 3 weather data points and decide whether or not there's a wildfire. Here's an image of t

Solution 1:

EDIT: I realized that my earlier response was highly misleading, which was thankfully pointed out by @xdurch0 and @Timbus Calin. Here is an edited answer.

  1. Check that all your input values are valid. Are there any nan or inf values in your training data?

  2. Try using different activation functions. ReLU is good, but it is prone to what is known as the dying ReLU problem, where the neural network effectively learns nothing because no updates are made to its weights. One possibility is to use Leaky ReLU or PReLU.

  3. Try using gradient clipping, which is a technique used to tackle vanishing or exploding gradients (which is likely what is happening in your case). Keras lets you configure the clipnorm and clipvalue arguments on its optimizers.
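For point 1, a quick sanity check with numpy catches both nan and inf in one pass. The arrays below are hypothetical stand-ins for the question's three weather features and labels:

```python
import numpy as np

# Hypothetical stand-ins for the question's weather features and wildfire labels.
X_train = np.array([[25.0, 40.0, 12.0],
                    [30.0, np.nan, 8.0]])
y_train = np.array([0, 1])

# np.isfinite is False for both nan and inf, so one check covers both.
bad_rows = ~np.isfinite(X_train).all(axis=1)
print("rows with nan/inf:", np.where(bad_rows)[0])

# One common fix: drop the offending rows before training.
X_clean, y_clean = X_train[~bad_rows], y_train[~bad_rows]
```

Depending on how the bad values arose, imputing (e.g. with the column mean) may be preferable to dropping rows.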
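For point 2, Keras exposes these as the LeakyReLU and PReLU layers. A minimal numpy sketch shows the difference from plain ReLU that prevents units from dying:

```python
import numpy as np

def relu(x):
    # Hard zero for negative inputs: a unit stuck in the negative
    # region gets zero gradient and stops updating ("dies").
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Negative inputs keep a small slope instead of a hard zero,
    # so gradients can still flow through the unit.
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))        # negatives clamped to 0
print(leaky_relu(x))  # negatives scaled by alpha instead
```

PReLU is the same idea except that alpha is a learned parameter rather than a fixed constant.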
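For point 3, in Keras you would pass something like optimizer=keras.optimizers.Adam(clipnorm=1.0) when compiling the model. What clipnorm does under the hood can be sketched in numpy (the function name here is illustrative, not a Keras API):

```python
import numpy as np

def clip_by_norm(grad, clipnorm):
    # Rescale the gradient so its L2 norm never exceeds clipnorm,
    # mirroring the effect of Keras's clipnorm optimizer argument.
    norm = np.linalg.norm(grad)
    return grad if norm <= clipnorm else grad * (clipnorm / norm)

g = np.array([3.0, 4.0])     # L2 norm = 5, exceeds the threshold
print(clip_by_norm(g, 1.0))  # rescaled to unit norm, direction preserved
```

clipvalue is the cruder alternative: it clips each gradient component independently to [-clipvalue, clipvalue], which can change the gradient's direction, whereas clipnorm only shrinks its length.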

There are posts on SO that report similar problems, such as this one, which might also be of interest to you.
