0 votes

I have just started learning CNNs in TensorFlow. However, when I train the model, the loss and accuracy don't change.

[screenshot: training log, loss and accuracy constant across epochs]

I am using images of size 128x128x3, normalized to [0, 1]. Here is the compile call I am using:

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.000001), loss='binary_crossentropy', metrics=['accuracy'])

And here is the summary of my model:

[screenshot: model.summary() output]

I tried the following things, but the loss and accuracy always stay at the same values:

  • Changing the learning rate, from 0.00000001 up to 10
  • Changing the convolution kernel size (I tried 5x5 and 3x3)
  • Adding another fully connected layer and another Conv layer

Update

The layers' weights did not change after fitting the model; they are identical to the initial weights.
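One quick way to verify whether any weights move during training is a small Keras callback. The sketch below uses a made-up stand-in model (not the actual network from the question) purely to demonstrate the check:

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in model, NOT the asker's network; it only exists
# so the callback below has something to inspect.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(128, 128, 3)),
    tf.keras.layers.Conv2D(8, 3, activation='relu', name='conv1'),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

class WeightChangeLogger(tf.keras.callbacks.Callback):
    """Prints the mean absolute change of one layer's kernel after each epoch."""
    def on_train_begin(self, logs=None):
        self.prev = self.model.get_layer('conv1').get_weights()[0].copy()
    def on_epoch_end(self, epoch, logs=None):
        current = self.model.get_layer('conv1').get_weights()[0]
        print(f"epoch {epoch}: mean |delta w| = {np.abs(current - self.prev).mean():.2e}")
        self.prev = current.copy()

# Dummy data, only to exercise the callback.
x = np.random.rand(16, 128, 128, 3).astype('float32')
y = np.random.randint(0, 2, size=(16, 1)).astype('float32')
model.fit(x, y, epochs=2, verbose=0, callbacks=[WeightChangeLogger()])
```

If the printed deltas are exactly zero every epoch, no gradient is reaching that layer.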

Is your model updating the weights after every epoch? I would recommend checking the layer weights after every epoch. – Anurag Reddy

I don't know how to check the weights in every epoch, but yes, before training and after the 5 epochs I have the same weights. What is the problem? – Wassim Bouatay

What activation are you using at the output node? And could you upload your code so that I can see what is wrong? It seems weird that the model won't train. – Anurag Reddy

I am using 'softmax'. Here is a link to my project: colab.research.google.com/drive/… – Wassim Bouatay

1 Answer

1 vote

You could try this:

    model.compile(optimizer='adam',
                  loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
                  metrics=['accuracy'])

Also, remove the softmax activation from the last layer: a binary classification problem does not need a softmax. With a single output node, softmax always returns 1, because the one value is normalized against itself, so the prediction is constant and the network cannot train. This link might help you understand softmax.

Additionally, you can use a sigmoid activation at the final node instead. That squashes the output into the range 0 to 1, and the network weights won't blow up because of a very high loss.
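To make the fix concrete, here is a minimal sketch with a toy architecture (made up here, not the asker's exact model). It first demonstrates why softmax over a single node is constant, then shows the raw-logit option with `from_logits=True`; the sigmoid alternative is noted in a comment:

```python
import numpy as np
import tensorflow as tf

# softmax over a single value is always 1.0, which is why a one-node
# softmax output cannot learn anything:
print(tf.nn.softmax([[2.5]]).numpy())  # [[1.]]

# Option 1: output a raw logit and tell the loss so (toy architecture).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(128, 128, 3)),
    tf.keras.layers.Conv2D(16, 3, activation='relu'),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1),  # no activation: raw logit
])
model.compile(optimizer='adam',
              loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
              metrics=['accuracy'])

# Option 2 (equivalent): Dense(1, activation='sigmoid') with
# loss='binary_crossentropy' (the default from_logits=False).

# Dummy data just to confirm the model trains without error.
x = np.random.rand(8, 128, 128, 3).astype('float32')
y = np.random.randint(0, 2, size=(8, 1)).astype('float32')
model.fit(x, y, epochs=1, verbose=0)
```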