2
votes

I'm using the TensorFlow Object Detection API for detection and localization of a single object class in images. For this purpose I use the pre-trained faster_rcnn_resnet50_coco_2018_01_28 model.

I want to detect under- or overfitting after training the model. I can see the training loss, but after evaluation TensorBoard only shows mAP and precision metrics, and no loss.

Is it possible to plot the validation loss in TensorBoard too?


3 Answers

7
votes

There is a validation loss. Assuming you're using the latest API, the curve under "loss" is the validation loss, while "loss_1"/"loss_2" is the training loss.
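
If you're not sure which curve is which, one way to check is to list the scalar tags written to each event directory. This is just a minimal sketch; the path is a placeholder for your own eval (or train) event directory:

from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# Point this at the directory containing the events.out.tfevents.* files
# written by the evaluation (or training) job.
acc = EventAccumulator("path/to/model_dir/eval_0")
acc.Reload()                   # parse the event files on disk
print(acc.Tags()["scalars"])   # scalar tags such as 'loss' show up here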

4
votes

To see the validation curve, you should edit faster_rcnn_resnet50_coco.config:

1- Comment out the max_evals line.
2- Set eval_interval_secs: 60.
3- num_examples should be equal to or less than the number of examples in your val.record file.

eval_config: {
  num_examples: 600
  eval_interval_secs: 60
  # Note: The below line limits the evaluation process to 10 evaluations.
  # Remove the below line to evaluate indefinitely.
  # max_evals: 10
}
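
With these settings the evaluation job re-runs every 60 seconds and keeps writing validation points. As a rough sketch (assuming the eval.py script that ships with the Object Detection API, which lives at object_detection/eval.py or object_detection/legacy/eval.py depending on your version; the paths are placeholders for your own config, training and evaluation directories), it can be launched alongside training like this:

python object_detection/legacy/eval.py --logtostderr --pipeline_config_path=path/to/faster_rcnn_resnet50_coco.config --checkpoint_dir=path/to/train_dir --eval_dir=path/to/eval_dir

Pointing TensorBoard at a directory that contains both train_dir and eval_dir then shows the training and validation curves together.
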
1
votes

Using model_main.py for training gives two loss curves in TensorBoard; they are supposed to be the training and validation losses.

You can use the following command at the command prompt:

python object_detection/model_main.py --num_eval_steps=10 --num_train_steps=50000 --alsologtostderr --pipeline_config_path=C:/DroneMaskRCNN/DroneMaskRCNN1/mask_rcnn_inception_v2_coco.config --model_dir=C:/DroneMaskRCNN/DroneMaskRCNN1/CP
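
Once training and evaluation are running, you can watch both curves by pointing TensorBoard at the model directory; the path below simply reuses the --model_dir value from the example command:

tensorboard --logdir=C:/DroneMaskRCNN/DroneMaskRCNN1/CP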