loss

How to implement an adaptive loss in Keras?

Submitted by 家住魔仙堡 on 2021-02-10 12:01:02
Question: I am trying to use Keras to implement the work done in "A General and Adaptive Robust Loss Function". The author provides TensorFlow code that handles the hard details; I am just trying to use his prebuilt function in Keras. His custom loss function learns a parameter 'alpha' that controls the shape of the loss function, and I would like to track 'alpha' in addition to the loss during training. I am somewhat familiar with Keras custom loss functions and wrappers, but I am not entirely sure …
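
Below is a minimal sketch of one way to do this in tf.keras, not the paper author's implementation: a layer that owns a trainable 'alpha' weight, contributes a placeholder alpha-dependent penalty via add_loss, and a callback that prints alpha each epoch. The layer name 'adaptive_loss', the softplus/log1p penalty, and the two-input wiring are illustrative assumptions only; the paper's actual robust loss would replace the placeholder.

```python
import tensorflow as tf

# Sketch only: a trainable shape parameter `alpha` owned by a layer,
# with the loss built from it via add_loss.
class AdaptiveLossLayer(tf.keras.layers.Layer):
    def build(self, input_shape):
        self.alpha = self.add_weight(name="alpha", shape=(),
                                     initializer="ones", trainable=True)

    def call(self, inputs):
        y_true, y_pred = inputs
        err = y_true - y_pred
        # Placeholder penalty that depends on alpha; substitute the paper's
        # general robust loss here.
        self.add_loss(tf.reduce_mean(
            tf.math.log1p(tf.square(err)) * tf.nn.softplus(self.alpha)))
        return y_pred

class AlphaLogger(tf.keras.callbacks.Callback):
    """Print the current value of alpha at the end of every epoch."""
    def on_epoch_end(self, epoch, logs=None):
        alpha = self.model.get_layer("adaptive_loss").alpha.numpy()
        print(f" epoch {epoch}: alpha = {alpha:.4f}")

# Hypothetical usage: wrap an existing regression head.
x_in = tf.keras.Input(shape=(16,))
y_in = tf.keras.Input(shape=(1,))              # targets fed as a second input
y_hat = tf.keras.layers.Dense(1)(x_in)
out = AdaptiveLossLayer(name="adaptive_loss")([y_in, y_hat])
model = tf.keras.Model([x_in, y_in], out)
model.compile(optimizer="adam")                # loss comes from add_loss
# model.fit([x_train, y_train], epochs=5, callbacks=[AlphaLogger()])
```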

Plot validation loss in Tensorflow Object Detection API

Submitted by 做~自己de王妃 on 2021-02-08 03:27:47
Question: I'm using the Tensorflow Object Detection API for detection and localization of a single class of object in images. For this I use the pre-trained faster_rcnn_resnet50_coco_2018_01_28 model. I want to detect under- or overfitting after training the model. I can see the training loss, but after evaluation TensorBoard only shows mAP and precision metrics and no loss. Is it possible to plot a validation loss on TensorBoard too? Answer 1: There is a validation loss. Assuming you're using the latest API, the curve …
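
One low-tech way to confirm whether the evaluation job actually wrote any loss scalars is to read its event files directly with TensorBoard's EventAccumulator. The directory name training/eval_0 and the tag Loss/total_loss below are assumptions; substitute whatever your eval job writes.

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# Hypothetical path: the directory where the Object Detection API's
# evaluation job writes its event files (often <model_dir>/eval_0 or eval/).
eval_dir = "training/eval_0"

acc = EventAccumulator(eval_dir)
acc.Reload()                          # parse the event files

# List every scalar tag that was logged; validation losses, if recorded,
# appear alongside the mAP/precision tags.
for tag in acc.Tags()["scalars"]:
    print(tag)

# If a loss tag exists, read its values (wall time, step, value), e.g.:
# events = acc.Scalars("Loss/total_loss")
```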

TensorFlow2-tf.keras: Loss and model weights suddenly become 'nan' when training MTCNN PNet

Submitted by 心已入冬 on 2021-01-29 12:59:14
Question: I was trying to use TFRecords to train the PNet of MTCNN. At first the loss decreased smoothly for the first few epochs, then it became 'nan' and so did the model weights. Below are my model structure and training results:

def pnet_train1(train_with_landmark = False):
    X = Input(shape = (12, 12, 3), name = 'Pnet_input')
    M = Conv2D(10, 3, strides = 1, padding = 'valid', kernel_initializer = glorot_normal, kernel_regularizer = l2(0.00001), name = 'Pnet_conv1')(X)
    M = PReLU(shared_axes = …
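
Independent of the poster's exact model, a loss that suddenly turns into NaN is usually chased down with the same handful of tools: gradient clipping, a smaller learning rate, a numerically safe loss, and a callback that stops training the moment NaN appears. The sketch below is a generic checklist along those lines, not a diagnosis of this particular network; the names optimizer, safe_log_loss, train_ds and model are placeholders.

```python
import tensorflow as tf

# Generic mitigations for a loss that becomes NaN mid-training
# (not specific to MTCNN; treat this as a checklist, not the poster's fix).

# 1. Clip gradients and lower the learning rate.
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-4, clipnorm=1.0)

# 2. If a custom loss takes the log of a probability, keep it away from 0/1.
def safe_log_loss(y_true, y_pred):
    eps = 1e-7
    y_pred = tf.clip_by_value(y_pred, eps, 1.0 - eps)
    return -tf.reduce_mean(
        y_true * tf.math.log(y_pred) + (1.0 - y_true) * tf.math.log(1.0 - y_pred))

# 3. Stop training as soon as the loss becomes NaN so the weights stay usable.
nan_guard = tf.keras.callbacks.TerminateOnNaN()

# Hypothetical usage with an existing `model` and dataset:
# model.compile(optimizer=optimizer, loss=safe_log_loss)
# model.fit(train_ds, epochs=10, callbacks=[nan_guard])
```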

tensorflow loss & sample weights

Submitted by 我们两清 on 2020-12-15 06:26:07
Question: Two simple questions about Tensorflow's losses and sample weights. Imagine I have a shallow fully convolutional NN with the following model: Image(16x16x1) -> Conv2(16x16x10), so the output is a vector o[1][1][10] with 10 neurons. With a batch size of 32, the final output tensor is [32][1][1][10] (I carefully checked all the dimensions myself). Now the questions: I have experience writing in C++ and understand backpropagation, so I don't understand why, for example, the MSE loss in TF uses a reduction over the last …
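
The snippet below is a small illustration (with made-up random tensors) of the behaviour the question refers to: built-in tf.keras losses first average over the last axis, producing one value per sample, and sample_weight then scales those per-sample values before the final reduction.

```python
import tensorflow as tf

# Per-sample losses: the last axis (the 10 output neurons) is reduced first.
y_true = tf.random.normal([32, 1, 1, 10])
y_pred = tf.random.normal([32, 1, 1, 10])

mse_none = tf.keras.losses.MeanSquaredError(
    reduction=tf.keras.losses.Reduction.NONE)
per_sample = mse_none(y_true, y_pred)
print(per_sample.shape)   # (32, 1, 1): the axis of 10 has been averaged away

# Flatten to (batch, features) to see sample weighting clearly.
yt = tf.reshape(y_true, [32, 10])
yp = tf.reshape(y_pred, [32, 10])
w = tf.ones([32])         # one weight per sample

mse = tf.keras.losses.MeanSquaredError()
print(mse(yt, yp, sample_weight=w))   # scalar: weighted mean of per-sample MSEs
```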

Keras predict gives different error than evaluate, loss different from metrics

Submitted by こ雲淡風輕ζ on 2020-12-11 15:53:51
Question: I have the following problem: I have an autoencoder in Keras and train it for a few epochs. The training overview shows a validation MAE of 0.0422 and an MSE of 0.0024. However, if I then call network.predict and manually calculate the validation errors, I get 0.035 and 0.0024. One would assume that my manual calculation of the MAE is simply incorrect, but the weird thing is that if I use an identity model (one that simply outputs what you feed in) and use that to evaluate the predicted values, the …
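
A simple way to put the two numbers side by side is to compute the metrics by hand from model.predict and compare them with model.evaluate on exactly the same data. The sketch below assumes the network is trained as an autoencoder (inputs serve as targets); network and x_val are placeholder names, and return_dict requires a reasonably recent TF 2.x.

```python
import numpy as np
import tensorflow as tf

def compare(network, x_val):
    """Compare Keras-reported metrics with the same metrics computed by hand."""
    # Metrics as Keras reports them (autoencoder: targets == inputs).
    results = network.evaluate(x_val, x_val, verbose=0, return_dict=True)

    # The same metrics computed from the raw predictions.
    preds = network.predict(x_val, verbose=0)
    manual_mae = np.mean(np.abs(x_val - preds))
    manual_mse = np.mean(np.square(x_val - preds))

    print("evaluate :", results)
    print("manual   :", {"mae": manual_mae, "mse": manual_mse})
```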
