How do I take a minimum predicted-mask area into account in the Dice metric?
At prediction time, the network sometimes produces false-positive pixels with values in the 0.5-0.75 range; a falsely predicted mask typically contains 100 to 500 such pixels. I want to add a filter to the Dice metric that keeps only pixels with a value above 0.75 in the predicted mask, and if fewer than 800 such pixels remain, treats the mask as empty (all zeros); otherwise the mask is used as is. After training, this works in manual testing on images, using if-statements over per-pixel values and the minimum area (800 pixels minimum), but how do I build it into the metric so that it already works during training?
Loss and metric:
def dice_loss(y_true, y_pred):
    smooth = 1e-6
    y_true_f = K.flatten(K.cast(y_true, 'float32'))
    y_pred_f = K.flatten(K.cast(y_pred, 'float32'))
    intersection = y_true_f * y_pred_f
    score = 1. - (2. * K.sum(intersection) + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)
    return score

def dice_metric(y_true, y_pred):
    return 1. - dice_loss(y_true, y_pred)

# My attempted filter (currently unreachable, placed after the return):
# preds_f = K.cast(K.greater(K.flatten(preds), 0.75), 'float32')
# K.cast(K.greater(K.sum(preds), 800.0), 'float32')
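One way to fold both filters into the metric is to threshold the predicted mask and then zero it out entirely when too few pixels survive, using only tensor ops so it runs inside the graph. A minimal sketch, assuming TensorFlow's Keras backend; the name `dice_metric_filtered` and the `threshold`/`min_area` parameters are my own. Note that thresholding is non-differentiable, so this is usable as a metric but not as a training loss, and as written it counts pixels over the whole flattened batch rather than per sample:

```python
import tensorflow as tf
from tensorflow.keras import backend as K

def dice_metric_filtered(y_true, y_pred, threshold=0.75, min_area=800.0):
    smooth = 1e-6
    y_true_f = K.flatten(K.cast(y_true, 'float32'))
    # Keep only confident pixels (value above the threshold).
    y_pred_f = K.cast(K.greater(K.flatten(y_pred), threshold), 'float32')
    # If fewer than min_area confident pixels remain, zero the whole mask.
    keep = K.cast(K.greater(K.sum(y_pred_f), min_area), 'float32')
    y_pred_f = y_pred_f * keep
    intersection = K.sum(y_true_f * y_pred_f)
    return (2. * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)
```

To apply the 800-pixel rule per mask rather than per batch, the sums would instead be taken over each sample's own axes (e.g. `K.sum(..., axis=[1, 2])`) before averaging.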