I'm interested in taking advantage of some partially labeled data that I have in a deep learning task. I'm using a fully convolutional approach, not sampling patches from
Try label smoothing, as described in section 7.5.1 of the Deep Learning book:
We can assume that for some small constant eps, the training set label y is correct with probability 1 - eps, and otherwise any of the other possible labels might be correct. Label smoothing regularizes a model based on a softmax with k output values by replacing the hard 0 and 1 classification targets with targets of eps / k and 1 - (k - 1) / k * eps, respectively.
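As a minimal sketch of what those targets look like in practice (assuming a NumPy-based pipeline; the function name and the eps value are my own choices, not from the book):

import numpy as np

def smooth_labels(hard_targets, eps=0.1):
    """Turn hard one-hot targets into smoothed targets.

    hard_targets: array of shape (..., k) with 0/1 one-hot rows.
    The labeled class gets 1 - (k - 1) / k * eps and every other
    class gets eps / k, matching the formulas quoted above.
    """
    k = hard_targets.shape[-1]
    return hard_targets * (1.0 - eps) + eps / k

# Example: 3 classes, second class is the annotated one.
one_hot = np.array([0.0, 1.0, 0.0])
print(smooth_labels(one_hot, eps=0.1))  # [0.0333..., 0.9333..., 0.0333...]

You then train against these soft targets with the usual cross-entropy loss instead of the hard 0/1 labels.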
See my question about implementing label smoothing in Pandas.
Otherwise, if you know for sure that some areas are negative, others are positive, and some are uncertain, you can introduce a third "uncertain" class. I have worked with data sets that contained such an uncertain class, which corresponded to samples that could belong to any of the available classes.
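If it helps, here is a rough sketch of how such a mask could be built for a segmentation-style setup (NumPy-based; the label values and the add_uncertain_class helper are hypothetical, not from any particular library):

import numpy as np

# Hypothetical per-pixel label encoding:
# 0 = negative, 1 = positive, 2 = uncertain (the extra class).
NEGATIVE, POSITIVE, UNCERTAIN = 0, 1, 2

def add_uncertain_class(mask, labeled_region):
    """Mark every pixel outside the trusted annotation as 'uncertain'.

    mask: integer array of per-pixel labels (0/1) in the annotated areas.
    labeled_region: boolean array of the same shape, True where the
    annotation is trusted.
    """
    return np.where(labeled_region, mask, UNCERTAIN)

Alternatively, if you would rather not train on the uncertain pixels at all, many loss implementations let you mask them out (for example via an ignore index) instead of treating them as a real class.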