Training an image classifier using .fit_generator() or .fit() and passing a dictionary to class_weight= as an argument, I never ran into problems before, but now a warning about sample_weight modes being coerced is printed as soon as training starts.
I believe this is a bug in tensorflow that occurs when you call model.compile() with the default parameter sample_weight_mode=None and then call model.fit() with sample_weight or class_weight specified.
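
For illustration, a minimal setup along these lines might look like the sketch below; the model, data, and class weights are placeholders, not taken from the original question:

```python
import numpy as np
import tensorflow as tf

# Placeholder model and data, only to illustrate the call pattern described
# above: compile() without sample_weight_mode (so it defaults to None),
# then fit() with a class_weight dictionary.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(2, activation="softmax", input_shape=(4,)),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

x = np.random.rand(32, 4).astype("float32")
y = np.random.randint(0, 2, size=(32,))

# Passing class_weight here is what leads into the code path traced below.
model.fit(x, y, epochs=1, class_weight={0: 1.0, 1: 2.0})
```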
From the tensorflow repos:

- fit() eventually calls _process_training_inputs()
- _process_training_inputs() sets sample_weight_modes = [None] based on model.sample_weight_mode = None and then creates a DataAdapter with sample_weight_modes = [None]
- DataAdapter calls broadcast_sample_weight_modes() with sample_weight_modes = [None] during initialization
- broadcast_sample_weight_modes() seems to expect sample_weight_modes = None but receives [None]; since [None] is a different structure from sample_weight / class_weight, the modes are overwritten back to None by fitting them to the structure of sample_weight / class_weight, and a warning is output

Warning aside, this has no effect on fit(), as sample_weight_modes in the DataAdapter is set back to None.
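
To make that coercion step concrete, here is a rough, simplified sketch of the behaviour just described; it is an illustration only, not the actual TensorFlow source, and the dict passed as the target structure stands in for class_weight:

```python
import warnings
import tensorflow as tf

def broadcast_modes_sketch(target_structure, sample_weight_modes):
    # Illustration only (not the real broadcast_sample_weight_modes()):
    # [None] does not have the same structure as the class_weight dict,
    # so the modes are dropped back to None and a warning is emitted.
    if sample_weight_modes is None:
        return None
    try:
        tf.nest.assert_same_structure(target_structure, sample_weight_modes)
    except (ValueError, TypeError):
        warnings.warn("sample_weight modes were coerced to fit the target structure")
        sample_weight_modes = None
    return sample_weight_modes

# The DataAdapter effectively ends up in this situation:
print(broadcast_modes_sketch({0: 1.0, 1: 2.0}, [None]))  # -> None
```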
Note that the tensorflow documentation states that sample_weight must be a numpy array. If you call fit() with sample_weight.tolist() instead, you will not get a warning, but sample_weight is silently overwritten to None when _process_numpy_inputs() is called during preprocessing and receives an input of length greater than one.
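
As a hedged illustration of that pitfall, reusing the placeholder model, x, and y from the sketch above:

```python
import numpy as np

sw = np.ones(len(x), dtype="float32")  # one weight per sample, as a NumPy array

# Documented usage: sample_weight passed as a NumPy array.
model.fit(x, y, epochs=1, sample_weight=sw)

# Passing a plain Python list instead raises no warning, but per the
# behaviour described above the weights can be silently dropped to None
# during preprocessing.
model.fit(x, y, epochs=1, sample_weight=sw.tolist())
```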