I want to use the tf.keras.TimeDistributed() layer with the tf.hub inception_v3 CNN model from the latest TensorFlow 2 version (tf-nightly-gpu-2.0-preview). The output is shown below.
Wrapper layers like TimeDistributed require a layer instance to be passed. If you build the model out of custom functions, you'll need to at least wrap them in tf.keras.layers.Lambda. This might not be possible in your case of a model from hub.KerasLayer, so you might consider the solutions posted here:
TimeDistributed of a KerasLayer in Tensorflow 2.0
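As a minimal sketch of the Lambda approach described above: a plain function is not a layer instance, but wrapping it in tf.keras.layers.Lambda makes it acceptable to TimeDistributed. The normalization function here is a hypothetical stand-in for whatever custom transformation you want applied per time step.

```python
import tensorflow as tf

# Hypothetical per-frame transformation: scale pixel values to [0, 1].
# A bare function can't be passed to TimeDistributed, but a Lambda
# layer wrapping it can.
preprocess = tf.keras.layers.Lambda(lambda x: x / 255.0)

# Input: a sequence of 5 frames of shape (32, 32, 3).
inputs = tf.keras.Input(shape=(5, 32, 32, 3))
# TimeDistributed applies the Lambda layer to each of the 5 frames.
outputs = tf.keras.layers.TimeDistributed(preprocess)(inputs)
model = tf.keras.Model(inputs, outputs)

print(model.output_shape)  # (None, 5, 32, 32, 3)
```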
In TensorFlow 2 it is possible to use custom layers in combination with the TimeDistributed layer. The error is thrown because it can't compute the output shape (see here). So in your case you should be able to subclass KerasLayer and implement compute_output_shape manually.
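A minimal sketch of the subclassing idea, using a hypothetical custom layer instead of hub.KerasLayer so it runs standalone: when the wrapped layer implements compute_output_shape, TimeDistributed can infer the sequence output shape. The same pattern should apply to a subclass of hub.KerasLayer that returns the known output shape of the hub module.

```python
import tensorflow as tf

class GlobalPoolLayer(tf.keras.layers.Layer):
    """Hypothetical custom layer: averages over the spatial dimensions."""

    def call(self, inputs):
        # (batch, H, W, C) -> (batch, C)
        return tf.reduce_mean(inputs, axis=[1, 2])

    def compute_output_shape(self, input_shape):
        # Implemented manually so wrappers like TimeDistributed
        # can infer the output shape without running the layer.
        return tf.TensorShape([input_shape[0], input_shape[-1]])

# A sequence of 5 frames of shape (32, 32, 3); TimeDistributed pools
# each frame independently, yielding one 3-vector per time step.
inputs = tf.keras.Input(shape=(5, 32, 32, 3))
outputs = tf.keras.layers.TimeDistributed(GlobalPoolLayer())(inputs)

print(outputs.shape)  # (None, 5, 3)
```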