keras.layers.TimeDistributed with hub.KerasLayer NotImplementedError

被撕碎了的回忆 2021-01-24 03:50

I want to use the tf.keras.layers.TimeDistributed() layer with the TF Hub inception_v3 CNN model in the latest TensorFlow 2 build (tf-nightly-gpu-2.0-preview). The output is shown below.

2 Answers
  • 2021-01-24 03:58

    Wrapper layers like TimeDistributed require a layer instance to be passed. If you build the model out of custom layers, you'll need to at least wrap them in tf.keras.layers.Lambda (a sketch of that follows below). This might not be possible for models loaded via hub.KerasLayer, so you might also consider the solutions posted here:

    TimeDistributed of a KerasLayer in Tensorflow 2.0
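
    A minimal sketch of the Lambda-wrapping idea (the hub handle and the 299x299 input size are assumptions based on the public inception_v3 feature-vector module; adapt them to whatever you actually load):

    ```python
    import tensorflow as tf
    import tensorflow_hub as hub

    # Assumed handle: the public inception_v3 feature-vector module.
    cnn = hub.KerasLayer(
        "https://tfhub.dev/google/imagenet/inception_v3/feature_vector/4",
        trainable=False)

    # Wrapping the hub layer in a Lambda gives TimeDistributed a plain Keras
    # layer whose output shape can be inferred by tracing the call.
    wrapped = tf.keras.layers.Lambda(lambda frames: cnn(frames))

    inputs = tf.keras.Input(shape=(None, 299, 299, 3))  # (time, H, W, C)
    features = tf.keras.layers.TimeDistributed(wrapped)(inputs)
    model = tf.keras.Model(inputs, features)
    ```

    Note that a Lambda layer does not track the variables of the hub module it closes over, so this is really only suitable for a frozen (trainable=False) feature extractor.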

  • 2021-01-24 04:05

    In TensorFlow 2 it is possible to use custom layers in combination with the TimeDistributed layer. The error is thrown because TimeDistributed can't compute the wrapped layer's output shape (see here).

    So in your case you should be able to subclass hub.KerasLayer and implement compute_output_shape manually, as sketched below.
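
    A minimal sketch of that approach (the subclass name, hub handle, and 2048-dim output are assumptions based on the public inception_v3 feature-vector module):

    ```python
    import tensorflow as tf
    import tensorflow_hub as hub

    class HubLayerWithShape(hub.KerasLayer):
        """hub.KerasLayer that reports its output shape explicitly, so that
        tf.keras.layers.TimeDistributed can wrap it."""

        def __init__(self, handle, output_dim, **kwargs):
            super().__init__(handle, **kwargs)
            self.output_dim = output_dim  # known from the module's documentation

        def compute_output_shape(self, input_shape):
            # Feature-vector modules return (batch, output_dim).
            return tf.TensorShape([input_shape[0], self.output_dim])

    cnn = HubLayerWithShape(
        "https://tfhub.dev/google/imagenet/inception_v3/feature_vector/4",
        output_dim=2048, trainable=False)

    inputs = tf.keras.Input(shape=(None, 299, 299, 3))  # (time, H, W, C)
    features = tf.keras.layers.TimeDistributed(cnn)(inputs)
    model = tf.keras.Model(inputs, features)
    ```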
