Concatenate an input tensor with multiple -1 tensors

你的背包 2021-01-24 02:13

Similar posts: Firstly, these two posts are similar, if not the same. I tried to implement them in vain, so I'm missing something, probably because of my inexperience.

1 Answer
  • 2021-01-24 02:36

    I found the solution for a model that takes in

    # a batch of size 1, for a tensor of shape =[3]
    d1 = numpy.array([[1, 2, 3]])
    

    and produces as output

    out = my_model.predict( [ d1 ] )
    print(out)
    # [[ 1.  2.  3. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1.]]
    

    The big lesson is that Keras layers expect the outputs of other Keras layers as their inputs. Backend functions such as keras.backend.tile therefore have to be wrapped in a keras.layers.Lambda layer before their results can be fed to another Keras layer, as sketched below.
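
    To make that rule concrete, here is a minimal sketch (assuming the standalone Keras API with a TF1 backend, as in the solution below); the commented-out line is the direct backend call that breaks the graph:

    import keras
    import keras.backend as K

    inp = keras.layers.Input(shape=[3])
    # calling the backend op directly yields a plain backend tensor, and building
    # a Model from it typically fails with
    # AttributeError: 'NoneType' object has no attribute '_inbound_nodes'
    # bad = K.tile(-K.ones_like(inp), (1, 4))
    # wrapping the same op in a Lambda layer keeps it inside the Keras graph
    good = keras.layers.Lambda(lambda x: K.tile(-K.ones_like(x), (1, 4)))(inp)
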

    Thank you to:

    • Stack Overflow: How to use tile function in Keras?
    • GitHub issue: AttributeError: 'NoneType' object has no attribute '_inbound_nodes'
    • Stack Overflow: Using Subtract layer in Keras

    Here is the solution:

    # assumed setup so the snippet runs standalone; the values of num_letters and n
    # are inferred from the printed shapes below (3 inputs padded out to 15 values)
    import numpy
    import keras

    num_letters = 3   # length of each input vector
    n = 5             # output length is n * num_letters: the input plus (n-1) blocks of -1

    # instantiate a Keras tensor ... as per the documentation at https://keras.io/layers/core/
    keras_tensor_input = keras.layers.Input( shape=[num_letters], dtype='float32' )
    print("keras_tensor_input = ", keras_tensor_input)
    # keras_tensor_input =  Tensor("input_1:0", shape=(?, 3), dtype=float32)
    
    # https://stackoverflow.com/questions/53865471/using-subtract-layer-in-keras
    keras_tensor_neg_1 = keras.layers.Lambda(lambda x: -keras.backend.ones_like(x) )(keras_tensor_input)
    print("keras_tensor_neg_1 = ", keras_tensor_neg_1)
    # keras_tensor_neg_1 =  Tensor("lambda_1/Neg:0", shape=(?, 3), dtype=float32)
    # note batch size, just like keras_tensor_input
    
    # https://stackoverflow.com/questions/53250533/how-to-use-tile-function-in-keras
    keras_tensor_neg_1_tiled = keras.layers.Lambda(lambda x: keras.backend.tile(x, (1, n-1)))(keras_tensor_neg_1)
    print("keras_tensor_neg_1_tiled = ", keras_tensor_neg_1_tiled)
    # keras_tensor_neg_1_tiled =  Tensor("Tile_2:0", shape=(12,), dtype=float32)
    # the static shape printed here drops the batch dimension, but the batch
    # dimension is still present at runtime, as the final prediction shows
    
    # concatenate the input from the generator and the padding
    keras_tensor_concat = keras.layers.Concatenate()([keras_tensor_input, keras_tensor_neg_1_tiled])
    print("keras_tensor_concat = ", keras_tensor_concat)
    # keras_tensor_concat =  Tensor("concatenate_1/concat:0", shape=(?, 15), dtype=float32)
    
    my_model = keras.models.Model(inputs=keras_tensor_input, outputs=keras_tensor_concat)
    
    # dummy optimizer, loss, and metric so I can compile and test the model
    my_model.compile(optimizer='rmsprop',loss='categorical_crossentropy', metrics=['accuracy'])
    
    # a batch of size 1, for a tensor of shape =[3]
    d1 = numpy.array([[1, 2, 3]])
    
    out = my_model.predict( [ d1 ] )
    print(out)
    # [[ 1.  2.  3. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1.]]
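
    As a quick sanity check (a hypothetical extension, not part of the original answer), passing a larger batch shows that the Lambda/tile padding keeps the batch dimension intact:

    # hypothetical check: two samples of shape [3] give a (2, 15) padded output
    d2 = numpy.array([[1, 2, 3],
                      [4, 5, 6]])
    out2 = my_model.predict(d2)
    print(out2.shape)
    # expected: (2, 15)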
    