Question
I want to build a network like this:
The hidden layers are not important; what I want to know is how to write the code for my output layer. The following is my code. Am I right?
Parameters:
state_dim = 13
layer1_size, layer2_size = 400, 300
action_dim = 2
# self.variable and self.batch_norm_layer are helpers defined elsewhere in my class
W1 = self.variable([state_dim, layer1_size], state_dim)
b1 = self.variable([layer1_size], state_dim)
W2 = self.variable([layer1_size, layer2_size], layer1_size)
b2 = self.variable([layer2_size], layer1_size)
# output-layer weights/biases, initialised uniformly in [-0.003, 0.003]
W3 = tf.Variable(tf.random_uniform([layer2_size, action_dim], -0.003, 0.003))
b3 = tf.Variable(tf.random_uniform([action_dim], -0.003, 0.003))

layer1 = tf.matmul(state_input, W1) + b1
layer1_bn = self.batch_norm_layer(layer1, training_phase=is_training, scope_bn='batch_norm_1', activation=tf.nn.relu)
layer2 = tf.matmul(layer1_bn, W2) + b2
layer2_bn = self.batch_norm_layer(layer2, training_phase=is_training, scope_bn='batch_norm_2', activation=tf.nn.relu)

# pre-activation output, shape [batch, action_dim]
action = tf.matmul(layer2_bn, W3) + b3
# different activation for each output column; the None keeps each slice at shape [batch, 1]
action_linear = tf.sigmoid(action[:, None, 0])
action_angular = tf.tanh(action[:, None, 1])
action = tf.concat([action_linear, action_angular], axis=-1)  # shape [batch, 2]
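
A quick, standalone way to check that this output layer behaves as intended is to run just the last few lines on a dummy batch. The sketch below assumes TensorFlow 1.x (tf.placeholder / tf.Session); the shapes, the 0.003 initialisation range, and the names layer2_bn, W3, b3 are taken from the code above, while batch_size = 4 and the random dummy input are made up purely for the test.

import numpy as np
import tensorflow as tf

batch_size, layer2_size, action_dim = 4, 300, 2

# stand-in for the batch-normalised second hidden layer
layer2_bn = tf.placeholder(tf.float32, [None, layer2_size])
W3 = tf.Variable(tf.random_uniform([layer2_size, action_dim], -0.003, 0.003))
b3 = tf.Variable(tf.random_uniform([action_dim], -0.003, 0.003))

action = tf.matmul(layer2_bn, W3) + b3                # [batch, 2]
action_linear = tf.sigmoid(action[:, None, 0])        # [batch, 1], values in (0, 1)
action_angular = tf.tanh(action[:, None, 1])          # [batch, 1], values in (-1, 1)
action_out = tf.concat([action_linear, action_angular], axis=-1)  # [batch, 2]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    dummy = np.random.randn(batch_size, layer2_size).astype(np.float32)
    out = sess.run(action_out, feed_dict={layer2_bn: dummy})
    print(out.shape)  # (4, 2); column 0 lies in (0, 1), column 1 in (-1, 1)

An equivalent way to take the two columns is action[:, 0:1] and action[:, 1:2], which also yields [batch, 1] tensors without the extra None axis.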
Source: https://stackoverflow.com/questions/55273636/tensorflow-apply-different-activation-functions-in-output-layer