TensorFlow: apply different activation functions in the output layer

Submitted by ε祈祈猫儿з on 2019-12-11 10:21:22

Question


I want to build a network like the one shown in the question's figure (image not included here), where the two output units use different activation functions.

The hidden layers are not the important part. What I want to know is how to write the code for the output layer. My code is below; is it correct?

Parameters:

state_dim = 13

layer1_size, layer2_size = 400, 300

action_dim = 2

# Hidden-layer weights and biases (self.variable is a helper that
# scales the uniform init by the fan-in passed as the second argument)
W1 = self.variable([state_dim, layer1_size], state_dim)
b1 = self.variable([layer1_size], state_dim)
W2 = self.variable([layer1_size, layer2_size], layer1_size)
b2 = self.variable([layer2_size], layer1_size)
# Output-layer parameters, initialized uniformly in [-0.003, 0.003]
W3 = tf.Variable(tf.random_uniform([layer2_size, action_dim], -0.003, 0.003))
b3 = tf.Variable(tf.random_uniform([action_dim], -0.003, 0.003))

layer1 = tf.matmul(state_input, W1) + b1
layer1_bn = self.batch_norm_layer(layer1, training_phase=is_training, scope_bn='batch_norm_1', activation=tf.nn.relu)
layer2 = tf.matmul(layer1_bn, W2) + b2
layer2_bn = self.batch_norm_layer(layer2, training_phase=is_training, scope_bn='batch_norm_2', activation=tf.nn.relu)
# Pre-activation output of shape (batch, action_dim)
action = tf.matmul(layer2_bn, W3) + b3
# Slice the output column-wise and apply a different activation to each
# part; indexing with [:, None, i] keeps a 2-D (batch, 1) shape so the
# concat below restores shape (batch, 2)
action_linear = tf.sigmoid(action[:, None, 0])   # first output bounded to (0, 1)
action_angular = tf.tanh(action[:, None, 1])     # second output bounded to (-1, 1)
action = tf.concat([action_linear, action_angular], axis=-1)
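The slice-then-concat pattern above can be checked with a minimal NumPy sketch (NumPy stands in for TensorFlow here purely for illustration; the column slicing and concatenation behave the same way):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Pre-activation output of the last layer: one row per sample,
# column 0 = "linear" action, column 1 = "angular" action.
action = np.array([[0.0, 0.0],
                   [2.0, -1.0]])

# Apply a different activation to each column; slicing with 0:1 / 1:2
# keeps each part 2-D with shape (batch, 1).
action_linear = sigmoid(action[:, 0:1])   # bounded to (0, 1)
action_angular = np.tanh(action[:, 1:2])  # bounded to (-1, 1)

# Concatenate along the last axis to recover shape (batch, 2).
out = np.concatenate([action_linear, action_angular], axis=-1)
```

In the TensorFlow graph above, `action[:, 0:1]` would work equally well in place of `action[:, None, 0]`; both keep the slice two-dimensional so that `tf.concat` along the last axis produces a `(batch, 2)` output.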

Source: https://stackoverflow.com/questions/55273636/tensorflow-apply-different-activation-functions-in-output-layer
