I am trying to make two conv layers share the same weights; however, the API does not seem to work as expected.
import tensorflow as tf

x = tf.random_normal(shape=[10, 32, 32, 3])
with tf.variable_scope('foo') as scope:
    conv1 = tf.contrib.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, scope=scope)
    print(conv1.name)
    conv2 = tf.contrib.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, scope=scope)
    print(conv2.name)
It prints out
foo/foo/Relu:0
foo/foo_1/Relu:0
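For reference, here is a minimal way to see which variables were actually created (continuing the snippet above, assuming TensorFlow 1.x); if sharing worked, there should be only one set of weights and biases under foo:

# List every variable in the graph; a single weights/biases pair under
# 'foo' would mean the two conv layers share parameters.
for v in tf.global_variables():
    print(v.name)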
Changing from tf.contrib.layers.conv2d to tf.layers.conv2d does not solve the problem; it shows the same behavior:
import tensorflow as tf

x = tf.random_normal(shape=[10, 32, 32, 3])
conv1 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=None, name='conv')
print(conv1.name)
conv2 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, name='conv')
print(conv2.name)
gives
conv/BiasAdd:0
conv_2/BiasAdd:0
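Since both layers consume the same tensor x, another way to test sharing (a sketch, continuing the snippet above and assuming TensorFlow 1.x) is to evaluate both outputs in one session run and compare them; identical values would mean the kernel and bias are shared despite the differing tensor names:

import numpy as np

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # x is evaluated once per run, so both layers see the same input;
    # equal outputs would imply the two layers use the same weights.
    out1, out2 = sess.run([conv1, conv2])
    print(np.allclose(out1, out2))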