eager-execution

How to convert a model in eager execution to a static graph and save it in a .pb file?

Submitted by 主宰稳场 on 2019-12-08 04:34:37

Question: Imagine that I have a model (tf.keras.Model):

```python
class ContextExtractor(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.model = self.__get_model()

    def call(self, x, training=False, **kwargs):
        features = self.model(x, training=training)
        return features

    def __get_model(self):
        return self.__get_small_conv()

    def __get_small_conv(self):
        model = tf.keras.Sequential()
        model.add(layers.Conv2D(32, (3, 3), strides=(2, 2), padding='same'))
        model.add(layers.LeakyReLU(alpha=0.2))
        model.add(layers.Conv2D(32, (3, 3), strides=(2, 2), padding='same'))
        model.add(layers.LeakyReLU(alpha=0.2))
        model.add(  # preview truncated here
```
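A minimal sketch of one answer to this question, assuming TF 2.x: wrap the model's forward pass in a `tf.function` with a fixed input signature, export it as a SavedModel (which stores the graph in `saved_model.pb`), and optionally freeze the variables into a single standalone GraphDef `.pb`. The input shape `(None, 28, 28, 1)` and the output paths are illustrative assumptions, not taken from the question.

```python
import tensorflow as tf

# Stand-in for the question's ContextExtractor; shapes are assumptions.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), strides=(2, 2), padding='same'),
    tf.keras.layers.LeakyReLU(alpha=0.2),
])
model.build((None, 28, 28, 1))

# Tracing the call through tf.function turns the eager model into a graph.
@tf.function(input_signature=[tf.TensorSpec([None, 28, 28, 1], tf.float32)])
def serve(x):
    return model(x, training=False)

# Export a SavedModel; the graph lives in <dir>/saved_model.pb.
tf.saved_model.save(model, '/tmp/context_extractor', signatures=serve)

# Optionally freeze variables into constants and write one GraphDef .pb
# (uses a TF-internal helper that exists in TF 2.x).
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2)
frozen = convert_variables_to_constants_v2(serve.get_concrete_function())
tf.io.write_graph(frozen.graph, '/tmp', 'context_extractor.pb', as_text=False)
```

After this, `/tmp/context_extractor/saved_model.pb` holds the SavedModel graph and `/tmp/context_extractor.pb` holds the frozen GraphDef.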

What is the purpose of the TensorFlow Gradient Tape?

Submitted by 风格不统一 on 2019-12-03 05:35:48

Question: I watched the TensorFlow Developer Summit video on Eager Execution in TensorFlow, and the presenter gave an introduction to "Gradient Tape." I understand that Gradient Tape tracks the automatic differentiation that occurs in a TF model, but I am trying to understand why I would use it. Can anyone explain how Gradient Tape is used as a diagnostic tool? Why would someone use Gradient Tape rather than just TensorBoard visualization of weights? I get that the automatic differentiation that occurs with a model is to compute the gradients of each node, meaning the adjustment of the (preview truncated here)
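The core idea behind the question can be shown in a few lines, assuming TF 2.x: `tf.GradientTape` records operations as they run eagerly, so you can ask for the gradient of any recorded result with respect to any watched variable, e.g. to inspect gradients directly or write a custom training step. The toy loss below is an illustration, not from the question.

```python
import tensorflow as tf

w = tf.Variable(3.0)

# The tape records every op on watched variables inside the block.
with tf.GradientTape() as tape:
    loss = w * w  # toy loss: w^2

# Explicitly ask the tape for d(loss)/dw; here 2 * w = 6.0.
grad = tape.gradient(loss, w)
print(float(grad))
```

This explicit access to gradients (rather than having an optimizer consume them invisibly) is what makes the tape useful as a diagnostic tool, e.g. for spotting vanishing or exploding gradients per variable.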