tf.keras

How to graph a tf.keras model in TensorFlow 2.0?

Submitted by 末鹿安然 on 2019-12-09 17:27:25
Question: I upgraded to TensorFlow 2.0 and there is no tf.summary.FileWriter("tf_graphs", sess.graph). I was looking through some other StackOverflow questions on this, and they said to use tf.compat.v1.summary etc. Surely there must be a way to graph and visualize a tf.keras model in TensorFlow 2. What is it? I'm looking for a TensorBoard output like the one below. Thank you! Answer 1: According to the docs, you can use TensorBoard to visualise graphs once your model has been trained. First,
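A minimal sketch of the TF 2.x route the answer is pointing at (the layer sizes, the dummy data, and the tf_graphs log directory are illustrative assumptions, not the answerer's exact snippet): attach the tf.keras.callbacks.TensorBoard callback during fit(), then point TensorBoard at the log directory to see the graph.

    import numpy as np
    import tensorflow as tf

    # Toy data and model purely for illustration; sizes are arbitrary.
    x_train = np.random.rand(64, 784).astype("float32")
    y_train = np.random.randint(0, 10, size=(64,))

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(32, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

    # The TensorBoard callback writes the graph to the log directory during fit().
    tb = tf.keras.callbacks.TensorBoard(log_dir="tf_graphs", write_graph=True)
    model.fit(x_train, y_train, epochs=1, callbacks=[tb])
    # Then launch: tensorboard --logdir tf_graphs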

How to fix “AttributeError: module 'tensorflow' has no attribute 'get_default_graph'”?

Submitted by 雨燕双飞 on 2019-12-09 16:01:22
Question: I am trying to run some code to create an LSTM model but I get an error: AttributeError: module 'tensorflow' has no attribute 'get_default_graph'. My code is as follows: from keras.models import Sequential model = Sequential() model.add(Dense(32, input_dim=784)) model.add(Activation('relu')) model.add(LSTM(17)) model.add(Dense(1, activation='sigmoid')) model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy']) I have found someone else with a similar problem, and they
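The usual fix for this error with TF 2.x is to import Keras from tensorflow.keras rather than the standalone keras package. A minimal sketch of that (not the asker's exact architecture: shapes are assumptions, and the LSTM is placed first because it expects 3-D input):

    import tensorflow as tf
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense, LSTM

    # Shapes are illustrative; an LSTM expects input of shape (batch, timesteps, features).
    model = Sequential()
    model.add(LSTM(17, input_shape=(10, 784)))
    model.add(Dense(32, activation='relu'))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    model.summary()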

Unable to import Keras (from TensorFlow 2.0) in PyCharm

Submitted by 邮差的信 on 2019-12-07 00:10:09
Question: I have just installed the stable version of TensorFlow 2.0 (released on October 1st, 2019) in PyCharm. The problem is that the keras package is unavailable. The actual error is: "cannot import name 'keras' from tensorflow". I installed the CPU version via pip install tensorflow==2.0.0, then uninstalled it and installed the GPU version via pip install tensorflow-gpu==2.0.0. Neither of the above versions of TensorFlow was working properly (could not import
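A quick sanity check, assuming the interpreter PyCharm is using is the one where TensorFlow 2.0 was installed: in TF 2.0 Keras ships inside TensorFlow, so it should be importable directly from the tensorflow package.

    import tensorflow as tf
    print(tf.__version__)            # should print 2.0.0 if the install PyCharm sees is correct

    # Keras is bundled with TF 2.0, so either of these imports should work:
    from tensorflow import keras
    from tensorflow.keras import layers
    print(keras.__version__)

If this fails inside PyCharm but works in a terminal, the project is typically pointed at a different interpreter or virtualenv than the one pip installed into.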

from_logits=True and from_logits=False give different training results for tf.losses.CategoricalCrossentropy with UNet

Submitted by ☆樱花仙子☆ on 2019-12-05 07:46:19
I am doing an image semantic segmentation job with UNet. If I set a Softmax activation for the last layer like this: ... conv9 = Conv2D(n_classes, (3,3), padding = 'same')(conv9) conv10 = (Activation('softmax'))(conv9) model = Model(inputs, conv10) return model ... and then use loss = tf.keras.losses.CategoricalCrossentropy(from_logits=False), the training will not converge, even for only one training image. But if I do not set a Softmax activation for the last layer, like this: ... conv9 = Conv2D(n_classes, (3,3), padding = 'same')(conv9) model = Model(inputs, conv9) return model ... and then
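The key point is that from_logits=True expects raw (linear) outputs, while from_logits=False expects probabilities, so the activation of the last layer and the loss flag must agree. A minimal sketch of the logits pairing (input size and n_classes are assumptions):

    import tensorflow as tf
    from tensorflow.keras.layers import Conv2D, Input
    from tensorflow.keras.models import Model

    # Illustrative shapes; the point is pairing a linear (logit) output with from_logits=True.
    n_classes = 3
    inputs = Input(shape=(128, 128, 3))
    logits = Conv2D(n_classes, (3, 3), padding='same')(inputs)   # no softmax on the last layer
    model = Model(inputs, logits)
    model.compile(optimizer='adam',
                  loss=tf.keras.losses.CategoricalCrossentropy(from_logits=True))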

RNN in TensorFlow vs Keras, deprecation of tf.nn.dynamic_rnn()

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-12-03 02:33:45
My question is: are tf.nn.dynamic_rnn and keras.layers.RNN(cell) truly identical, as stated in the docs? I am planning on building an RNN; however, it seems that tf.nn.dynamic_rnn is deprecated in favour of Keras. In particular, it states: Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: Please use keras.layers.RNN(cell), which is equivalent to this API. But I don't see how the APIs are equivalent in the case of variable sequence lengths! In raw TF, we can specify a tensor of sequence lengths of shape (batch_size,). This way, if our sequence
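In Keras the same effect is normally obtained through masking rather than an explicit sequence_length argument. A minimal sketch, assuming zero-padded sequences (shapes and sizes are arbitrary): a Masking layer marks the padded steps, and keras.layers.RNN skips them.

    import numpy as np
    import tensorflow as tf

    # Two sequences of true lengths 3 and 2, zero-padded to length 4.
    batch = np.array([
        [[1.], [2.], [3.], [0.]],
        [[4.], [5.], [0.], [0.]],
    ], dtype=np.float32)

    model = tf.keras.Sequential([
        tf.keras.layers.Masking(mask_value=0.0, input_shape=(4, 1)),
        tf.keras.layers.RNN(tf.keras.layers.SimpleRNNCell(8)),
    ])
    print(model(batch).shape)   # (2, 8); masked timesteps are skipped by the RNN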

Concatenate an input tensor with multiple tensors of -1

Submitted by 孤人 on 2019-12-02 14:59:57
Question: Similar posts: first, these two posts are similar, if not the same. I tried to implement them, in vain, so I'm probably missing something because of my inexperience in Keras: similar 1, similar 2. The problem: I have a data generator to feed data into various models, to evaluate model performance and to learn Keras. model.fit_generator(generator=img_gen.next_train(), .... One of the inputs produced by this generator is a tensor "labels" of shape=[batch_size, num_letters]. This tensor is the
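One common way to do this kind of concatenation inside a Keras graph is a Lambda layer wrapping tf.concat. A minimal sketch under assumed shapes (num_letters and pad_width are illustrative; this is not the asker's generator code): the labels tensor is padded on the right with columns of -1.

    import tensorflow as tf

    num_letters, pad_width = 5, 3

    labels = tf.keras.layers.Input(shape=(num_letters,))
    padded = tf.keras.layers.Lambda(
        lambda t: tf.concat([t, -tf.ones([tf.shape(t)[0], pad_width])], axis=1)
    )(labels)
    model = tf.keras.Model(labels, padded)

    # Runtime output shape is (batch_size, num_letters + pad_width);
    # the last pad_width columns are filled with -1.
    print(model(tf.zeros((2, num_letters))).numpy())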

How can I maximize the GPU usage of TensorFlow 2.0 from R (with the Keras library)?

Submitted by 流过昼夜 on 2019-12-02 07:32:11
Question: I use R with Keras and TensorFlow 2.0 on the GPU. After connecting a second monitor to my GPU, I receive this error during a deep learning script: I concluded that the GPU is short of memory, and a solution seems to be this code: import tensorflow as tf from keras.backend.tensorflow_backend import set_session config = tf.ConfigProto() config.gpu_options.allow_growth = True # dynamically grow the memory used on the GPU config.log_device_placement = True # to log device placement (on which
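The quoted snippet is TF 1.x style (ConfigProto/set_session no longer applies in TF 2.0). A minimal sketch of the TF 2.x equivalent of allow_growth; the R tensorflow package exposes the same tf module, so the equivalent calls should be reachable from R as well.

    import tensorflow as tf

    # Enable memory growth per GPU instead of the TF 1.x ConfigProto approach,
    # so TensorFlow does not pre-allocate the whole card up front.
    for gpu in tf.config.experimental.list_physical_devices('GPU'):
        tf.config.experimental.set_memory_growth(gpu, True)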

How to deactivate a dropout layer called with training=True in a Keras model?

Submitted by 只谈情不闲聊 on 2019-12-02 03:56:00
I wish to view the final output of training a tf.keras model. In this case it would be an array of predictions from the softmax function, e.g. [0,0,0,1,0,1]. Other threads on here have suggested using model.predict(training_data), but this won't work for my situation since I am using dropout at training and validation, so neurons are randomly dropped and predicting again with the same data will give a different result. def get_model(): inputs = tf.keras.layers.Input(shape=(input_dims,)) x = tf.keras.layers.Dropout(rate=dropout_rate)(inputs, training=True) x = tf.keras.layers.Dense(units=29,
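If the Dropout call does not hard-code training=True, Keras applies dropout during fit() and disables it when the model is called with training=False, which makes inference deterministic. A minimal sketch under assumed sizes (input_dims, dropout_rate, and the output layer are illustrative, not the asker's full model):

    import tensorflow as tf

    input_dims, dropout_rate = 16, 0.5

    inputs = tf.keras.layers.Input(shape=(input_dims,))
    x = tf.keras.layers.Dropout(rate=dropout_rate)(inputs)              # no training=True here
    x = tf.keras.layers.Dense(units=29, activation='relu')(x)
    outputs = tf.keras.layers.Dense(units=6, activation='softmax')(x)   # output size is an assumption
    model = tf.keras.Model(inputs, outputs)

    data = tf.random.normal((4, input_dims))
    deterministic = model(data, training=False)   # dropout inactive: repeated calls give identical results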

Should I use the standalone Keras library or tf.keras?

Submitted by 你说的曾经没有我的故事 on 2019-12-01 02:07:51
Question: As Keras becomes an API for TensorFlow, there is a lot of old Keras code around, such as https://github.com/keiserlab/keras-neural-graph-fingerprint/blob/master/examples.py: from keras import models. With the current version of TensorFlow, do we need to change every piece of Keras code to: from tensorflow.keras import models? Answer 1: You are mixing things up: Keras (https://keras.io/) is a library independent from TensorFlow, which specifies a high-level API for building and training neural networks
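A minimal sketch of the tf.keras import style in question (layer sizes are arbitrary); the only change from old standalone-Keras code is the tensorflow.keras import path, the layer and model APIs stay the same.

    from tensorflow.keras import layers, models

    model = models.Sequential([
        layers.Dense(64, activation='relu', input_shape=(100,)),
        layers.Dense(10, activation='softmax'),
    ])
    model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
    model.summary()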