tensorflow2.0

How to use model input in loss function?

Submitted by 北城余情 on 2021-02-10 03:23:55
Question: I am trying to use a custom loss function which depends on some arguments that the model does not have. The model has two inputs (mel_specs and pred_inp) and expects a labels tensor for training:

    def to_keras_example(example):
        # Preparing inputs
        return (mel_specs, pred_inp), labels

    # train_data is a tf.data.Dataset for model.fit(train_data, ...)
    train_data = load_dataset(fp, 'train').map(to_keras_example).repeat()

In my loss function I need to calculate the lengths of mel_specs and pred_inp. This …
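One commonly suggested way to give a loss access to the model's inputs is model.add_loss on a functional model, since add_loss accepts any symbolic tensor of the graph. The sketch below uses invented shapes and layer names and assumes zero-padded inputs; the plain crossentropy is only a placeholder for whatever loss actually needs the lengths.

    import tensorflow as tf

    # Assumed shapes and names, purely for illustration.
    mel_specs = tf.keras.Input(shape=(None, 80), name='mel_specs')
    pred_inp = tf.keras.Input(shape=(None,), name='pred_inp')
    labels = tf.keras.Input(shape=(), dtype=tf.int32, name='labels')

    x = tf.keras.layers.Dense(64, activation='relu')(mel_specs)
    x = tf.keras.layers.GlobalAveragePooling1D()(x)
    logits = tf.keras.layers.Dense(10)(x)

    # Lengths recovered from the zero-padded inputs; the loss can use them
    # because add_loss sees the same symbolic tensors as the model.
    mel_lengths = tf.reduce_sum(
        tf.cast(tf.reduce_any(tf.math.not_equal(mel_specs, 0.0), axis=-1), tf.int32),
        axis=-1)
    pred_lengths = tf.reduce_sum(
        tf.cast(tf.math.not_equal(pred_inp, 0.0), tf.int32), axis=-1)

    # Placeholder loss; a real length-dependent loss would consume
    # mel_lengths / pred_lengths here.
    loss = tf.reduce_mean(
        tf.keras.losses.sparse_categorical_crossentropy(labels, logits, from_logits=True))

    model = tf.keras.Model(inputs=[mel_specs, pred_inp, labels], outputs=logits)
    model.add_loss(loss)
    model.compile(optimizer='adam')  # no loss= argument; add_loss provides it

With this pattern the dataset has to yield all three tensors as model inputs (for example a dict keyed by the input names) instead of passing labels as the fit target.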

module 'tensorflow' has no attribute 'logging'

Submitted by 感情迁移 on 2021-02-08 12:17:58
Question: I'm trying to run TensorFlow code under v2.0 and I'm getting the following error:

    AttributeError: module 'tensorflow' has no attribute 'logging'

I don't want to simply remove it from the code. Why has this been removed? What should I do instead?

Answer 1: tf.logging was for Logging and Summary Operations. In TF 2.0 it has been removed in favor of the open-source absl-py, and to keep the main tf.* namespace limited to functions that will be used more often. In TF 2.x lesser-used functions are gone or …
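A short sketch of the usual substitutes (the exact replacement depends on what the old code did with tf.logging; both options below are standard APIs, not taken from the truncated answer):

    # Option 1: absl-py, which is what tf.logging wrapped.
    from absl import logging
    logging.set_verbosity(logging.INFO)
    logging.info('training started')

    # Option 2: the compatibility shim still shipped with TF 2.x.
    import tensorflow as tf
    tf.compat.v1.logging.set_verbosity(tf.compat.v1.logging.INFO)
    tf.compat.v1.logging.info('training started')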

Can I use dictionary in keras customized model?

Submitted by 我们两清 on 2021-02-08 09:52:03
Question: I recently read a paper about UNet++, and I want to implement this structure with tensorflow-2.0 and a Keras customized model. As the structure is so complicated, I decided to manage the Keras layers with a dictionary. Everything went well in training, but an error occurred while saving the model. Here is a minimal code example that shows the error:

    class DicModel(tf.keras.Model):
        def __init__(self):
            super(DicModel, self).__init__(name='SequenceEECNN')
            self.c = {}
            self.c[0] = tf.keras.Sequential([
                tf.keras …
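The excerpt cuts off before the actual error, but Keras' automatic attribute tracking is stricter about dictionaries than about lists, and non-string keys in a tracked dict are a frequent source of save-time failures. Assuming that is the problem here, a sketch of the usual workaround is to keep the layers under string keys (or in a plain list); the layers themselves are invented stand-ins:

    import tensorflow as tf

    class DicModel(tf.keras.Model):
        def __init__(self):
            super(DicModel, self).__init__(name='SequenceEECNN')
            # String keys keep the tracked dictionary serializable.
            self.c = {}
            self.c['block_0'] = tf.keras.Sequential([
                tf.keras.layers.Conv2D(32, 3, padding='same', activation='relu'),
                tf.keras.layers.BatchNormalization(),
            ])
            self.c['block_1'] = tf.keras.Sequential([
                tf.keras.layers.Conv2D(3, 3, padding='same', activation='softmax'),
            ])

        def call(self, x):
            x = self.c['block_0'](x)
            return self.c['block_1'](x)

    model = DicModel()
    _ = model(tf.zeros((1, 64, 64, 3)))        # build the variables with a dummy batch
    model.save_weights('/tmp/dicmodel_ckpt')   # checkpoint now saves cleanly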

unexpected keyword argument 'sample_weight' when sub-classing the TensorFlow loss class (categorical_crossentropy) to create a weighted loss function

Submitted by China☆狼群 on 2021-02-07 20:18:35
Question: I'm struggling to get a sub-classed loss function to work in TensorFlow (2.2.0). Initially I tried this code (which I know has worked for others; see https://github.com/keras-team/keras/issues/2115#issuecomment-530762739):

    import tensorflow.keras.backend as K
    from tensorflow.keras.losses import CategoricalCrossentropy

    class WeightedCategoricalCrossentropy(CategoricalCrossentropy):
        def __init__(self, cost_mat, name='weighted_categorical_crossentropy', **kwargs):
            assert cost_mat.ndim == 2
            assert …
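One way around the unexpected sample_weight keyword (a sketch under its own assumptions, not the resolution from the truncated question) is to subclass tf.keras.losses.Loss directly and keep the cost-matrix weighting inside call(), so that Keras' own __call__ keeps handling sample_weight and reduction:

    import tensorflow as tf

    class WeightedCategoricalCrossentropy(tf.keras.losses.Loss):
        """Categorical crossentropy scaled per example by a class cost matrix."""

        def __init__(self, cost_mat, name='weighted_categorical_crossentropy', **kwargs):
            super().__init__(name=name, **kwargs)
            self.cost_mat = tf.convert_to_tensor(cost_mat, dtype=tf.float32)
            assert self.cost_mat.shape.rank == 2

        def call(self, y_true, y_pred):
            # Expected cost of the predicted distribution given the true class,
            # looked up row-wise in the cost matrix.
            y_true = tf.cast(y_true, self.cost_mat.dtype)
            weights = tf.reduce_sum(tf.matmul(y_true, self.cost_mat) * y_pred, axis=-1)
            ce = tf.keras.losses.categorical_crossentropy(y_true, y_pred)
            return ce * weights

    # Usage sketch (an all-ones matrix reduces this to plain crossentropy):
    # model.compile(optimizer='adam',
    #               loss=WeightedCategoricalCrossentropy(tf.ones((num_classes, num_classes))))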

How can I use TensorBoard with AWS SageMaker TensorFlow?

Submitted by 。_饼干妹妹 on 2021-02-07 18:40:43
Question: I have started a SageMaker job:

    from sagemaker.tensorflow import TensorFlow

    mytraining = TensorFlow(entry_point='model.py',
                            role=role,
                            train_instance_count=1,
                            train_instance_type='ml.p2.xlarge',
                            framework_version='2.0.0',
                            py_version='py3',
                            distributions={'parameter_server': {'enabled': False}})

    training_data_uri = 's3://path/to/my/data'
    mytraining.fit(training_data_uri, run_tensorboard_locally=True)

Using run_tensorboard_locally=True gave me: Tensorboard is not supported with script mode. You can run …
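The message is cut off, but a workaround often used with script mode (a sketch with an assumed bucket and prefix, and assuming the TensorFlow build has S3 filesystem support) is to write TensorBoard event files to S3 from inside the entry point and point a local TensorBoard at that prefix:

    # Inside the entry point (model.py): log straight to S3 with the TensorBoard callback.
    import tensorflow as tf

    tb_logs = 's3://my-bucket/tensorboard-logs/my-job'   # assumed bucket/prefix
    callbacks = [tf.keras.callbacks.TensorBoard(log_dir=tb_logs, update_freq='epoch')]
    # model.fit(train_ds, epochs=..., callbacks=callbacks)

On the notebook or laptop side, running tensorboard --logdir s3://my-bucket/tensorboard-logs/my-job (with AWS credentials and region in the environment) then shows the metrics while the job trains.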

How to convert Tensorflow 2.0 SavedModel to TensorRT?

Submitted by 心已入冬 on 2021-02-07 17:30:23
Question: I've trained a model in TensorFlow 2.0 and am trying to improve predict time when moving to production (on a server with GPU support). In TensorFlow 1.x I was able to get a predict speedup by using freeze_graph, but this has been deprecated as of TensorFlow 2. From reading Nvidia's description of TensorRT, they suggest that using TensorRT can speed up inference by 7x compared to TensorFlow alone (source: TensorFlow 2.0 with Tighter TensorRT Integration Now Available). I have trained my model and …
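The excerpt stops before the actual question, but for reference, converting a TF 2.x SavedModel with TF-TRT goes through TrtGraphConverterV2; a sketch with placeholder paths and an assumed FP16 precision mode:

    import tensorflow as tf
    from tensorflow.python.compiler.tensorrt import trt_convert as trt

    # Convert an existing SavedModel in place of the old freeze_graph workflow.
    params = trt.DEFAULT_TRT_CONVERSION_PARAMS._replace(precision_mode='FP16')
    converter = trt.TrtGraphConverterV2(
        input_saved_model_dir='/path/to/saved_model',    # placeholder path
        conversion_params=params)
    converter.convert()
    converter.save('/path/to/saved_model_trt')           # TensorRT-optimized SavedModel

    # The result loads and runs like any other SavedModel:
    loaded = tf.saved_model.load('/path/to/saved_model_trt')
    infer = loaded.signatures['serving_default']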
