How do I list certain variables in the checkpoint?

时光说笑 2021-02-06 02:21

I am working with autoencoders. My checkpoint contains the complete state of the network (i.e. the encoder, decoder, optimizer, etc.). I want to fool around with the encodings.

3 Answers
  • 2021-02-06 02:56

    There's a list_variables method in checkpoint_utils.py that lets you see all saved variables.

    However, for your use case it may be easier to restore with a Saver. If you know the names of the variables when you saved the checkpoint, you can create a new Saver and tell it to restore those names into new Variable objects (possibly under different names). This is how the CIFAR example restores a subset of variables. See "Choosing which Variables to Save and Restore" in the Howto.
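    As a minimal sketch of that selective-restore idea (assuming a TF1-style graph via tf.compat.v1; the variable names "encoder/w", "decoder/w", and "my_encoder/w" are hypothetical stand-ins for your network's names):

```python
# Save a checkpoint, then restore only the "encoder" variable,
# mapping it onto a freshly created Variable with a different name.
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

ckpt_path = os.path.join(tempfile.mkdtemp(), "model.ckpt")

# Save a checkpoint containing an "encoder" and a "decoder" variable.
with tf.Graph().as_default():
    enc = tf.get_variable("encoder/w", initializer=[1.0, 2.0])
    dec = tf.get_variable("decoder/w", initializer=[3.0, 4.0])
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        tf.train.Saver().save(sess, ckpt_path)

# Restore only "encoder/w" into a Variable with a new name: the Saver's
# dict maps checkpoint names to the Variable objects to fill.
with tf.Graph().as_default():
    new_enc = tf.get_variable("my_encoder/w", shape=[2])
    saver = tf.train.Saver({"encoder/w": new_enc})
    with tf.Session() as sess:
        saver.restore(sess, ckpt_path)  # restored vars need no initializer
        restored = sess.run(new_enc)

print(restored)
```

    The key point is the dict passed to tf.train.Saver: only the listed checkpoint entries are touched, and each lands in whatever Variable you choose.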

  • 2021-02-06 03:04

    Another way, which prints all checkpoint tensors (or just one, if specified) along with their contents:

    from tensorflow.python.tools import inspect_checkpoint as inch
    inch.print_tensors_in_checkpoint_file('path/to/ckpt', '', True)
    """
    Args:
      file_name: Name of the checkpoint file.
      tensor_name: Name of the tensor in the checkpoint file to print.
      all_tensors: Boolean indicating whether to print all tensors.
    """
    

    It will always print the content of the tensor.

    And, while we are at it, here is how to use checkpoint_utils.py (as suggested in the previous answer):

    from tensorflow.contrib.framework.python.framework import checkpoint_utils
    
    # Prints a (name, shape) tuple for each variable in the checkpoint dir.
    var_list = checkpoint_utils.list_variables('./')
    for v in var_list:
        print(v)
    
  • 2021-02-06 03:16

    You can view the saved variables in a .ckpt file using:

    import tensorflow as tf
    
    # Returns a list of (name, shape) pairs for every variable in the checkpoint.
    variables_in_checkpoint = tf.train.list_variables('path.ckpt')
    
    print("Variables found in checkpoint file:", variables_in_checkpoint)
    