How to count total number of trainable parameters in a TensorFlow model?


Is there a function call or another way to count the total number of parameters in a TensorFlow model?

By parameters I mean: an N-dimensional vector of trainable variables has N parameters.

7 Answers
  • 2020-12-04 15:43

    I'll throw in my equivalent but shorter implementation:

    from functools import reduce  # reduce is not a builtin in Python 3

    def count_params():
        """Print the number of trainable parameters."""
        size = lambda v: reduce(lambda x, y: x * y, v.get_shape().as_list())
        n = sum(size(v) for v in tf.trainable_variables())
        print("Model size: %dK" % (n // 1000,))
    
  • 2020-12-04 15:44

    I have an even shorter version, a one-line solution using numpy:

    np.sum([np.prod(v.get_shape().as_list()) for v in tf.trainable_variables()])
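
    If you are on TensorFlow 2.x / Keras, where the global tf.trainable_variables() collection no longer exists, a comparable one-liner (a sketch, assuming model is a built tf.keras model) would be:

    np.sum([np.prod(v.shape.as_list()) for v in model.trainable_variables])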
    
  • 2020-12-04 15:54

    Not sure if the answer given actually runs (I found you need to convert the dim object to an int for it to work). Here is one that works: you can just copy-paste the functions and call them (I added a few comments too):

    def count_number_trainable_params():
        '''
        Counts the number of trainable variables.
        '''
        tot_nb_params = 0
        for trainable_variable in tf.trainable_variables():
            shape = trainable_variable.get_shape() # e.g. [D,F] or [W,H,C]
            current_nb_params = get_nb_params_shape(shape)
            tot_nb_params = tot_nb_params + current_nb_params
        return tot_nb_params
    
    def get_nb_params_shape(shape):
        '''
        Computes the total number of params for a given shape.
        Works for any number of dimensions, e.g. [D,F] or [W,H,C] (computes D*F or W*H*C respectively).
        '''
        nb_params = 1
        for dim in shape:
            nb_params = nb_params*int(dim)
        return nb_params 
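
    A quick sanity check for these helpers (a sketch; the variable below is hypothetical):

    tf.reset_default_graph()
    w = tf.get_variable("w", shape=[3, 3, 16, 32])    # 3*3*16*32 = 4608 parameters
    print(count_number_trainable_params())            # -> 4608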
    
  • 2020-12-04 15:58

    Update April 2020: tfprof and the Profiler UI have been deprecated in favor of profiler support in TensorBoard.

    The two existing answers are good if you're looking into computing the number of parameters yourself. If your question was more along the lines of "is there an easy way to profile my TensorFlow models?", I would highly recommend looking into tfprof. It profiles your model, including calculating the number of parameters.
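
    For example, in TF 1.x the parameter count can be read straight off the profiler with something along these lines (a sketch of the tf.profiler API; double-check it against your TF version):

    param_stats = tf.profiler.profile(
        tf.get_default_graph(),
        options=tf.profiler.ProfileOptionBuilder.trainable_variables_parameter())
    print("Total params: %d" % param_stats.total_parameters)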

  • 2020-12-04 16:06
    model.summary()

    Model: "sequential_32"
    _________________________________________________________________
    Layer (type)                 Output Shape              Param #
    =================================================================
    conv2d_88 (Conv2D)           (None, 240, 240, 16)      448
    max_pooling2d_87 (MaxPooling (None, 120, 120, 16)      0
    conv2d_89 (Conv2D)           (None, 120, 120, 32)      4640
    max_pooling2d_88 (MaxPooling (None, 60, 60, 32)        0
    conv2d_90 (Conv2D)           (None, 60, 60, 64)        18496
    max_pooling2d_89 (MaxPooling (None, 30, 30, 64)        0
    flatten_29 (Flatten)         (None, 57600)             0
    dropout_48 (Dropout)         (None, 57600)             0
    dense_150 (Dense)            (None, 24)                1382424
    dense_151 (Dense)            (None, 9)                 225
    dense_152 (Dense)            (None, 3)                 30
    dense_153 (Dense)            (None, 1)                 4
    =================================================================
    Total params: 1,406,267
    Trainable params: 1,406,267
    Non-trainable params: 0
    _________________________________________________________________
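
    If you only need the number rather than the full table, a Keras model also exposes count_params(); a minimal sketch (the layer sizes below are made up for illustration):

    from tensorflow import keras

    model = keras.Sequential([
        keras.layers.Dense(24, input_shape=(57600,)),
        keras.layers.Dense(1),
    ])
    print(model.count_params())  # total params (trainable + non-trainable)
    # trainable parameters only:
    print(sum(keras.backend.count_params(w) for w in model.trainable_weights))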


  • 2020-12-04 16:08

    Loop over the shape of every variable in tf.trainable_variables().

    total_parameters = 0
    for variable in tf.trainable_variables():
        # shape is an array of tf.Dimension
        shape = variable.get_shape()
        print(shape)
        print(len(shape))
        variable_parameters = 1
        for dim in shape:
            print(dim)
            variable_parameters *= dim.value
        print(variable_parameters)
        total_parameters += variable_parameters
    print(total_parameters)
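
    In TensorFlow 2.x there is no global tf.trainable_variables(); the same idea applied to a Keras model or tf.Module would look roughly like this (a sketch, assuming model is already built):

    total_parameters = 0
    for variable in model.trainable_variables:
        # in TF 2.x the dimensions of variable.shape are plain Python ints
        variable_parameters = 1
        for dim in variable.shape:
            variable_parameters *= dim
        total_parameters += variable_parameters
    print(total_parameters)
    # or, more compactly:
    print(sum(int(tf.size(v)) for v in model.trainable_variables))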
    

    Update: prompted by this answer, I wrote an article clarifying dynamic vs. static shapes in TensorFlow: https://pgaleone.eu/tensorflow/2018/07/28/understanding-tensorflow-tensors-shape-static-dynamic/
