What is the best way to implement weight constraints in TensorFlow?

你的背包 2020-12-23 03:55

Suppose we have weights

import numpy as np
import tensorflow as tf

x = tf.Variable(np.random.random((5, 10)))
cost = ...

And we use the GD optimizer:

lr = 0.01
upds = tf.train.GradientDescentOptimizer(lr).minimize(cost)

How do we implement, for example, a non-negativity constraint on the weights?
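(For reference, the unconstrained updates would then be applied by running upds in a session; the initializer call and step count in this sketch are my additions, not part of the original question.)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(100):      # number of update steps is arbitrary here
            sess.run(upds)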


        
1 Answer

有刺的猬 · 2020-12-23 04:14

You can take the Lagrangian approach and simply add a penalty to the objective for values of the variable you don't want.

E.g., to encourage theta to be non-negative, you could add the following to the optimizer's objective function:

    added_loss = -tf.minimum(tf.reduce_min(theta), 0)

If any entries of theta are negative, added_loss will be positive; otherwise it is zero. Scaling it to a meaningful value is left as an exercise to the reader: scale it too little and it will not exert enough pressure, too much and it may make things unstable.
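A minimal end-to-end sketch of how this penalty might be combined with the cost from the question, assuming a hypothetical scale factor penalty_weight and a placeholder quadratic task loss (both are illustrative choices, not from the answer):

    import numpy as np
    import tensorflow as tf

    theta = tf.Variable(np.random.random((5, 10)))      # weights we want to keep non-negative
    target = tf.constant(np.ones((5, 10)))               # placeholder target, purely illustrative
    cost = tf.reduce_sum(tf.square(theta - target))      # placeholder task loss

    # Positive whenever any entry of theta is negative, zero otherwise.
    added_loss = -tf.minimum(tf.reduce_min(theta), 0.0)

    penalty_weight = 10.0                                 # scale factor; problem-dependent, as noted above
    total_cost = cost + penalty_weight * added_loss

    upds = tf.train.GradientDescentOptimizer(0.01).minimize(total_cost)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(200):
            sess.run(upds)

Note that this only encourages non-negativity; it does not strictly enforce it.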
