CNTK linear activation function in layers?


Question


CNTK has relu, hardmax, softmax, sigmoid, and all that good stuff, but I'm building a regression-based algorithm and the final layer needs to predict two or more regression outputs. So I need n nodes with a plain, run-of-the-mill linear activation. I see I can set the activation to None; is that in fact the correct thing to do?

import cntk

with cntk.layers.default_options(activation=cntk.ops.relu, pad=True):
    z = cntk.models.Sequential([
        # two conv/conv/pool blocks
        # (LayerStack calls the lambda with the block index, so it needs a parameter)
        cntk.models.LayerStack(2, lambda i: [
            cntk.layers.Convolution((3,3), 64),
            cntk.layers.Convolution((3,3), 64),
            cntk.layers.MaxPooling((3,3), (2,2))
        ]),
        # two dense blocks with 256 and 128 units, each followed by dropout
        cntk.models.LayerStack(2, lambda i: [
            cntk.layers.Dense([256,128][i]),
            cntk.layers.Dropout(0.5)
        ]),
        # regression head: 4 outputs with no (i.e. linear) activation
        cntk.layers.Dense(4, activation=None)
    ])(feature_var)  # feature_var is the input variable defined elsewhere

Answer 1:


Yes, that is correct. You can look into the layers library code here.
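As a minimal sketch of why this works (assuming the CNTK 2.x Python API; the input variable, shapes, and values below are illustrative only): passing activation=None explicitly overrides any default set via default_options, and CNTK resolves None to the identity function, so the layer is a pure affine map.

import numpy as np
import cntk

x = cntk.input_variable(3)

with cntk.layers.default_options(activation=cntk.ops.relu):
    # Explicitly passing activation=None overrides the relu default above;
    # None resolves to identity, so the layer stays linear: x @ W + b.
    linear_head = cntk.layers.Dense(2, activation=None)(x)

print(linear_head.eval({x: np.array([[1.0, 2.0, 3.0]], dtype=np.float32)}))

The printed values depend on the random weight initialization; the point is only that no nonlinearity is applied, so negative outputs can appear, which a relu head would have clipped to zero.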



Source: https://stackoverflow.com/questions/41838154/cntk-linear-activation-function-in-layers
