VGG perceptual loss in Keras


The usual way of doing this is to append your VGG network to the end of your model, making sure all of its layers have trainable=False before compiling.

Then you recalculate your Y_train (the targets become VGG features instead of images).

Suppose you have these models:

mainModel - the one you want to apply the loss function to (the model you actually train)
lossModel - the one that computes the part of the loss function you want (for example, a VGG feature extractor); a minimal sketch of both follows
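
For reference, here is a minimal sketch of what those two models could look like. The architectures, layer choice and input size are assumptions for illustration: mainModel is a toy image-to-image network standing in for your real model, and lossModel is a pretrained VGG16 truncated at an intermediate convolutional layer so it outputs feature maps.

from keras.applications.vgg16 import VGG16
from keras.layers import Input, Conv2D
from keras.models import Model

# Hypothetical mainModel: a small image-to-image network (stand-in for your real model)
inp = Input((224, 224, 3))
x = Conv2D(64, 3, padding='same', activation='relu')(inp)
out = Conv2D(3, 3, padding='same', activation='sigmoid')(x)
mainModel = Model(inp, out)

# lossModel: pretrained VGG16 cut at an intermediate layer, used only as a feature extractor
vgg = VGG16(weights='imagenet', include_top=False, input_shape=(224, 224, 3))
lossModel = Model(vgg.input, vgg.get_layer('block3_conv3').output)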

Create a new model by appending one to the other:

from keras.models import Model

lossOut = lossModel(mainModel.output)  # pass the output of one model into the other

fullModel = Model(mainModel.input, lossOut)  # a model for training that follows this particular path through the graph

This model shares the exact same weights as mainModel and lossModel, so training it will affect those models as well.

Make sure lossModel is not trainable before compiling:

lossModel.trainable = False
for l in lossModel.layers:
    l.trainable = False

fullModel.compile(loss='mse', optimizer=....)

Now adjust your training data: the targets must be the lossModel features of the original targets, not the original images themselves:

fullYTrain = lossModel.predict(originalYTrain)

And finally do the training:

fullModel.fit(xTrain, fullYTrain, ....)
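
Once training is done, mainModel already holds the trained weights (it shares them with fullModel), so a plausible inference step is simply to use it directly; xTest here is an assumed placeholder for your test inputs:

predictions = mainModel.predict(xTest)  # mainModel was trained through fullModel, so no extra steps are needed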