Removing and inserting new intermediate layers in a Keras model

xxslljrj · asked 2022-12-23

Given a predefined Keras model, I am trying to first load pre-trained weights, then remove one to three of the model's internal (non-final) layers and replace each with another layer.
I can't seem to find any documentation on keras.io about doing such a thing, or about removing layers from a predefined model at all.
The model I'm using is a good ole VGG-16 network, instantiated in a function as follows:

# Imports shown for completeness
from keras.layers import Input, Conv2D, MaxPooling2D, Flatten, Dense, Dropout
from keras.models import Model

def model(self, output_shape):

    # Prepare image for input to model
    img_input = Input(shape=self._input_shape)

    # Block 1
    x = Conv2D(64, (3, 3), activation='relu', padding='same', name='block1_conv1')(img_input)
    x = Conv2D(64, (3, 3), activation='relu', padding='same', name='block1_conv2')(x)
    x = MaxPooling2D((2, 2), strides=(2, 2), name='block1_pool')(x)

    # Block 2
    x = Conv2D(128, (3, 3), activation='relu', padding='same', name='block2_conv1')(x)
    x = Conv2D(128, (3, 3), activation='relu', padding='same', name='block2_conv2')(x)
    x = MaxPooling2D((2, 2), strides=(2, 2), name='block2_pool')(x)

    # Block 3
    x = Conv2D(256, (3, 3), activation='relu', padding='same', name='block3_conv1')(x)
    x = Conv2D(256, (3, 3), activation='relu', padding='same', name='block3_conv2')(x)
    x = Conv2D(256, (3, 3), activation='relu', padding='same', name='block3_conv3')(x)
    x = MaxPooling2D((2, 2), strides=(2, 2), name='block3_pool')(x)

    # Block 4
    x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block4_conv1')(x)
    x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block4_conv2')(x)
    x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block4_conv3')(x)
    x = MaxPooling2D((2, 2), strides=(2, 2), name='block4_pool')(x)

    # Block 5
    x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block5_conv1')(x)
    x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block5_conv2')(x)
    x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block5_conv3')(x)
    x = MaxPooling2D((2, 2), strides=(2, 2), name='block5_pool')(x)

    # Classification block
    x = Flatten(name='flatten')(x)
    x = Dense(4096, activation='relu', name='fc1')(x)
    x = Dropout(0.5)(x)
    x = Dense(4096, activation='relu', name='fc2')(x)
    x = Dropout(0.5)(x)
    x = Dense(output_shape, activation='softmax', name='predictions')(x)

    inputs = img_input

    # Create model.
    model = Model(inputs, x, name=self._name)

    return model

So, as an example, I would like to take the two Conv layers in Block 1 and replace them with just a single Conv layer, after loading the original weights into all of the other layers.
Any ideas?

pu3pd22g #1

Suppose you have a model vgg16_model, initialized either by the function above or by keras.applications.VGG16(weights='imagenet'). Now you want to insert a new layer in the middle while preserving the weights of the other layers.
The idea is to disassemble the whole network into separate layers and then assemble it back. Here is the code specifically for your task:

from keras import applications
from keras.layers import Conv2D
from keras.models import Model

vgg_model = applications.VGG16(include_top=True, weights='imagenet')

# Disassemble layers
layers = [l for l in vgg_model.layers]

# Defining new convolutional layer.
# Important: the number of filters should be the same!
# Note: the receptive field of two 3x3 convolutions is 5x5.
new_conv = Conv2D(filters=64, 
                  kernel_size=(5, 5),
                  name='new_conv',
                  padding='same')(layers[0].output)

# Now stack everything back
# Note: If you are going to fine tune the model, do not forget to
#       mark other layers as un-trainable

x = new_conv
for i in range(3, len(layers)):
    layers[i].trainable = False
    x = layers[i](x)

# Final touch
result_model = Model(inputs=layers[0].input, outputs=x)
result_model.summary()

The output of the code above is:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_50 (InputLayer)        (None, 224, 224, 3)       0         
_________________________________________________________________
new_conv (Conv2D)            (None, 224, 224, 64)      4864      
_________________________________________________________________
block1_pool (MaxPooling2D)   (None, 112, 112, 64)      0         
_________________________________________________________________
block2_conv1 (Conv2D)        (None, 112, 112, 128)     73856     
_________________________________________________________________
block2_conv2 (Conv2D)        (None, 112, 112, 128)     147584    
_________________________________________________________________
block2_pool (MaxPooling2D)   (None, 56, 56, 128)       0         
_________________________________________________________________
block3_conv1 (Conv2D)        (None, 56, 56, 256)       295168    
_________________________________________________________________
block3_conv2 (Conv2D)        (None, 56, 56, 256)       590080    
_________________________________________________________________
block3_conv3 (Conv2D)        (None, 56, 56, 256)       590080    
_________________________________________________________________
block3_pool (MaxPooling2D)   (None, 28, 28, 256)       0         
_________________________________________________________________
block4_conv1 (Conv2D)        (None, 28, 28, 512)       1180160   
_________________________________________________________________
block4_conv2 (Conv2D)        (None, 28, 28, 512)       2359808   
_________________________________________________________________
block4_conv3 (Conv2D)        (None, 28, 28, 512)       2359808   
_________________________________________________________________
block4_pool (MaxPooling2D)   (None, 14, 14, 512)       0         
_________________________________________________________________
block5_conv1 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_conv2 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_conv3 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_pool (MaxPooling2D)   (None, 7, 7, 512)         0         
_________________________________________________________________
flatten (Flatten)            (None, 25088)             0         
_________________________________________________________________
fc1 (Dense)                  (None, 4096)              102764544 
_________________________________________________________________
fc2 (Dense)                  (None, 4096)              16781312  
_________________________________________________________________
predictions (Dense)          (None, 1000)              4097000   
=================================================================
Total params: 138,323,688
Trainable params: 4,864
Non-trainable params: 138,318,824
_________________________________________________________________
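
If you then fine-tune result_model, only new_conv gets updated, since every reused layer was marked un-trainable and keeps its pre-trained ImageNet weights. A minimal training sketch (x_train and y_train here are hypothetical arrays of shape (n, 224, 224, 3) and (n, 1000), not from the original post):

# Hypothetical fine-tuning sketch: only the new 5x5 convolution learns;
# all reused layers stay frozen with their ImageNet weights.
result_model.compile(optimizer='adam',
                     loss='categorical_crossentropy',
                     metrics=['accuracy'])
result_model.fit(x_train, y_train, batch_size=32, epochs=5)
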
tf7tbtn2 #2

Another way is to build a Sequential model. See the example below, where I swap ReLU layers for PReLU: you simply skip adding the layers you don't want and add the new one in their place.

def convert_model_relu(model):
    from keras.layers.advanced_activations import PReLU
    from keras.activations import linear as linear_activation
    from keras.models import Sequential
    new_model = Sequential()
    # Go through all layers; if a layer has a ReLU activation, replace it with PReLU
    for layer in tuple(model.layers):
        layer_type = type(layer).__name__
        if hasattr(layer, 'activation') and layer.activation.__name__ == 'relu':
            # Set activation to linear, add PReLU
            prelu_name = layer.name + "_prelu"
            prelu = (PReLU(shared_axes=(1, 2), name=prelu_name)
                     if layer_type == "Conv2D" else PReLU(name=prelu_name))
            layer.activation = linear_activation
            new_model.add(layer)
            new_model.add(prelu)
        else:
            new_model.add(layer)
    return new_model
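
As a quick usage sketch (assuming the stock VGG-16; whether Sequential accepts the reused layers directly can depend on your Keras version):

from keras import applications

# Hypothetical usage: swap every ReLU activation in VGG-16 for a PReLU.
vgg = applications.VGG16(include_top=True, weights='imagenet')
prelu_model = convert_model_relu(vgg)
prelu_model.summary()  # the converted model now lists the added *_prelu layers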
