Given a predefined Keras model, I am trying to first load pre-trained weights, then remove one to three of the model's internal (non-final) layers and replace them with another layer.
I can't seem to find any documentation on keras.io about doing this, or about removing layers from a predefined model at all.
The model I'm using is the good ole VGG-16 network, instantiated in a function as follows:
# Requires at module level:
#   from keras.layers import Input, Conv2D, MaxPooling2D, Flatten, Dense, Dropout
#   from keras.models import Model
def model(self, output_shape):
    # Prepare image for input to model
    img_input = Input(shape=self._input_shape)
    # Block 1
    x = Conv2D(64, (3, 3), activation='relu', padding='same', name='block1_conv1')(img_input)
    x = Conv2D(64, (3, 3), activation='relu', padding='same', name='block1_conv2')(x)
    x = MaxPooling2D((2, 2), strides=(2, 2), name='block1_pool')(x)
    # Block 2
    x = Conv2D(128, (3, 3), activation='relu', padding='same', name='block2_conv1')(x)
    x = Conv2D(128, (3, 3), activation='relu', padding='same', name='block2_conv2')(x)
    x = MaxPooling2D((2, 2), strides=(2, 2), name='block2_pool')(x)
    # Block 3
    x = Conv2D(256, (3, 3), activation='relu', padding='same', name='block3_conv1')(x)
    x = Conv2D(256, (3, 3), activation='relu', padding='same', name='block3_conv2')(x)
    x = Conv2D(256, (3, 3), activation='relu', padding='same', name='block3_conv3')(x)
    x = MaxPooling2D((2, 2), strides=(2, 2), name='block3_pool')(x)
    # Block 4
    x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block4_conv1')(x)
    x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block4_conv2')(x)
    x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block4_conv3')(x)
    x = MaxPooling2D((2, 2), strides=(2, 2), name='block4_pool')(x)
    # Block 5
    x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block5_conv1')(x)
    x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block5_conv2')(x)
    x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block5_conv3')(x)
    x = MaxPooling2D((2, 2), strides=(2, 2), name='block5_pool')(x)
    # Classification block
    x = Flatten(name='flatten')(x)
    x = Dense(4096, activation='relu', name='fc1')(x)
    x = Dropout(0.5)(x)
    x = Dense(4096, activation='relu', name='fc2')(x)
    x = Dropout(0.5)(x)
    x = Dense(output_shape, activation='softmax', name='predictions')(x)
    inputs = img_input
    # Create model.
    model = Model(inputs, x, name=self._name)
    return model
So, as an example, I would like to take the two Conv layers in Block 1 and replace them with just a single Conv layer, while loading the original weights into all of the other layers.
Any ideas?
2 Answers
Answer 1 (pu3pd22g)
Suppose you have a model vgg16_model, initialized either by the function above or by keras.applications.VGG16(weights='imagenet'). You now need to insert a new layer in the middle in such a way that the weights of the other layers are preserved. The idea is to disassemble the whole network into its separate layers and then assemble it back. The code below is tailored to your task.
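A minimal sketch of that disassemble-and-reassemble approach (it assumes keras.applications.VGG16 for the pre-trained weights; the 5x5 kernel size, the layer name new_block1_conv, and the variable names vgg16_model / new_model are illustrative choices, not from the original poster's code):

from keras.applications import VGG16
from keras.layers import Conv2D
from keras.models import Model

# Start from a VGG-16 with pre-trained ImageNet weights.
vgg16_model = VGG16(include_top=True, weights='imagenet')

# Disassemble the network into a flat list of layers.
layers = [l for l in vgg16_model.layers]

# Replacement layer: a single 5x5 convolution standing in for the two 3x3
# convolutions of Block 1 (their stacked receptive field is 5x5). The filter
# count (64) must match the original block so the rest of the network fits.
new_conv = Conv2D(64, (5, 5), activation='relu', padding='same',
                  name='new_block1_conv')(layers[0].output)

# Reassemble, skipping the two original Block 1 convolutions (indices 1 and 2).
# Calling the existing layer objects on the new tensor reuses their weights.
x = new_conv
for layer in layers[3:]:
    layer.trainable = False  # optional: freeze the reused layers for fine-tuning
    x = layer(x)

new_model = Model(layers[0].input, x, name='vgg16_modified')
new_model.summary()

Because the reassembled model reuses the very same layer objects, every layer after the replacement keeps its pre-trained weights; only the new convolution starts from a random initialization and needs training.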
The output of the above code is the printed model summary, showing the new convolution in place of the two original Block 1 convolutions and the remaining layers unchanged.
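If you want to be sure the pre-trained weights really carried over, a quick check along these lines (using the vgg16_model / new_model names from the sketch above) compares one of the reused layers:

import numpy as np

# The rebuilt model shares its layer objects with the original, so the
# kernels of every reused layer are identical.
old_kernel = vgg16_model.get_layer('block3_conv1').get_weights()[0]
new_kernel = new_model.get_layer('block3_conv1').get_weights()[0]
print(np.allclose(old_kernel, new_kernel))  # prints True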
Answer 2 (tf7tbtn2)
Another approach is to build a Sequential model. See the example below, where I swap the ReLU layers for PReLU. You simply don't add the layers you don't want, and add the new layer instead.
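A minimal sketch of that Sequential rebuild (the helper name swap_relu_for_prelu and the choice to share the PReLU parameters across the spatial axes of conv layers are mine, not from the answer):

from keras.activations import linear
from keras.layers import PReLU
from keras.models import Sequential

def swap_relu_for_prelu(model):
    """Rebuild `model` as a Sequential stack, replacing every fused ReLU
    activation with a separate, trainable PReLU layer."""
    new_model = Sequential()
    for layer in model.layers:
        if hasattr(layer, 'activation') and layer.activation.__name__ == 'relu':
            # Drop the built-in ReLU but keep the layer (and its weights),
            # then append a PReLU right after it.
            layer.activation = linear
            new_model.add(layer)
            if type(layer).__name__ == 'Conv2D':
                # Share the PReLU slope across the spatial dimensions.
                new_model.add(PReLU(shared_axes=[1, 2], name=layer.name + '_prelu'))
            else:
                new_model.add(PReLU(name=layer.name + '_prelu'))
        else:
            new_model.add(layer)
    return new_model

Calling swap_relu_for_prelu on, say, the vgg16_model from the first answer gives back a model with the same convolution weights but learnable activations; note that it also resets the activation attribute on the original model's layers, because the layer objects are shared.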