tensorflow - How do I replace (or insert) an intermediate layer in a Keras model?

utugiqy6  asked on 2023-02-19  in Other
Follow (0) | Answers (4) | Views (244)

I have a trained Keras model and I would like to:
1) replace the Conv2D layers with identical layers but without bias;
2) add a BatchNormalization layer before the first Activation.
How can I do this?

def keras_simple_model():
    from keras.models import Model
    from keras.layers import Input, Dense,  GlobalAveragePooling2D
    from keras.layers import Conv2D, MaxPooling2D, Activation

    inputs1 = Input((28, 28, 1))
    x = Conv2D(4, (3, 3), activation=None, padding='same', name='conv1')(inputs1)
    x = Activation('relu')(x)
    x = Conv2D(4, (3, 3), activation=None, padding='same', name='conv2')(x)
    x = Activation('relu')(x)
    x = MaxPooling2D((2, 2), strides=(2, 2), name='pool1')(x)

    x = Conv2D(8, (3, 3), activation=None, padding='same', name='conv3')(x)
    x = Activation('relu')(x)
    x = Conv2D(8, (3, 3), activation=None, padding='same', name='conv4')(x)
    x = Activation('relu')(x)
    x = MaxPooling2D((2, 2), strides=(2, 2), name='pool2')(x)

    x = GlobalAveragePooling2D()(x)
    x = Dense(10, activation=None)(x)
    x = Activation('softmax')(x)

    model = Model(inputs=inputs1, outputs=x)
    return model

if __name__ == '__main__':
    model = keras_simple_model()
    model.summary()

dnph8jn4 1#

The following function lets you insert a new layer *before* or *after* each layer whose name matches a regular expression, or *replace* it, in the original model, including non-sequential models such as DenseNet or ResNet.

import re
from keras.models import Model

def insert_layer_nonseq(model, layer_regex, insert_layer_factory,
                        insert_layer_name=None, position='after'):

    # Auxiliary dictionary to describe the network graph
    network_dict = {'input_layers_of': {}, 'new_output_tensor_of': {}}

    # Set the input layers of each layer
    for layer in model.layers:
        for node in layer._outbound_nodes:
            layer_name = node.outbound_layer.name
            if layer_name not in network_dict['input_layers_of']:
                network_dict['input_layers_of'].update(
                        {layer_name: [layer.name]})
            else:
                network_dict['input_layers_of'][layer_name].append(layer.name)

    # Set the output tensor of the input layer
    network_dict['new_output_tensor_of'].update(
            {model.layers[0].name: model.input})

    # Iterate over all layers after the input
    model_outputs = []
    for layer in model.layers[1:]:

        # Determine input tensors
        layer_input = [network_dict['new_output_tensor_of'][layer_aux] 
                for layer_aux in network_dict['input_layers_of'][layer.name]]
        if len(layer_input) == 1:
            layer_input = layer_input[0]

        # Insert layer if name matches the regular expression
        if re.match(layer_regex, layer.name):
            if position == 'replace':
                x = layer_input
            elif position == 'after':
                x = layer(layer_input)
            elif position == 'before':
                x = layer_input
            else:
                raise ValueError('position must be: before, after or replace')

            new_layer = insert_layer_factory()
            # layer.name is read-only in recent Keras versions, so set
            # the underlying attribute instead
            if insert_layer_name:
                new_layer._name = insert_layer_name
            else:
                new_layer._name = '{}_{}'.format(layer.name,
                                                 new_layer.name)
            x = new_layer(x)
            print('New layer: {} Old layer: {} Type: {}'.format(new_layer.name,
                                                            layer.name, position))
            if position == 'before':
                x = layer(x)
        else:
            x = layer(layer_input)

        # Set new output tensor (the original one, or the one of the inserted
        # layer)
        network_dict['new_output_tensor_of'].update({layer.name: x})

        # Save tensor in output list if it is output in initial model
        if layer.name in model.output_names:
            model_outputs.append(x)

    return Model(inputs=model.inputs, outputs=model_outputs)

The difference from a simple purely sequential model is that the graph is parsed first: before iterating over the layers to find the key ones, each layer's input layers are stored in an auxiliary dictionary. Then, while iterating over the layers, the new output tensor of each layer is also stored, which is used to determine the input tensors of each layer when building the new model.
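This bookkeeping can be illustrated without Keras. The following is a minimal, self-contained sketch in which strings stand in for tensors and plain names stand in for layers (none of these are real Keras objects): first build the `input_layers_of` map from the graph edges, then walk the layers in order, looking each layer's inputs up in the dictionary of new output tensors.

```python
# Toy illustration of the rewiring bookkeeping: strings stand in
# for tensors, and bare names stand in for real Keras layers.
edges = [('input', 'conv1'), ('conv1', 'act1'), ('act1', 'conv2')]

# Map each layer to the layers feeding into it.
input_layers_of = {}
for src, dst in edges:
    input_layers_of.setdefault(dst, []).append(src)

# Walk the layers in topological order, recording each layer's
# (possibly rewired) output tensor.
new_output_tensor_of = {'input': 'input_tensor'}
for layer in ['conv1', 'act1', 'conv2']:
    inputs = [new_output_tensor_of[src] for src in input_layers_of[layer]]
    new_output_tensor_of[layer] = '{}({})'.format(layer, inputs[0])

print(new_output_tensor_of['conv2'])  # conv2(act1(conv1(input_tensor)))
```

Inserting or replacing a layer then amounts to storing a different tensor in `new_output_tensor_of` for the matched layer; every downstream layer picks it up automatically.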
A use case is shown below, inserting a Dropout layer after every activation layer of ResNet50:

from keras.applications.resnet50 import ResNet50
from keras.layers import Dropout
from keras.models import load_model

model = ResNet50()
def dropout_layer_factory():
    return Dropout(rate=0.2, name='dropout')
model = insert_layer_nonseq(model, '.*activation.*', dropout_layer_factory)

# Fix possible problems with new model
model.save('temp.h5')
model = load_model('temp.h5')

model.summary()
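Note that `re.match` anchors the pattern at the start of the layer name, so it is worth checking which names a pattern actually selects. A quick self-contained check (toy layer names like those produced by `keras_simple_model`; no Keras needed):

```python
import re

# Toy layer names resembling those in the models above.
layer_names = ['conv1', 'activation_1', 'conv2', 'activation_2', 'pool1']
pattern = '.*activation.*'  # same pattern as the ResNet50 example

# re.match anchors at the start, so the leading .* lets the
# substring occur anywhere in the name.
matched = [name for name in layer_names if re.match(pattern, name)]
print(matched)  # ['activation_1', 'activation_2']
```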

huwehgph 2#

You can use the following functions:

def replace_intermediate_layer_in_keras(model, layer_id, new_layer):
    from keras.models import Model

    layers = [l for l in model.layers]

    x = layers[0].output
    for i in range(1, len(layers)):
        if i == layer_id:
            x = new_layer(x)
        else:
            x = layers[i](x)

    new_model = Model(inputs=layers[0].input, outputs=x)
    return new_model

def insert_intermediate_layer_in_keras(model, layer_id, new_layer):
    from keras.models import Model

    layers = [l for l in model.layers]

    x = layers[0].output
    for i in range(1, len(layers)):
        if i == layer_id:
            x = new_layer(x)
        x = layers[i](x)

    new_model = Model(inputs=layers[0].input, outputs=x)
    return new_model
**Example**:

if __name__ == '__main__':
    from keras.layers import Conv2D, BatchNormalization
    model = keras_simple_model()
    model.summary()
    model = replace_intermediate_layer_in_keras(
        model, 3,
        Conv2D(4, (3, 3), activation=None, padding='same',
               name='conv2_repl', use_bias=False))
    model.summary()
    model = insert_intermediate_layer_in_keras(model, 4, BatchNormalization())
    model.summary()

There are some limitations on replacements due to layer shapes and the like.
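These index-based helpers work only because the model is a single chain of layers. The same replace/insert logic can be sketched on a plain list of callables, Keras-free:

```python
# Keras-free sketch of the two helpers above: a "model" is just a
# linear pipeline of callables, addressed by index.
def replace_at(fns, i, new_fn):
    return fns[:i] + [new_fn] + fns[i + 1:]

def insert_before(fns, i, new_fn):
    return fns[:i] + [new_fn] + fns[i:]

def run(fns, x):
    for f in fns:
        x = f(x)
    return x

pipeline = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3]

replaced = run(replace_at(pipeline, 1, lambda x: x * 10), 5)
print(replaced)  # (5 + 1) * 10 - 3 = 57

inserted = run(insert_before(pipeline, 2, lambda x: x + 100), 5)
print(inserted)  # (5 + 1) * 2 + 100 - 3 = 109
```

As soon as a layer has more than one input or output, the single running `x` no longer suffices, which is why the graph-walking approach of the first answer is needed for non-sequential models.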


6pp0gazn 3#

Here is how I did it:

from keras.models import Model

def make_list(X):
    if isinstance(X, list):
        return X
    return [X]

def list_no_list(X):
    if len(X) == 1:
        return X[0]
    return X

def replace_layer(model, replace_layer_subname, replacement_fn, **kwargs):
    """
    args:
        model :: keras.models.Model instance
        replace_layer_subname :: str -- if str in layer name, replace it
        replacement_fn :: fn to call to replace all instances
            > fn output must produce shape as the replaced layers input
    returns:
        new model with replaced layers
    quick examples:
        want to just remove all layers with 'batch_norm' in the name:
            > new_model = replace_layer(model, 'batch_norm', lambda **kwargs : (lambda u:u))
        want to replace all Conv1D(N, m, padding='same') with an LSTM (lets say all have 'conv1d' in name)
            > new_model = replace_layer(model, 'conv1d', lambda old_layer, **kwargs: LSTM(units=old_layer.filters, return_sequences=True))
    """
    model_inputs = []
    model_outputs = []
    tsr_dict = {}

    model_output_names = [out.name for out in make_list(model.output)]

    for i, layer in enumerate(model.layers):
        ### Loop if layer is used multiple times
        for j in range(len(layer._inbound_nodes)):

            ### check layer inp/outp
            inpt_names = [inp.name for inp in make_list(layer.get_input_at(j))]
            outp_names = [out.name for out in make_list(layer.get_output_at(j))]

            ### setup model inputs
            if 'input' in layer.name:
                for inpt_tsr in make_list(layer.get_output_at(j)):
                    model_inputs.append(inpt_tsr)
                    tsr_dict[inpt_tsr.name] = inpt_tsr
                continue

            ### setup layer inputs
            inpt = list_no_list([tsr_dict[name] for name in inpt_names])

            ### remake layer 
            if replace_layer_subname in layer.name:
                print('replacing '+layer.name)
                x = replacement_fn(old_layer=layer, **kwargs)(inpt)
            else:
                x = layer(inpt)

            ### reinstantialize outputs into dict
            for name, out_tsr in zip(outp_names, make_list(x)):

                ### check if is an output
                if name in model_output_names:
                    model_outputs.append(out_tsr)
                tsr_dict[name] = out_tsr

    return Model(model_inputs, model_outputs)

I have a custom layer called BatchNormalizationFreeze (taken from another user online), so here is a usage example:

new_model = replace_layer(model, 'batch_normal', lambda **kwargs: BatchNormalizationFreeze())

If you need to replace multiple layers at once, swap the replacement function for a small sub-model so all of those layers are handled in a single pass.
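The `replacement_fn` protocol the function expects can be sketched without Keras: it is called as `replacement_fn(old_layer=layer, **kwargs)` and must return a callable that is then applied to the input tensor. A toy illustration (`FakeConv` is a stand-in for a Keras layer, not a real one):

```python
# Stand-in for a Keras layer: holds config and is callable on an input.
class FakeConv:
    def __init__(self, filters):
        self.filters = filters

    def __call__(self, x):
        return 'conv{}({})'.format(self.filters, x)

# A replacement_fn receives the old layer (so it can copy its config)
# and returns the callable that takes its place in the graph.
def replacement_fn(old_layer, **kwargs):
    return lambda x: 'lstm{}({})'.format(old_layer.filters, x)

old = FakeConv(8)
result = replacement_fn(old_layer=old)('inp')
print(result)  # lstm8(inp)
```

This is why the docstring's identity trick (`lambda **kwargs: (lambda u: u)`) removes a layer: the returned callable simply passes the input tensor through unchanged.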


sshcrbum 4#

Unfortunately, replacing a layer is no easy task for models that do not follow the sequential pattern. For sequential models it is straightforward: just do x = layer(x) and substitute new_layer wherever you see fit, as in the previous answers. For models without the classic sequential pattern (say you have a simple "concatenation" of two branches), you have to actually parse the graph and apply your "new_layer" (or layers) in the right places. Hopefully this is not too frustrating; happy graph parsing and reconstruction :)
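For a concrete picture of what "parsing the graph" means for a two-branch model, here is a Keras-free sketch: the graph maps each node to its parents, nodes are visited in topological order, and a replacement table swaps in new ops where needed (all names and ops are toy stand-ins, not Keras objects):

```python
# Toy two-branch graph: each node maps to its parent nodes.
graph = {'left': ['input'], 'right': ['input'], 'concat': ['left', 'right']}

# Original ops and a table of replacements, keyed by node name.
ops = {
    'left': lambda xs: 'left({})'.format(xs[0]),
    'right': lambda xs: 'right({})'.format(xs[0]),
    'concat': lambda xs: 'concat({}, {})'.format(xs[0], xs[1]),
}
replacements = {'right': lambda xs: 'new_right({})'.format(xs[0])}

# Visit nodes in topological order, rewiring through the output dict.
outputs = {'input': 'x'}
for name in ['left', 'right', 'concat']:
    op = replacements.get(name, ops[name])
    outputs[name] = op([outputs[p] for p in graph[name]])

print(outputs['concat'])  # concat(left(x), new_right(x))
```

The multi-parent `concat` node is exactly the case where the purely index-based helpers break down and a dictionary of rebuilt outputs becomes necessary.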
