How to iterate over layers in PyTorch

zxlwwiss · asked on 2023-05-22

Suppose I have a network model object named m. I have no prior information about how many layers this network has. How can I create a for loop to iterate over its layers? I am looking for something like:

Weight=[]
for layer in m._modules:
    Weight.append(layer.weight)

ulmd4ohb1#

Suppose you have the following neural network.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):

    def __init__(self):
        super(Net, self).__init__()
        # 1 input image channel, 6 output channels, 5x5 square convolution
        # kernel
        self.conv1 = nn.Conv2d(1, 6, 5)
        self.conv2 = nn.Conv2d(6, 16, 5)
        # an affine operation: y = Wx + b
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x):
        # the forward pass is omitted here; it is not needed for
        # iterating over the registered layers and parameters below
        return x

Now, let's print the size of the weight parameters associated with each layer of the network.

model = Net()
for name, param in model.named_parameters():
    print(name, param.size())

Output

conv1.weight torch.Size([6, 1, 5, 5])
conv1.bias torch.Size([6])
conv2.weight torch.Size([16, 6, 5, 5])
conv2.bias torch.Size([16])
fc1.weight torch.Size([120, 400])
fc1.bias torch.Size([120])
fc2.weight torch.Size([84, 120])
fc2.bias torch.Size([84])
fc3.weight torch.Size([10, 84])
fc3.bias torch.Size([10])

I hope you can extend this example to fit your needs.
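For instance, the Weight list from the question could be built like this (a minimal sketch reusing the model defined above; checking for the .weight suffix is just one way to skip the bias parameters):

weights = []
for name, param in model.named_parameters():
    if name.endswith('.weight'):  # keep weight tensors, skip biases
        weights.append(param)

print(len(weights))  # 5: conv1, conv2, fc1, fc2, fc3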


nuypyhwy2#

Assuming m is your module, you can do:

for layer in m.children():
    weights = list(layer.parameters())
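Note that weights is overwritten on every pass of that loop. To collect the weight tensor of each immediate child instead, a sketch along these lines works (the hasattr check is there because layers such as activations register no weight):

all_weights = []
for layer in m.children():
    # only layers like nn.Conv2d / nn.Linear expose a .weight tensor
    if hasattr(layer, 'weight'):
        all_weights.append(layer.weight)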

6ss1mwsb3#

You can use the children method:

for module in model.children():
    # ...

Or, if you want to flatten Sequential layers:

for module in model.modules():
    if not isinstance(module, nn.Sequential):
        # ...
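To see the difference, here is a small self-contained sketch (the toy net is made up for illustration): children() stops at the immediate submodules, while modules() also recurses into the nested Sequential.

import torch.nn as nn

net = nn.Sequential(
    nn.Sequential(nn.Linear(4, 4), nn.ReLU()),  # nested block
    nn.Linear(4, 1),
)

print([type(c).__name__ for c in net.children()])
# ['Sequential', 'Linear']
print([type(m).__name__ for m in net.modules()])
# ['Sequential', 'Sequential', 'Linear', 'ReLU', 'Linear']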

lyr7nygr4#

You can simply get them with model.named_parameters(), which returns a generator you can iterate over to get each tensor, its name, and so on.
Here is the code for a pretrained resnet model:

In [105]: import torchvision

In [106]: resnet = torchvision.models.resnet101(pretrained=True)

In [107]: for name, param in resnet.named_parameters(): 
     ...:     print(name, param.shape)

This will output:

conv1.weight torch.Size([64, 3, 7, 7])
bn1.weight torch.Size([64])
bn1.bias torch.Size([64])
layer1.0.conv1.weight torch.Size([64, 64, 1, 1])
layer1.0.bn1.weight torch.Size([64])
layer1.0.bn1.bias torch.Size([64])
........
........ and so on

You can find some discussion of this topic at how-to-manipulate-layer-parameters-by-its-names/.
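As one example of manipulating parameters by name, here is a sketch that freezes everything in the resnet above except its final classifier layer (the fc. prefix matches fc.weight and fc.bias in torchvision's ResNet):

for name, param in resnet.named_parameters():
    # train only the classifier head, freeze the rest
    param.requires_grad = name.startswith('fc.')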


dm7nw8vv5#

You can also do this:

for name, m in mdl.named_children():
    print(name)
    print(list(m.parameters()))  # materialize the generator so the tensors print

Reference:

# https://discuss.pytorch.org/t/how-to-get-the-module-names-of-nn-sequential/39682
# looping through modules but get the one with a specific name

import torch
import torch.nn as nn

from collections import OrderedDict

params = OrderedDict([
    ('fc0', nn.Linear(in_features=4,out_features=4)),
    ('ReLU0', nn.ReLU()),
    ('fc1L:final', nn.Linear(in_features=4,out_features=1))
])
mdl = nn.Sequential(params)

# throws error
# mdl['fc0']

for m in mdl.children():
    print(m)

print()

for m in mdl.modules():
    print(m)

print()

for name, m in mdl.named_modules():
    print(name)
    print(m)

print()

for name, m in mdl.named_children():
    print(name)
    print(m)
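
Since string indexing like mdl['fc0'] raises a TypeError on nn.Sequential, here is a sketch of two ways to look a submodule up by name, using the mdl defined above:

fc0 = getattr(mdl, 'fc0')                      # attribute lookup by name
fc0_again = dict(mdl.named_children())['fc0']  # build a name -> module dict
print(fc0 is fc0_again)                        # True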
