I work at Google, so I assume it's the current version of PyTorch. I tried this:
```python
class Fc(nn.Module):
    def __init__(self):
        super(Fc, self).__init__()
        self.flatt = nn.Flatten()
        self.seq = nn.Sequential(nn.Linear(28*28, 512), nn.ReLU(),
                                 nn.Linear(512, 512), nn.ReLu(),
                                 nn.Linear(512, 10), nn.ReLu())

    def forward(x):
        p = self.flatt(x)
        p = self.seq(p)
        return p

m1 = Fc()
```
and got:
```
<ipython-input-85-142a1e77b6b6> in <module>()
----> 1 m1 = Fc()

<ipython-input-84-09df3be0b613> in __init__(self)
      4         self.flatt = nn.Flatten()
      5         self.relu = torch.nn.modules.activation.ReLU()
----> 6         self.seq = nn.Sequential(nn.Linear(28*28, 1012), nn.ReLU(), nn.Linear(1012, 512), nn.ReLu(), nn.Linear(512, 10), nn.ReLu())

AttributeError: module 'torch.nn' has no attribute 'ReLu'
```
What am I doing wrong here?
Posted on 2022-04-08 11:25:10
You made a mistake with the casing: the module is called `ReLU`, not `ReLu`.
With the casing corrected (and `self` added as the first parameter of `forward`, which your snippet is also missing), the class builds without the `AttributeError`:

```python
import torch.nn as nn

class Fc(nn.Module):
    def __init__(self):
        super(Fc, self).__init__()
        self.flatt = nn.Flatten()
        self.seq = nn.Sequential(nn.Linear(28*28, 512),
                                 nn.ReLU(),  # ReLU, capital U
                                 nn.Linear(512, 512),
                                 nn.ReLU(),  # ReLU, capital U
                                 nn.Linear(512, 10),
                                 nn.ReLU())

    def forward(self, x):  # forward needs self as its first argument
        p = self.flatt(x)
        p = self.seq(p)
        return p
```
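As a quick sanity check (a minimal sketch; the dummy input shape assumes MNIST-sized 28x28 images, matching the `28*28` input features above), you can build the same layer stack with the corrected `nn.ReLU` casing and push a random batch through it:

```python
import torch
import torch.nn as nn

# Same layers as above, written as a single Sequential with the correct casing
model = nn.Sequential(nn.Flatten(),
                      nn.Linear(28*28, 512), nn.ReLU(),
                      nn.Linear(512, 512), nn.ReLU(),
                      nn.Linear(512, 10), nn.ReLU())

x = torch.randn(4, 1, 28, 28)  # dummy batch of 4 "MNIST-sized" images
out = model(x)
print(out.shape)               # torch.Size([4, 10])
```

Note that `torch.nn` exports the class `ReLU` (and the functional form `torch.nn.functional.relu`), so any other capitalization raises the `AttributeError` you saw.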