
nn.PReLU(planes)


The PReLU activation function, and its internal implementation in the PyTorch source:

# torch/nn/modules/activation.py
class PReLU(Module):
    def __init__(self, num_parameters: int = 1, init: float = 0.25) -> None:
        self.num_parameters = num_parameters
        super(PReLU, self).__init__()
        self.weight = Parameter(torch.Tensor(num_parameters).fill_(init))

    def forward(self, input: Tensor) -> Tensor:
        return F.prelu(input, self.weight)

# torch/nn/functional.py
def prelu(input, weight):
    # type: (Tensor, Tensor) -> Tensor
    r"""prelu(input, weight) -> Tensor

    Applies element-wise the function
    :math:`\text{PReLU}(x) = \max(0,x) + \text{weight} * \min(0,x)` where weight is a
    learnable parameter.

    See :class:`~torch.nn.PReLU` for more details.
    """
    if not torch.jit.is_scripting():
        if type(input) is not Tensor and has_torch_function((input,)):
            return handle_torch_function(prelu, (input,), input, weight)
    return torch.prelu(input, weight)
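As a quick illustration (a minimal sketch, not part of the quoted source; the channel count and tensor shapes are made up for the example), constructing the module with a channel count shows the per-channel learnable slopes:

import torch
import torch.nn as nn

planes = 8                                # hypothetical channel count
prelu = nn.PReLU(planes)                  # one learnable slope per channel
print(prelu.weight.shape)                 # torch.Size([8])
print(prelu.weight[0].item())             # 0.25, the default init

x = torch.randn(4, planes, 16, 16)        # an NCHW feature map
y = prelu(x)                              # slope i is applied to channel i
print(y.shape)                            # torch.Size([4, 8, 16, 16])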

Pasting the formula from the docstring above into Typora, it renders as:

$$\text{PReLU}(x) = \max(0, x) + \text{weight} \cdot \min(0, x)$$

Here weight is a learnable parameter. When the module is constructed as nn.PReLU(planes), only the channel count is passed in, so one slope is learned per channel; the implementation initializes every entry of weight to 0.25.

This is consistent with the familiar piecewise form of the PReLU formula:

$$\text{PReLU}(x_i) = \begin{cases} x_i, & x_i \ge 0 \\ a_i \, x_i, & x_i < 0 \end{cases}$$
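A small sanity check (again a sketch, not from the original post) confirms that the max/min form in the docstring matches the module's output:

import torch
import torch.nn as nn

prelu = nn.PReLU()                        # single shared slope, initialized to 0.25
x = torch.randn(5)
w = prelu.weight                          # shape [1], broadcasts over x

# max(0, x) + weight * min(0, x), computed by hand
manual = torch.clamp(x, min=0) + w * torch.clamp(x, max=0)
print(torch.allclose(prelu(x), manual))   # True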
