
Converting PyTorch to ONNX


Convert a PyTorch model to ONNX format:

import torch

def Torch2Onnx(model, input_size, output_name, istrained=True):
    '''
    :param model: the network to export
    :param input_size: e.g. (244, 244)
    :param output_name: e.g. "test_output"
    :param istrained: whether the model being exported is trained. Default: True
    '''
    # Dummy input with the desired spatial size; the model is assumed to be on CUDA.
    x = torch.randn(1, 3, input_size[0], input_size[1]).cuda()
    if istrained:
        torch.onnx.export(model, x, output_name, verbose=True)
    else:
        # Only export an untrained model, i.e. without the learned parameters.
        torch.onnx.export(model, x, output_name, export_params=False, verbose=True)

Usage example:

model = Model()   # Model is your network class
model.load_state_dict(torch.load(weight_path))
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)
input_size = (384, 288)
Torch2Onnx(model, input_size, "test.onnx")

Getting the params of the model:

Note: the methods below differ in whether they expect the model on the CPU or on CUDA by default. If you hit an error like RuntimeError: Input type (torch.FloatTensor) and weight type (torch.cuda.FloatTensor) should be the same, check whether the model's weights and the input tensor are on the same device, i.e. whether the weights should be on CUDA.
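A minimal sketch of the usual fix (my addition, not from the original post; Model and weight_path stand in for your own class and checkpoint): create the dummy input on the same device the model's weights live on before exporting.

import torch

model = Model()  # hypothetical network class
model.load_state_dict(torch.load(weight_path))
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

# Building the dummy input on the same device as the weights avoids the
# "Input type ... and weight type ... should be the same" RuntimeError.
dummy = torch.randn(1, 3, 384, 288, device=device)
torch.onnx.export(model, dummy, "test.onnx", verbose=True)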

Method 1: using torchsummary

Install torchsummary with pip:
pip install torchsummary

Code snippet:

from torchsummary import summary

model = Model()
model.load_state_dict(torch.load(weight_path))
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)
summary(model, (3, 384, 288))
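A note of my own (not in the original): torchsummary builds its dummy input on CUDA by default, so if you keep the model on the CPU you can pass the device explicitly; summary accepts a device argument in current torchsummary releases.

from torchsummary import summary

# When the model stays on the CPU, tell torchsummary so; otherwise its
# CUDA dummy input triggers the device-mismatch error described above.
summary(model, (3, 384, 288), device="cpu")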

Method 2: using torchstat

Install torchstat with pip:
pip install torchstat

Code snippet (much the same as summary):

from torchstat import stat

model = Model()
model.load_state_dict(torch.load(weight_path))
# torchstat profiles the model on the CPU; the model is not moved to CUDA here.
stat(model, (3, 384, 288))
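A self-contained quick check (my addition; it assumes torchvision is installed), following the example model used in torchstat's own documentation. stat prints a per-layer table plus totals for parameters, memory, FLOPs and MAdd.

from torchstat import stat
from torchvision.models import resnet18

# Profile a standard model at 224x224 to see the per-layer table and totals.
stat(resnet18(), (3, 224, 224))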

Method 3: using thop (not particularly recommended)

Install thop with pip:
pip install thop

Code snippet:

import torch
from thop import profile, clever_format

model = Model()
model.load_state_dict(torch.load(weight_path))
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)
# thop needs an actual example input, passed through the inputs tuple.
dummy = torch.randn(1, 3, 384, 288).to(device)
flops, params = profile(model, inputs=(dummy,))
flops, params = clever_format([flops, params], "%.3f")
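A self-contained example (my addition; assumes torchvision is installed), following the usage pattern in thop's own README. Note that the first value profile returns is a multiply-accumulate (MAC) count rather than a strict FLOP count, so it can differ by roughly a factor of two from other tools.

import torch
from thop import profile, clever_format
from torchvision.models import resnet18

model = resnet18()
dummy = torch.randn(1, 3, 224, 224)
macs, params = profile(model, inputs=(dummy,))
macs, params = clever_format([macs, params], "%.3f")
print(macs, params)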
