I got different results from the onnx and ncnn models even when the input is an all-zeros mat. (Already checked the FAQ)
model.zip
Password:0000
Here's the onnx and ncnn model. (You can also try to regenerate the ncnn model from the onnx file.)
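For reference, regenerating the ncnn files from the onnx would go roughly like this, using the stock tools that ship with ncnn; the output filenames here are just assumptions matching the ones loaded in the test below:

```shell
# convert the onnx graph to ncnn param/bin
onnx2ncnn model.onnx model.param model.bin
# fuse and optimize the graph; the trailing 0 keeps fp32 weights (1 would store fp16)
ncnnoptimize model.param model.bin model-opt.param model-opt.bin 0
```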
@nihui
A simple test:
#include &lt;vector&gt;
#include &lt;opencv2/core/core.hpp&gt;
#include "net.h"

int main(int argc, char** argv)
{
    ncnn::UnlockedPoolAllocator ncnn_blob_pool_allocator_;
    ncnn::PoolAllocator ncnn_workspace_pool_allocator_;
    ncnn::Net ncnn_detector_;
    ncnn::Option opt;
    opt.blob_allocator = &ncnn_blob_pool_allocator_;
    opt.workspace_allocator = &ncnn_workspace_pool_allocator_;
    ncnn_detector_.opt = opt;
    ncnn_detector_.load_param("model-opt.param");
    ncnn_detector_.load_model("model-opt.bin");

    // all-zeros BGR input, 56x56; from_pixels expects 8-bit pixel data, so use CV_8UC3
    cv::Mat ncnn_cv_mat(56, 56, CV_8UC3, cv::Scalar(0, 0, 0));
    ncnn::Mat ncnn_in = ncnn::Mat::from_pixels(ncnn_cv_mat.data, ncnn::Mat::PIXEL_BGR, ncnn_cv_mat.cols, ncnn_cv_mat.rows);

    auto ncnn_extractor = ncnn_detector_.create_extractor();
    ncnn_extractor.input("main_input", ncnn_in);
    ncnn::Mat ncnn_out;
    ncnn_extractor.extract("class_ret", ncnn_out);

    std::vector<float> cls_scores(ncnn_out.w);
    for (int j = 0; j < ncnn_out.w; j++)
    {
        cls_scores[j] = ncnn_out[j];
    }
    return 0;
}
NCNN output: 0.43, -0.66
import onnxruntime as ort
import numpy as np
from PIL import Image
img = Image.new("RGB", (56, 56), "black")
inputs = np.array(img, dtype=np.float32)
inputs = inputs.transpose(2, 0, 1)
inputs = np.expand_dims(inputs, axis=0)
ort_sess = ort.InferenceSession('model.onnx')
outputs = ort_sess.run(None, {'main_input': inputs })
print(outputs)
ONNX output (and other conversions, e.g. tflite): -0.05, -0.09
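To make it concrete that this is a conversion problem and not float rounding, here is a small numpy sketch comparing the two output vectors quoted above; the tolerance is an arbitrary choice on my part:

```python
import numpy as np

# outputs reported above for the same all-zeros input
ncnn_out = np.array([0.43, -0.66], dtype=np.float32)
onnx_out = np.array([-0.05, -0.09], dtype=np.float32)

# element-wise absolute difference; anything far above ~1e-4
# points to a real conversion problem, not rounding noise
max_diff = np.max(np.abs(ncnn_out - onnx_out))
print("max abs diff:", max_diff)
print("allclose:", np.allclose(ncnn_out, onnx_out, atol=1e-4))
```

Here the maximum absolute difference is 0.57, orders of magnitude above what a correct conversion should produce.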
Tested on win10 x64 with every ncnn version from 2020.12 to the latest.
5 answers
#1
I got a password error when decompressing your model.zip
#2
Your onnx file seems to be converted from a tensorflow graph, while onnx2ncnn only supports onnx exported from pytorch.
Try keras2ncnn or mlir2ncnn for tensorflow model conversion.
#3
@nihui
My onnx is exported from pytorch
#4
I found that the following operation may be the cause:
You can test this module
#5
Is there any update on this issue? @nihui