Hi @nihui ,
Thank you for your efforts.
I tried the YOLOv5 ncnn model (using the C++ example https://github.com/Tencent/ncnn/blob/master/examples/yolov5.cpp ) and found that inference is slow compared to the Python model. On further analysis I found that the output extraction step takes the majority of the inference time:
ex.extract("output", out); takes an average of 72.5666 milliseconds out of a total inference time of 98.093 milliseconds. Please advise what steps can be taken to reduce this time. Thanks.
2 Answers
but5z9lq1#
Python inference log
From the above log, we can see that the inference time is 14 milliseconds for the same model and input.
x7rlezfr2#
OS - Windows 10
GPU - 2080 Ti
Vulkan - VulkanSDK-1.2.189.2