When compiling the Paddle inference framework for deployment on a Hygon DCU platform, the build gets stuck at around 40% with the error: "error generating: /Paddle/build/paddle/phi/kernels/CMakeFiles/phi_gpu.dir/gpu/./phi_gpu_generated_edit_distance_kernel.cu.o"
tyu7yeag1#
If you don't have an Nvidia GPU, try disabling CUDA when running cmake:
cd repo/
cmake -B build -GNinja -DWITH_GPU=OFF
cmake --build build -j $(nproc)
xxhby3vn2#
Hello, the commands I ran are:
cmake .. -DPY_VERSION=3 -DPYTHON_EXECUTABLE=`which python3` -DWITH_ROCM=ON -DON_INFER=ON -DWITH_TESTING=OFF -DWITH_XBYAK=OFF
make -j$(nproc)
Since my goal is mainly C++ inference on the DCU, I have already commented out the TensorRT-related lines in CMakeLists.txt. How should I modify the build?
kokeuurv3#
You can refer to: cmake .. -DPY_VERSION=${PYTHON_MAJOR_VER} -DPYTHON_INCLUDE_DIR=/usr/local/Python${PYTHON_MAJOR_VER}d/include/python${PYTHON_MAJOR_VER}/ -DPYTHON_LIBRARY=/usr/local/Python${PYTHON_MAJOR_VER}d/lib/libpython${PYTHON_MAJOR_VER}.so -DPYTHON_EXECUTABLE=/usr/bin/python3 -DWITH_GPU=OFF -DWITH_ROCM=ON -DWITH_RCCL=ON -DWITH_NCCL=OFF -DWITH_TESTING=ON -DWITH_DISTRIBUTE=ON -DCMAKE_BUILD_TYPE=Release -DCMAKE_EXPORT_COMPILE_COMMANDS=ON -DCMAKE_VERBOSE_MAKEFILE=OFF -DWITH_TP_CACHE=ON -DROCM_PATH=${ROCM_PATH} -DWITH_MKLDNN=OFF -DON_INFER=ON
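Note that ${PYTHON_MAJOR_VER} and ${ROCM_PATH} in the command above are shell variables that must be set beforehand. A minimal sketch of that setup, assuming illustrative values (3.7 and /opt/rocm are assumptions here; match them to the Python and DTK/ROCm actually installed on your DCU machine), followed by the build step:
# Hypothetical values; adjust to your environment.
export PYTHON_MAJOR_VER=3.7   # must match the /usr/local/Python${PYTHON_MAJOR_VER}d paths in the flags above
export ROCM_PATH=/opt/rocm    # or the DTK install prefix on a Hygon DCU machine
mkdir -p build && cd build
# Run the cmake command above from inside build/, then build:
make -j$(nproc)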
fnvucqvd5#
For the build environment, refer to https://www.paddlepaddle.org.cn/documentation/docs/zh/guides/hardware_support/rocm_docs/paddle_install_cn.html
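If you also build and install the Python wheel described in that guide, one quick way to confirm the ROCm/DCU backend was compiled in is Paddle's standard self-check; both calls below are part of Paddle's public Python API, though whether they behave identically on your exact Paddle version is an assumption:
# Should print True for a ROCm build and finish with a "PaddlePaddle is installed successfully"-style message.
python3 -c "import paddle; print(paddle.is_compiled_with_rocm()); paddle.utils.run_check()"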