failed to build #355

Closed
Stephenfang51 opened this issue Dec 22, 2019 · 14 comments

@Stephenfang51

My command is:
cmake .. DTENSORRT_ROOT=/input/TensorRT-6.0.1.5 -DGPU_ARCHS="61"
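(A side note: the first definition in that command appears to be missing its -D prefix; the usual form would be something like the line below. Judging from the log that follows, CMake still located the TRT 6 headers and the nvinfer/nvinfer_plugin libraries, so the only piece it cannot find is libmyelin.)

cmake .. -DTENSORRT_ROOT=/input/TensorRT-6.0.1.5 -DGPU_ARCHS="61"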

Generated: /input/onnx-tensorrt/build/third_party/onnx/onnx/onnx_onnx2trt_onnx-ml.proto
Generated: /input/onnx-tensorrt/build/third_party/onnx/onnx/onnx-operators_onnx2trt_onnx-ml.proto
--
-- ******** Summary ********
--   CMake version         : 3.15.5
--   CMake command         : /usr/local/bin/cmake
--   System                : Linux
--   C++ compiler          : /usr/bin/c++
--   C++ compiler version  : 5.4.0
--   CXX flags             :  -Wall -Wno-deprecated-declarations -Wno-unused-function -Wnon-virtual-dtor
--   Build type            : Release
--   Compile definitions   : ONNX_NAMESPACE=onnx2trt_onnx
--   CMAKE_PREFIX_PATH     :
--   CMAKE_INSTALL_PREFIX  : /usr/local
--   CMAKE_MODULE_PATH     :
--
--   ONNX version          : 1.6.0
--   ONNX NAMESPACE        : onnx2trt_onnx
--   ONNX_BUILD_TESTS      : OFF
--   ONNX_BUILD_BENCHMARKS : OFF
--   ONNX_USE_LITE_PROTO   : OFF
--   ONNXIFI_DUMMY_BACKEND : OFF
--   ONNXIFI_ENABLE_EXT    : OFF
--
--   Protobuf compiler     : /usr/bin/protoc
--   Protobuf includes     : /usr/include
--   Protobuf libraries    : /usr/lib/x86_64-linux-gnu/libprotobuf.so;-lpthread
--   BUILD_ONNX_PYTHON     : OFF
-- Found TensorRT headers at /input/TensorRT-6.0.1.5/include
-- Find TensorRT libs at /input/TensorRT-6.0.1.5/lib/libnvinfer.so;/input/TensorRT-6.0.1.5/lib/libnvinfer_plugin.so;TENSORRT_LIBRARY_MYELIN-NOTFOUND
-- Could NOT find TENSORRT (missing: TENSORRT_LIBRARY)
ERRORCannot find TensorRT library.
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
TENSORRT_LIBRARY_MYELIN
    linked by target "nvonnxparser_static" in directory /input/onnx-tensorrt
    linked by target "nvonnxparser" in directory /input/onnx-tensorrt

-- Configuring incomplete, errors occurred!
See also "/input/onnx-tensorrt/build/CMakeFiles/CMakeOutput.log".
See also "/input/onnx-tensorrt/build/CMakeFiles/CMakeError.log".

Any solution for this problem?

Thanks

@Semihal

Semihal commented Dec 23, 2019

Hi @Stephenfang51!

Try this:

cd onnx-tensorrt
cmake . -DCUDA_INCLUDE_DIRS=/usr/local/cuda/include -DTENSORRT_ROOT=/usr/src/tensorrt -DCMAKE_CUDA_COMPILER=/usr/local/cuda/bin/nvcc
sudo make install
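(Note that /usr/src/tensorrt in this command matches a package-based install; if TensorRT was unpacked from the .tar.gz as in the original post, -DTENSORRT_ROOT should presumably point at that directory instead, for example:)

cmake . -DCUDA_INCLUDE_DIRS=/usr/local/cuda/include -DTENSORRT_ROOT=/input/TensorRT-6.0.1.5 -DCMAKE_CUDA_COMPILER=/usr/local/cuda/bin/nvcc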

@Stephenfang51

@Semihal

I have tried your command and it shows:

-- The CXX compiler identification is GNU 5.4.0
-- The C compiler identification is GNU 5.4.0
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Found Protobuf: /usr/lib/x86_64-linux-gnu/libprotobuf.so;-lpthread (found version "2.6.1")
-- Build type not set - defaulting to Release
Generated: /input/onnx-tensorrt/third_party/onnx/onnx/onnx_onnx2trt_onnx-ml.proto
Generated: /input/onnx-tensorrt/third_party/onnx/onnx/onnx-operators_onnx2trt_onnx-ml.proto
--
-- ******** Summary ********
--   CMake version         : 3.15.5
--   CMake command         : /usr/local/bin/cmake
--   System                : Linux
--   C++ compiler          : /usr/bin/c++
--   C++ compiler version  : 5.4.0
--   CXX flags             :  -Wall -Wno-deprecated-declarations -Wno-unused-function -Wnon-virtual-dtor
--   Build type            : Release
--   Compile definitions   : ONNX_NAMESPACE=onnx2trt_onnx
--   CMAKE_PREFIX_PATH     :
--   CMAKE_INSTALL_PREFIX  : /usr/local
--   CMAKE_MODULE_PATH     :
--
--   ONNX version          : 1.6.0
--   ONNX NAMESPACE        : onnx2trt_onnx
--   ONNX_BUILD_TESTS      : OFF
--   ONNX_BUILD_BENCHMARKS : OFF
--   ONNX_USE_LITE_PROTO   : OFF
--   ONNXIFI_DUMMY_BACKEND : OFF
--   ONNXIFI_ENABLE_EXT    : OFF
--
--   Protobuf compiler     : /usr/bin/protoc
--   Protobuf includes     : /usr/include
--   Protobuf libraries    : /usr/lib/x86_64-linux-gnu/libprotobuf.so;-lpthread
--   BUILD_ONNX_PYTHON     : OFF
-- Found TensorRT headers at TENSORRT_INCLUDE_DIR-NOTFOUND
-- Find TensorRT libs at TENSORRT_LIBRARY_INFER-NOTFOUND;TENSORRT_LIBRARY_INFER_PLUGIN-NOTFOUND;TENSORRT_LIBRARY_MYELIN-NOTFOUND
-- Could NOT find TENSORRT (missing: TENSORRT_INCLUDE_DIR TENSORRT_LIBRARY)
ERRORCannot find TensorRT library.
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
/input/onnx-tensorrt/TENSORRT_INCLUDE_DIR
   used as include directory in directory /input/onnx-tensorrt
   used as include directory in directory /input/onnx-tensorrt
TENSORRT_LIBRARY_INFER
    linked by target "nvonnxparser_static" in directory /input/onnx-tensorrt
    linked by target "nvonnxparser" in directory /input/onnx-tensorrt
TENSORRT_LIBRARY_INFER_PLUGIN
    linked by target "nvonnxparser_static" in directory /input/onnx-tensorrt
    linked by target "nvonnxparser" in directory /input/onnx-tensorrt
TENSORRT_LIBRARY_MYELIN
    linked by target "nvonnxparser_static" in directory /input/onnx-tensorrt
    linked by target "nvonnxparser" in directory /input/onnx-tensorrt

-- Configuring incomplete, errors occurred!

@Semihal

Semihal commented Dec 23, 2019

Please post the output of: dpkg -l | grep Tensor

@Stephenfang51

dpkg -l | grep Tensor

Well, it printed nothing ....

I can import tensorrt in my Python shell, which means TensorRT is installed correctly, right?
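(A quick way to see which version the Python binding actually reports, for comparison with the tarball under /input/TensorRT-6.0.1.5, would be something like:)

python -c "import tensorrt; print(tensorrt.__version__)"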

Here is my bashrc:

# added by Miniconda3 installer
export PATH="/usr/local/miniconda3/bin:$PATH"
alias sa="source activate"
alias tb="setsid setsid /usr/local/miniconda3/envs/dl/bin/python /usr/local/miniconda3/envs/dl/bin/tensorboard --logdir=/output --port=28007"
export LC_ALL=zh_CN.UTF-8
export LANG=zh_CN.UTF-8
export LANGUAGE=zh_CN.UTF-8
export PATH="/usr/local/cuda/bin:$PATH"
export LD_LIBRARY_PATH="/usr/local/miniconda3/envs/dl/lib:/usr/local/cuda/extras/CUPTI/lib64:$LD_LIBRARY_PATH"
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/input/TensorRT-6.0.1.5/lib

One more clue: I installed TensorRT in a conda env; maybe that's the reason the TensorRT library can't be found?
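(One thing worth checking here: as far as I know, LD_LIBRARY_PATH only affects library lookup at run time, not CMake's find_library() search, and the TensorRT 6 tarball does not ship libmyelin at all; that library only appears with TensorRT 7, which matches the later comments in this thread. A quick check against the paths above:)

ls /input/TensorRT-6.0.1.5/lib | grep -i -E "nvinfer|myelin"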

@Semihal

Semihal commented Dec 23, 2019

Oh, sorry, I haven't worked with Anaconda.

@Stephenfang51

Oh, sorry, I haven't worked with Anaconda.

That's okay, thanks anyway.

@1LOVESJohnny

@Stephenfang51 Hi, have you solved the problem? I ran into the same error. Looking forward to your reply.

@1LOVESJohnny

@Stephenfang51 Hi, I've figured it out. The problem was caused by a mismatched TRT version: I uninstalled the original TRT 6.0 and installed TRT 7.0, and the problem was solved.

@Stephenfang51

@Stephenfang51 Hi, I've figured it out. The problem was caused by a mismatched TRT version: I uninstalled the original TRT 6.0 and installed TRT 7.0, and the problem was solved.

Wow, that doesn't make sense...
Are all of your error messages the same as mine?

@rmccorm4

rmccorm4 commented Dec 25, 2019

Hi @Stephenfang51,

The fact that it's looking for TENSORRT_LIBRARY_MYELIN means this is specific to TensorRT 7.

Similarly, I believe only TRT7 supports ONNX 1.6 (your log shows ONNX version: 1.6.0).

I haven't tried building this myself, but you'll likely need to either install TRT7 to build the master / 7.0 branch, or keep TRT6 and build from the 6.0 / 6.0-full-dims branch: https://github.com/onnx/onnx-tensorrt/tree/6.0-full-dims
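For reference, a rough (untested) sketch of the second option, using the paths from earlier in this thread:

cd /input/onnx-tensorrt
git checkout 6.0-full-dims
git submodule update --init --recursive
mkdir -p build && cd build
cmake .. -DTENSORRT_ROOT=/input/TensorRT-6.0.1.5 -DGPU_ARCHS="61"
make -j$(nproc)
sudo make install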

@Stephenfang51

@1LOVESJohnny @rmccorm4 Yes, problem solved. Thank you guys so much!

@fujingling

fujingling commented Jun 17, 2021

I solved this problem by removing these lines:

find_library(TENSORRT_LIBRARY_MYELIN myelin
  HINTS ${TENSORRT_ROOT} ${TENSORRT_BUILD} ${CUDA_TOOLKIT_ROOT_DIR}
  PATH_SUFFIXES lib lib64 lib/x64)
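An alternative to deleting the lookup outright (an untested sketch; the exact way CMakeLists.txt assembles TENSORRT_LIBRARY may differ) would be to keep the find_library() call but only append the result when it is actually found, e.g.:

find_library(TENSORRT_LIBRARY_MYELIN myelin
  HINTS ${TENSORRT_ROOT} ${TENSORRT_BUILD} ${CUDA_TOOLKIT_ROOT_DIR}
  PATH_SUFFIXES lib lib64 lib/x64)
if(TENSORRT_LIBRARY_MYELIN)
  # only link libmyelin when it exists (TensorRT 7); skip it for TensorRT 6
  list(APPEND TENSORRT_LIBRARY ${TENSORRT_LIBRARY_MYELIN})
endif()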

@zengjie617789

After I checked the TRT_RELEASE path carefully, the problem was solved.
