
Any plan to support GPU acceleration using TensorRT? #21

Open
AgentSMLA opened this issue Aug 20, 2024 · 4 comments

Comments

@AgentSMLA

As the title says. I love this project.

@guyzoler

Download the source and edit it. If you have ONNX Runtime properly installed with TensorRT support, just edit line 38 in YoloCore.cs.
change: ? new InferenceSession(onnxModel, SessionOptions.MakeSessionOptionWithCudaProvider(gpuId))
to: ? new InferenceSession(onnxModel, SessionOptions.MakeSessionOptionWithTensorrtProvider(gpuId))

I have it running, but it has the same performance for some reason, and it takes longer to initialize.
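
For reference, here is a minimal standalone sketch of that switch, assuming the Microsoft.ML.OnnxRuntime.Gpu package; the variable names and the model path are placeholders, not YoloDotNet's exact code:

```csharp
using Microsoft.ML.OnnxRuntime;

// Minimal sketch: pick the execution provider when the session is created.
// gpuId, useTensorRt and the model path are illustrative only.
// TensorRT requires an onnxruntime build that ships the TensorRT execution
// provider plus the TensorRT native libraries installed on the machine.
int gpuId = 0;
bool useTensorRt = true;

using var options = useTensorRt
    ? SessionOptions.MakeSessionOptionWithTensorrtProvider(gpuId)  // TensorRT EP
    : SessionOptions.MakeSessionOptionWithCudaProvider(gpuId);     // CUDA EP

using var session = new InferenceSession("yolov8s-pose.onnx", options);
```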

@snjs000111

Do I need to convert yolo8pose.onnx to yolo8pose.engine with the trtexec tool?
When I use TensorRT, the program aborts at this line. Why?
// Run inference on the OrtIoBinding and bind allocated GPU-memory.
session.RunWithBinding(runOptions, ortIoBinding);

@guyzoler

You will need to convert your model. I followed this example:
https://docs.ultralytics.com/integrations/tensorrt/#usage

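For reference, the export shown on that page boils down to a single Ultralytics CLI call (the model filename here is just an example):

```bash
yolo export model=yolov8s-pose.pt format=engine
```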

@Eve-88

Eve-88 commented Feb 21, 2025

There is an error when using "SessionOptions.MakeSessionOptionWithTensorrtProvider(gpuId)":
onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "path\onnxruntime_providers_tensorrt.dll"

My packages:
YoloDotNet v2.2
cuda 12.4
cudnn v9.1
tensorrt 8.6.1.6
onnxruntime_gpu 1.20.1
The environment variable configuration was successful, and the test went without any issues.
How can I solve this problem?
