roop / core / globals.py (commit ccd63b1dce)
6 lines · 99 B · Python
Fix GPU not being used even when it is available. On many systems, onnxruntime does not detect the GPU unless PyTorch is imported before it, so despite CUDA and cuDNN being set up correctly, only CPUExecutionProvider is used. Importing PyTorch first fixes the issue, so let's use this workaround until an official solution is available. See: https://stackoverflow.com/questions/75294639/onnxruntime-inference-with-cudnn-on-gpu-only-working-if-pytorch-imported-first
2023-05-31 06:50:41 +03:00
Restore globals, add process time for better comparison
2023-05-30 02:03:43 +03:00

```python
# torch is imported before onnxruntime as a workaround (see commit message above)
import torch
import onnxruntime

use_gpu = False
providers = onnxruntime.get_available_providers()
```
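To see whether the workaround took effect, the provider list returned by onnxruntime can be inspected for `CUDAExecutionProvider`. The sketch below is an illustration, not part of the repository: the helper names `detect_providers` and `gpu_available` are hypothetical, and the import of torch/onnxruntime is guarded so the check degrades to a CPU-only list when either package is missing.

```python
def detect_providers():
    """Return onnxruntime's available providers, importing torch first
    (the workaround from the commit above). Falls back to a CPU-only
    list if torch or onnxruntime is not installed."""
    try:
        import torch  # noqa: F401 -- must come before onnxruntime
        import onnxruntime
        return onnxruntime.get_available_providers()
    except ImportError:
        return ["CPUExecutionProvider"]


def gpu_available(providers):
    """True if the CUDA provider is in the detected provider list."""
    return "CUDAExecutionProvider" in providers
```

If the workaround is active on a CUDA-capable machine, `gpu_available(detect_providers())` should report True; without it, onnxruntime may silently list only `CPUExecutionProvider` even with CUDA and cuDNN installed.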