I am using Unity3D (https://wiki.archlinux.org/title/Unity3D) on Arch Linux for game development.
I have an NVIDIA GTX 1650. All of my NVIDIA packages are up to date (tensorflow-gpu, for example, works fine), but when I run a game inside the Unity3D editor it does not use the GPU at all.
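I am checking this with nvidia-smi while a scene is playing in the editor, roughly like this (just one way of watching utilization):

# refresh nvidia-smi every second while the game is running in the editor
watch -n 1 nvidia-smi

GPU utilization stays at 0% and no Unity process ever shows up in the process list.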
How can I instruct Unity3D to use the GPU when developing games?
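This looks like a hybrid-graphics (Optimus) laptop setup to me, so I suspect PRIME render offload is the mechanism I need. A rough sketch of what I have in mind is below; the editor path and project path are placeholders, not the actual commands I use:

# first check that render offload reaches the NVIDIA GPU at all (glxinfo must be installed)
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"

# then try launching the editor the same way (paths below are placeholders)
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia ~/Unity/Hub/Editor/<version>/Editor/Unity -projectPath ~/my-project

Is this the right approach, or does Unity3D need to be configured somewhere in its own settings?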
Details of my GPU below:
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 465.27 Driver Version: 465.27 CUDA Version: 11.3 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|===============================+======================+======================|
| 0 NVIDIA GeForce ... Off | 00000000:01:00.0 Off | N/A |
| N/A 53C P8 2W / N/A | 4MiB / 3914MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=============================================================================|
| 0 N/A N/A 615 G /usr/lib/Xorg 4MiB |
+-----------------------------------------------------------------------------+
Asked by Jack Wetherell
(101 rep)
May 8, 2021, 10:01 PM
Last activity: Aug 19, 2021, 12:55 PM