Unix & Linux Stack Exchange
Q&A for users of Linux, FreeBSD and other Unix-like operating systems
Latest Questions
0
votes
1
answers
2106
views
External Monitor on HDMI with a Hybrid (Nvidia Optimus) laptop
Linux - Debian
Output of uname -a:
Linux HomeLT 4.19.0-6-amd64 #1 SMP Debian 4.19.67-2+deb10u2 (2019-11-11) x86_64 GNU/Linux
I have an ASUS TUF FX504 GM.
It has an Intel i7-8750H and a GTX 1060.
The problem is that, by default, I can't use an external monitor plugged into the HDMI port.
The Intel iGPU (UHD 630) is the default one that gets used unless I use this config file, as specified in the guide linked below.
I have the "nvidia-driver" package installed.
http://us.download.nvidia.com/XFree86/Linux-x86/375.26/README/randr14.html
/etc/X11/xorg.conf.d/10-nvidia.conf
Section "ServerLayout"
Identifier "layout"
Screen 0 "nvidia"
Inactive "intel"
EndSection
Section "Device"
Identifier "nvidia"
Driver "nvidia"
BusID "PCI:1:0:0"
EndSection
Section "Screen"
Identifier "nvidia"
Device "nvidia"
Option "AllowEmptyInitialConfiguration"
EndSection
Section "Device"
Identifier "intel"
Driver "modesetting"
BusID "PCI:0:2:0"
EndSection
Section "Screen"
Identifier "intel"
Device "intel"
EndSection
Is there any way to change this file so that the computer boots on the Intel screen, but also keeps the nvidia driver "inactive", so that both monitors show up when I run the command "xrandr --auto"?
I tried putting
Screen 0 "intel"
Inactive "nvidia"
But that doesn't work.
Tried
Screen 0 "intel"
Screen 1 "nvidia
"
without the inactive line. That didn't quite work either.
I also have to run
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto
after start up.
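For reference, a minimal sketch of automating those two commands at login, assuming LightDM (both the display manager and the script path are assumptions, not part of the question): put the commands in a small script and point LightDM's display-setup-script option at it.
#!/bin/sh
# /etc/lightdm/display-setup.sh (hypothetical path), referenced from /etc/lightdm/lightdm.conf
# under [Seat:*] as: display-setup-script=/etc/lightdm/display-setup.sh
# Make the Intel/modesetting provider display the NVIDIA GPU's rendering, then enable outputs.
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto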
Can anyone help?
Similar topic
https://unix.stackexchange.com/questions/250470/intel-driver-on-nvidia-optimus-laptop-not-recognizing-internal-display
BhooshanAJ
(3 rep)
Jan 7, 2020, 09:00 PM
• Last activity: Jul 27, 2025, 09:06 AM
-2
votes
1
answers
398
views
Arch Linux not using the Nvidia GPU
Arch Linux not using the Nvidia GPU.
This happened after I installed Steam. My gaming experience became laggy and the laptop became quiet, which is what happens when my Nvidia GPU isn't used. I decided to check Mission Center, and the Nvidia GPU isn't there!
I tried re-installing Arch Linux; the first time I booted it used the dGPU, but after a reboot it stopped using the Nvidia GPU.
What can I do?
My specs:
Nvidia RTX 3050 Laptop GPU
12th gen Intel Core i5 12500H
Output of inxi -G:
Graphics:
Device-1: Intel Alder Lake-P GT2 [Iris Xe Graphics] driver: i915 v: kernel
Device-2: NVIDIA GA107M [GeForce RTX 3050 Mobile] driver: N/A
Device-3: Sonix BisonCam NB Pro driver: uvcvideo type: USB
Display: x11 server: X.org v: 1.21.1.16 with: Xwayland v: 24.1.6 driver: X:
loaded: modesetting dri: iris gpu: i915 resolution: 1920x1080
API: OpenGL Message: Unable to show GL data. glxinfo is missing.
Info: Tools: gpu: nvidia-smi x11: xprop
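A hedged first-pass sketch for this symptom ("driver: N/A" usually means no kernel module is bound to the NVIDIA device); these are only diagnostics, not a fix:
# Which kernel driver, if any, is bound to each GPU
lspci -k | grep -EA3 "VGA|3D"
# Is an nvidia package installed for this kernel, and does the module load?
pacman -Qs nvidia
sudo modprobe nvidia && lsmod | grep nvidia
# Kernel messages from the nvidia driver, if it tried to load
sudo dmesg | grep -i nvidia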
Praef
(51 rep)
May 27, 2025, 02:08 PM
• Last activity: May 27, 2025, 08:05 PM
0
votes
0
answers
786
views
AMDGPU doesn't work with hashcat
AMDGPU doesn't work with hashcat; does anyone know how to fix this problem?
$ ~ hashcat -I
hashcat (v6.2.6) starting in backend information mode
clCreateCommandQueue(): CL_OUT_OF_HOST_MEMORY
OpenCL Info:
============
OpenCL Platform ID #1
Vendor..: Advanced Micro Devices, Inc.
Name....: AMD Accelerated Parallel Processing
Version.: OpenCL 2.1 AMD-APP.dbg (3513.0)
OpenCL Platform ID #2
Vendor..: Intel(R) Corporation
Name....: Intel(R) CPU Runtime for OpenCL(TM) Applications
Version.: OpenCL 2.1 LINUX
Backend Device ID #1
Type...........: CPU
Vendor.ID......: 8
Vendor.........: Intel(R) Corporation
Name...........: Intel(R) Core(TM) i5-8265U CPU @ 1.60GHz
Version........: OpenCL 2.1 (Build 0)
Processor(s)...: 8
Clock..........: 1600
Memory.Total...: 11809 MB (limited to 1476 MB allocatable in one block)
Memory.Free....: 5872 MB
Local.Memory...: 32 KB
OpenCL.Version.: OpenCL C 2.0
Driver.Version.: 18.1.0.0920
OpenCL Platform ID #3
Vendor..: Intel(R) Corporation
Name....: Intel(R) OpenCL Graphics
Version.: OpenCL 3.0
Backend Device ID #2
Type...........: GPU
Vendor.ID......: 8
Vendor.........: Intel(R) Corporation
Name...........: Intel(R) UHD Graphics 620
Version........: OpenCL 3.0 NEO
Processor(s)...: 24
Clock..........: 1100
Memory.Total...: 9447 MB (limited to 2047 MB allocatable in one block)
Memory.Free....: 4672 MB
Local.Memory...: 64 KB
OpenCL.Version.: OpenCL C 1.2
Driver.Version.: 23.22.026516
OpenCL Platform ID #4
Vendor..: Advanced Micro Devices, Inc.
Name....: AMD Accelerated Parallel Processing
Version.: OpenCL 2.1 AMD-APP (3180.7)
Backend Device ID #3
Type...........: GPU
Vendor.ID......: 1
Vendor.........: Advanced Micro Devices, Inc.
Name...........: AMD Radeon Graphics
Version........: OpenCL 1.2 AMD-APP (3180.7)
Processor(s)...: 5
Clock..........: 600
Memory.Total...: 2047 MB (limited to 1522 MB allocatable in one block)
Memory.Free....: 0 MB
Local.Memory...: 32 KB
OpenCL.Version.: OpenCL C 1.2
Driver.Version.: 3180.7
PCI.Addr.BDF...: 01:00.0
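As a hedged sketch only (the CL_OUT_OF_HOST_MEMORY error and the 0 MB free memory on the AMD device point at the OpenCL runtime rather than a hashcat setting): inspect the OpenCL stack with clinfo and try limiting hashcat to one backend device to see which platform triggers the error. Device numbers follow the -I output above.
# Inspect all OpenCL platforms/devices outside hashcat (clinfo is a separate package)
clinfo | less
# Benchmark one backend device at a time, e.g. the Intel GPU (#2) or the AMD GPU (#3)
hashcat -b -d 2
hashcat -b -d 3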
0x786d
(1 rep)
Sep 5, 2023, 09:20 AM
• Last activity: Mar 7, 2024, 09:29 PM
3
votes
2
answers
8915
views
Script to launch an application with dedicated graphics card (Fedora 25)
As shown in [this](https://blogs.gnome.org/uraeus/2016/11/01/discrete-graphics-and-fedora-workstation-25/) blog, Fedora 25 now has NVidia binary graphics driver support, and users have an option to launch applications with "Launch with Dedicated Graphics Card" by right-clicking an icon, if the computer has a hybrid GPU (Intel/NVidia) configuration.
Given this option, I would like to write scripts to launch my other applications from the command line, or to make desktop launchers connected to my scripts directly, with the Dedicated Graphics Card option pre-selected.
I am wondering how I can achieve this, or how this is implemented in Fedora 25, so that I can learn from it and use it in my scripts. Thank you!
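For the open-source (Mesa/PRIME) path, the menu entry essentially launches the program with an offload environment variable set, so a wrapper script can do the same; a minimal sketch (the script name is made up, and the NVIDIA binary driver uses different variables than DRI_PRIME):
#!/bin/sh
# dedicated-run (hypothetical name): launch a program on the dedicated GPU via PRIME offload
export DRI_PRIME=1
exec "$@"
Usage would then be, for example, dedicated-run glxgears, or an Exec=dedicated-run <program> line in a .desktop launcher.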
Yui
(33 rep)
Dec 19, 2016, 09:49 PM
• Last activity: Jan 9, 2024, 07:45 AM
12
votes
3
answers
27651
views
How to check which graphics card is driving my display?
The system is set up with one of those hybrid Intel/ATI offerings (muxless). After a bit of fiddling with kernel settings and drivers I got both cards working, I think (adding radeon.dpm=1 to the kernel settings and using only the open-source drivers).
However, I can't figure out which card the system is using. From my understanding, with the newer kernel (3.12) AMD dynamic power management will power the card down and up when needed, so in theory it should be using the integrated HD 4000 most of the time; however, I can't find an easy way to check which one is in use.
lspci | grep VGA
only lists all the cards; it doesn't specify which one is currently in use.
Some steering in the right direction much appreciated.
System
Debian 7 stable, 3.12 amd64 Kernel
7670M AMD + Intel HD4000
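A few hedged ways to check at runtime which GPU is rendering (a sketch; the vgaswitcheroo file only exists on hybrid setups with the relevant kernel support, and glxinfo comes from mesa-utils):
# Renderer used by the current session
glxinfo | grep "OpenGL renderer"
# Power/usage state of each GPU on a muxless hybrid (requires debugfs and root)
sudo cat /sys/kernel/debug/vgaswitcheroo/switch
# Run one program on the discrete card and compare the reported renderer
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"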
Jonathan
(121 rep)
May 9, 2014, 06:10 PM
• Last activity: Nov 9, 2023, 05:15 PM
-1
votes
1
answers
505
views
Laptop overheating and battery drain after installing Ubuntu 23.04 (Dual boot)
I have installed Ubuntu 23.04 (my first Linux OS) alongside Windows (dual boot) on my laptop, and now it's heating up like crazy and my battery is draining. I know this is caused by the two graphics cards I have: one is Intel's Iris and the other is a dedicated NVIDIA graphics card.
I want a solution for this heating and battery-draining problem. I know this is a driver optimization problem. I have checked online, where some have recommended turning off one graphics card and have given some commands, but I don't know how that will work, so please help me solve this issue (see the sketch after my specs below).
My Laptop:
Processor: Intel i5 11th gen,
RAM: 20GB (4GB + 16GB-Manually Added)
ROM: 256GB-SSD and 1TB-Hard disk
Graphics Cards: Intel Iris and NVIDIA MX330
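As a starting point only, a hedged sketch of the usual Ubuntu checks: confirm a proprietary NVIDIA driver is installed, then switch PRIME to the power-saving (Intel-only) profile. Exact package and driver versions can differ, so treat this as an outline rather than a prescription:
# See which NVIDIA driver Ubuntu recommends and whether one is installed
ubuntu-drivers devices
# Install the recommended proprietary driver (version may differ on your system)
sudo ubuntu-drivers autoinstall
# Use only the Intel GPU to save power (takes effect after a reboot), then verify
sudo prime-select intel
prime-select query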

Pranav Sawant
(11 rep)
Oct 13, 2023, 01:37 PM
• Last activity: Oct 16, 2023, 01:40 PM
3
votes
1
answers
7753
views
Force graphics to run on specific GPU
My computer has one integrated graphics card and two Nvidia RTX 3070 GPUs. I am using Ubuntu 20.04 and nvidia-driver-530.
lspci | grep VGA
00:02.0 VGA compatible controller: Intel Corporation AlderLake-S GT1 (rev 0c)
01:00.0 VGA compatible controller: NVIDIA Corporation GA104 [GeForce RTX 3070 Lite Hash Rate] (rev a1)
05:00.0 VGA compatible controller: NVIDIA Corporation GA104 [GeForce RTX 3070 Lite Hash Rate] (rev a1)
I am currently trying to test my 3070 graphics cards with the Phoronix Test Suite.
I am using nvidia-prime and prime-select on-demand to run the terminal on the Intel iGPU and the Phoronix tests on the Nvidia 3070: prime-run phoronix-test-suite run unigine-heaven.
There were some issues getting nvidia-prime to work, so I followed the suggestions from this article: https://askubuntu.com/questions/1364762/prime-run-command-not-found
cat /usr/bin/prime-run
#!/bin/bash
export __NV_PRIME_RENDER_OFFLOAD=1
export __GLX_VENDOR_LIBRARY_NAME=nvidia
export __VK_LAYER_NV_optimus=NVIDIA_only
export VK_ICD_FILENAMES=/usr/share/vulkan/icd.d/nvidia_icd.json
exec "$@"
By using prime-run, I am successfully able to run the Phoronix Test Suite on GPU 0, which has bus id 01:00.0 / PCI:1:0:0.
However, I seem unable to run any tests with GPU 1, which has bus id 05:00.0 / PCI:5:0:0.
Modifying /etc/X11/xorg.conf by changing the bus number and rebooting, as suggested by the following links, didn't seem to do anything; tests still ran on GPU 0.
- https://stackoverflow.com/questions/18382271/how-can-i-modify-xorg-conf-file-to-force-x-server-to-run-on-a-specific-gpu-i-a
- https://askubuntu.com/questions/787030/setting-the-default-gpu
cat /etc/X11/xorg.conf
# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig: version 530.41.03
Section "ServerLayout"
Identifier "Layout0"
Screen 0 "Screen0"
InputDevice "Keyboard0" "CoreKeyboard"
InputDevice "Mouse0" "CorePointer"
EndSection
Section "Files"
EndSection
Section "InputDevice"
# generated from default
Identifier "Mouse0"
Driver "mouse"
Option "Protocol" "auto"
Option "Device" "/dev/psaux"
Option "Emulate3Buttons" "no"
Option "ZAxisMapping" "4 5"
EndSection
Section "InputDevice"
# generated from default
Identifier "Keyboard0"
Driver "kbd"
EndSection
Section "Monitor"
Identifier "Monitor0"
VendorName "Unknown"
ModelName "Unknown"
Option "DPMS"
EndSection
Section "Device"
Identifier "Device0"
Driver "nvidia"
VendorName "NVIDIA Corporation"
# BusID "PCI:1:0:0"
BusID "PCI:5:0:0"
EndSection
Section "Screen"
Identifier "Screen0"
Device "Device0"
Monitor "Monitor0"
DefaultDepth 24
SubSection "Display"
Depth 24
EndSubSection
EndSection
In fact, I deleted /etc/X11/xorg.conf and was able to run the Phoronix tests on GPU 0 without the conf file at all. I would guess that one of the drivers or programs I run automatically selects the nvidia card with the lowest bus id.
I would like to know where I should look to change the settings, or which configuration files to edit, in order to select the second RTX 3070 GPU with the bus id 05:00.0. I would be more than happy to provide any further information.
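Not an answer, but a hedged variation worth testing: the PRIME render offload variables can reportedly target a named provider, so one sketch is to list providers and point the offload at the second card explicitly. The provider name below is a placeholder (use whatever xrandr --listproviders reports), and __NV_PRIME_RENDER_OFFLOAD_PROVIDER is taken from NVIDIA's render offload documentation rather than verified here:
#!/bin/bash
# prime-run-gpu1 (hypothetical name): like prime-run, but targeting a named provider
# First check the provider names with: xrandr --listproviders
export __NV_PRIME_RENDER_OFFLOAD=1
export __NV_PRIME_RENDER_OFFLOAD_PROVIDER=NVIDIA-G1   # placeholder; use the name xrandr reports
export __GLX_VENDOR_LIBRARY_NAME=nvidia
export __VK_LAYER_NV_optimus=NVIDIA_only
exec "$@"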
jameszp
(93 rep)
Apr 24, 2023, 07:05 PM
• Last activity: Apr 27, 2023, 05:37 PM
4
votes
2
answers
8095
views
What does KDE Compositor Tearing prevention ("vsync") do under the hood?
I have screen tearing issues. When I set Tearing prevention ("vsync") in the Compositor settings to something else and then back to Automatic, the screen tearing is gone. I would like to know which configuration files Tearing prevention ("vsync") changes, to troubleshoot this problem and find a permanent fix.
I test for screen tearing with this video.
I also have screen tearing with the latest live iso with both free and non-free drivers.
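For what it's worth, on Plasma 5 the compositor settings live in ~/.config/kwinrc, so one hedged sketch is to inspect and re-apply the tearing-prevention key from a shell; the key name GLPreferBufferSwap and the value "a" (automatic) are assumptions to verify against your KWin version:
# Read the current tearing prevention ("vsync") setting (key name is an assumption)
kreadconfig5 --file kwinrc --group Compositing --key GLPreferBufferSwap
# Write a value ("a" = automatic) and ask KWin to reload its configuration
kwriteconfig5 --file kwinrc --group Compositing --key GLPreferBufferSwap "a"
qdbus org.kde.KWin /KWin reconfigure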
Operating System: Manjaro Linux
KDE Plasma Version: 5.18.5
KDE Frameworks Version: 5.70.0
Qt Version: 5.15.0
Kernel Version: 5.6.16-1-MANJARO
OS Type: 64-bit
Processors: 8 × Intel® Core™ i7-6700HQ CPU @ 2.60GHz
Memory: 15,5 GiB of RAM
GPU: Nvidia GeForce 940M
MatMis
(523 rep)
Jun 22, 2020, 11:10 PM
• Last activity: Feb 20, 2023, 02:04 AM
1
votes
2
answers
2285
views
Linux can't seem to detect my dedicated GPU on laptop
I am relatively new to Linux. I have installed Endeavour OS on my laptop (an HP Victus 16), and noticed underwhelming performance in apps like waydroid. It seems like Linux is only detecting the iGPU in my system. When I run
xrandr --listproviders
it gives me the output
Providers: number : 0
Even going to Settings > About shows the graphics card as "AMD Renoir" only.
Running lspci shows the dGPU connected as:
Display controller: Advanced Micro Devices, Inc. [AMD/ATI] Navi 14 [Radeon RX 5500/5500M / Pro 5500M] (rev c1)
but it seems like it doesn't work anywhere else.
Configuration of my laptop if it matters:
AMD Ryzen 5600h
16 GB RAM
AMD RX 5500M graphics
And the OS details:
Endeavour OS Linux x86_64
Kernel: 5.17.0-247-tkg-pds
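A hedged first diagnostic pass (a sketch, not a fix): check whether the amdgpu kernel module actually bound to the Navi 14 device and whether its firmware loaded:
# Which driver, if any, is bound to the discrete GPU
lspci -k | grep -A3 "Navi 14"
# amdgpu initialization and firmware messages from the current boot
sudo dmesg | grep -iE "amdgpu|firmware" | less
# Two working GPUs should normally expose two card nodes here
ls -l /dev/dri/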
Baffledwaffles
(13 rep)
Mar 22, 2022, 02:22 AM
• Last activity: Apr 3, 2022, 01:54 PM
0
votes
1
answers
837
views
165Hz laptop display rendering at 40Hz
I have an Aftershock Vapor 15X (Eluktronics Max 15) with an i7-10870H and an RTX 3060. Even though I set the refresh rate to 165Hz in the settings, it still renders at 40Hz. I've tried multiple DEs like GNOME, Xfce and KDE, and the same thing happens each time. xrandr says it's rendering at 1440p 165Hz as well.
- I've tried multiple Linux distros (Ubuntu, Manjaro and Debian), same problem. I also installed FreeBSD with KDE, same issue - perhaps something is wrong with the BIOS or DE?
- I've tried uninstalling the Intel graphics drivers, but it did nothing.
- I've tried setting nvidia prime offset 1, with no effect.
- I set the BIOS to only use the dGPU, which worked, but it had problems loading the display manager on boot. The screen flickers for a while before loading in.
I think it has something to do with NVIDIA Optimus or some compatibility issue due to newer hardware, or perhaps something with the Intel drivers. Any ideas?
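One low-risk check is to compare what mode and rate X actually has active against what the panel advertises, and to request the 165 Hz mode explicitly; a sketch (the output name eDP-1 is an assumption, use whatever xrandr lists for the internal panel):
# Show outputs, their modes, and the currently active rate (marked with *)
xrandr
# Explicitly request the panel's native 165 Hz mode (output name is an assumption)
xrandr --output eDP-1 --mode 2560x1440 --rate 165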
Capital11
(1 rep)
Nov 25, 2021, 02:38 PM
• Last activity: Dec 14, 2021, 09:32 AM
0
votes
1
answers
289
views
Disabled dedicated GPU powers on after suspend (and also just randomly)
I have a laptop with integrated AMD graphics and discrete Nvidia GTX 1650Ti.
$ sudo lspci
...
01:00.0 VGA compatible controller: NVIDIA Corporation TU117M [GeForce GTX 1650 Ti Mobile] (rev a1)
01:00.1 Audio device: NVIDIA Corporation Device 10fa (rev a1)
...
05:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Renoir (rev c7)
Distro: Ubuntu 21.04
Kernel: 5.11.0-17-generic
I use
$ sudo prime-select intel
to disable Nvidia Graphics and also set PCI power management to auto using TLP:
$ sudo tlp-stat
/sys/bus/pci/devices/0000:01:00.0/power/control = auto (0x030000, VGA compatible controller, no driver)
/sys/bus/pci/devices/0000:01:00.1/power/control = auto (0x040300, Audio device, snd_hda_intel)
This works great, the GPU is in low power mode and battery life is good:
$ cat /sys/bus/pci/devices/0000:01:00.0/power_state
D3cold
But after I suspend, the GPU starts to consume more power again:
$ cat /sys/bus/pci/devices/0000:01:00.0/power_state
D0
This also sometimes happens randomly while the laptop is on.
Please help. This thing halves my laptop's battery life.
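One hedged workaround sketch is a system-sleep hook that re-applies runtime power management after resume; systemd runs executables from /usr/lib/systemd/system-sleep/ with "pre"/"post" arguments, the PCI addresses are copied from the question, and whether the card then actually drops back to D3cold is not guaranteed:
#!/bin/sh
# /usr/lib/systemd/system-sleep/nvidia-runtime-pm (hypothetical name), must be executable
case "$1" in
    post)
        # Re-enable runtime PM for the dGPU and its audio function after resume
        echo auto > /sys/bus/pci/devices/0000:01:00.0/power/control
        echo auto > /sys/bus/pci/devices/0000:01:00.1/power/control
        ;;
esac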
vitaliy
(101 rep)
Jun 1, 2021, 08:22 AM
• Last activity: Jun 22, 2021, 10:41 AM
12
votes
1
answers
53293
views
regenerate xorg.conf with current settings
Many people have talked about this issue but I've not found a satisfactory answer.
I'm on Debian jessie. Currently I have tried nvidia-driver as the driver, but it caused the system to crash, so I have purged all the nvidia packages. The problem is that /etc/X11/xorg.conf has been replaced with NVidia settings and the backup xorg.conf.backup has been removed.
The related configuration set by NVidia is:
Section "Device"
Identifier "Device0"
Driver "nvidia"
VendorName "NVIDIA Corporation"
EndSection
I once tried changing nvidia to intel (also NVidia -> Intel), but the resolution is much lower (my laptop has an Intel Corporation Haswell-ULT Integrated Graphics Controller as listed by lspci). So I might need to use nouveau as the driver; however, simply changing nvidia to nouveau doesn't work.
It seems that the recent X system can be booted without xorg.conf (after rm /etc/X11/xorg.conf), but it is slower. So I would still prefer an xorg.conf with my current settings.
The version of Xorg:
X.Org X Server 1.16.0
Release Date: 2014-07-16
X Protocol Version 11, Revision 0
Build Operating System: Linux 3.14-1-amd64 x86_64 Debian
Current Operating System: Linux debian 3.14-1-amd64 #1 SMP Debian 3.14.9-1 (2014-06-30) x86_64
Kernel command line: BOOT_IMAGE=/boot/vmlinuz-3.14-1-amd64 root=UUID=e9341749-9dee-4cc9-878e-3b59ed1906b2 ro quiet
Build Date: 17 July 2014 10:22:36PM
xorg-server 2:1.16.0-1 (http://www.debian.org/support)
Current version of pixman: 0.32.4
Before reporting problems, check http://wiki.x.org
to make sure that you have the latest version.
So are there any ways to re-generate the configuration file?
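One hedged option: an X server of this vintage can still emit a skeleton configuration for the detected hardware, which you can then trim down to the Device section you need. A sketch, to be run from a console with no X server running on that display:
# Generate a skeleton config based on detected hardware (typically written to /root/xorg.conf.new)
sudo Xorg :1 -configure
# Review and edit it, then install it as the active configuration
sudo cp /root/xorg.conf.new /etc/X11/xorg.conf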
Hongxu Chen
(5888 rep)
Jul 27, 2014, 03:10 PM
• Last activity: Apr 19, 2021, 05:49 AM
1
votes
3
answers
2876
views
Deactivate Nvidia Graphic-Card CLI
How can I deactivate my dedicated NVIDIA graphics card via the CLI?
My laptop has two graphics cards: an integrated Intel one and a dedicated NVIDIA one.
I have no use for the NVIDIA card, because I don't play games anymore (sadly).
I haven't installed any driver for it.
For power-saving reasons, I want to deactivate the dedicated card in a systemd script.
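With no driver loaded, one hedged sketch is to drop the device off the PCI bus at boot; the address 0000:01:00.0 is an assumption (check lspci first), the command could go into a oneshot systemd unit's ExecStart, and whether this actually reduces power draw is platform-dependent (bbswitch or acpi_call are the more common tools for forcing the card off):
# Find the dGPU's PCI address first
lspci | grep -i nvidia
# Remove the device from the PCI bus (it returns after a rescan or a reboot)
echo 1 | sudo tee /sys/bus/pci/devices/0000:01:00.0/remove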
dudas
(301 rep)
Dec 19, 2016, 09:25 AM
• Last activity: Sep 22, 2020, 02:00 AM
1
votes
0
answers
144
views
How do I tell the system/GDM to leave GPU alone, and only use other?
I've been using an AMD RX 5700 XT GPU on my system for over 6 months now, with the open source drivers. It works fine.
But I need to do some Machine Learning stuff, and ROCM doesn't even work on this card yet. So I need CUDA.
Luckily, I have an older nVidia Geforce GTX 970 lying around. So I installed it.
When I rebooted, everything worked fine. Both cards were detected and working.
Then I installed the proprietary nvidia drivers.
Now GDM tries to start on the nvidia GPU, but fails. It tries to start an X11 session too (I'm actually using Wayland, and want to keep using Wayland).
After it keeps trying to start GDM/X11 on the nvidia GPU, it stops and finally shows me a console. All output is on my AMD GPU, by the way.
So the question is: how do I tell the system/GDM to leave the nvidia GPU alone and just keep on using the AMD GPU?
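One hedged sketch, not a confirmed fix: keep the proprietary driver loadable for CUDA but stop it from exposing a kernel modesetting display device, which is what GDM/Wayland latches onto. The option below is the standard nvidia-drm module parameter; whether GDM then reliably starts on the AMD card is not guaranteed:
# /etc/modprobe.d/nvidia-no-kms.conf (hypothetical file name)
# Keep the NVIDIA driver usable for CUDA, but without a KMS display device
options nvidia-drm modeset=0
Depending on the distro, the initramfs may need to be regenerated and the machine rebooted for this to take effect.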
Jelle De Loecker
(136 rep)
Jun 5, 2020, 09:21 PM
• Last activity: Jun 5, 2020, 10:17 PM
1
votes
1
answers
824
views
Screen tearing (PRIME Synchronization 0) on Intel onboard graphic + AMD Radeon
I'm using two monitors with AMD Radeon RX570 and Intel onboard graphics on CentOS 8. (My CPU is i3-8100 @ 3.60GHz.)
I have screen tearing on the 2nd monitor, which is rendered by the Intel onboard graphics, when moving a window or watching video.
After googling, I learned that PRIME Synchronization makes the 2nd monitor tear-free. To check it, I ran xrandr --props. The output below is extracted from it.
xrandr --props
...
HDMI-1-1 connected 1920x1200+1920+0 (normal left inverted right x axis y axis) 518mm x 324mm
_MUTTER_PRESENTATION_OUTPUT: 0
EDID:
00ffffffffffff0038c3240200000000
0811010380342078ea8cb5a7554d9f26
0c5054afef8071408140818081c0a940
b301d1c00101283c80a070b023403020
260006442100001a243680a070381f40
3020250030303100001a000000fd0038
4b1f4d11000a202020202020000000fc
0056434231303739480a2020202001ac
020318f2450405030290230904046503
0c001000830100008c0ad08a20e02d10
103e9600c48e210000188c0ad08a20e0
2d10103e9600138e2100001800000000
00000000000000000000000000000000
00000000000000000000000000000000
00000000000000000000000000000000
0000000000000000000000000000004b
PRIME Synchronization: 0
supported: 0, 1
Content Protection: Undesired
supported: Undesired, Desired, Enabled
aspect ratio: Automatic
supported: Automatic, 4:3, 16:9
Broadcast RGB: Automatic
supported: Automatic, Full, Limited 16:235
audio: auto
supported: force-dvi, off, auto, on
link-status: Good
supported: Good, Bad
CONNECTOR_ID: 71
supported: 71
non-desktop: 0
range: (0, 1)
...
To enable PRIME Synchronization, I ran xrandr --output HDMI-1-1 --set "PRIME Synchronization" 1. But after I ran it, the 2nd monitor flickered (it seemed to be turned off and on quickly), and then the setting was reset to 0 again.
This situation only occurs after installing the AMD Linux driver for CentOS 8, v19.30, which is the most recent driver at this time. If I don't install it (i.e. I use the default driver included in CentOS 8), screen tearing doesn't occur and PRIME Synchronization is 1 for the 2nd monitor. But I need to use the AMD driver instead of the default driver.
How can I set PRIME Synchronization to 1? Or how can I find out why it cannot be set to 1?
FYI, some information I have is below:
lspci -nn | grep "VGA\|Display"
00:02.0 Display controller : Intel Corporation 8th Gen Core Processor Gaussian Mixture Model [8086:3e91]
01:00.0 VGA compatible controller : Advanced Micro Devices, Inc. [AMD/ATI] Ellesmere [Radeon RX 470/480/570/570X/580/580X/590] [1002:67df] (rev ef)
xrandr --listproviders
Providers: number : 2
Provider 0: id: 0xa2 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 6 outputs: 5 associated providers: 1 name:Radeon RX 570 Series @ pci:0000:01:00.0
Provider 1: id: 0x45 cap: 0x2, Sink Output crtcs: 3 outputs: 3 associated providers: 1 name:modesetting
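A hedged way to capture why the property snaps back to 0 (a sketch; on a rootless X server the log may instead live in ~/.local/share/xorg/Xorg.0.log or only in the journal):
# In one terminal, follow the X server log while toggling the property
tail -f /var/log/Xorg.0.log
# In another, try to enable PRIME sync on the Intel-driven output and watch for errors
xrandr --output HDMI-1-1 --set "PRIME Synchronization" 1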
Anselmo Park
(189 rep)
Jan 5, 2020, 05:53 PM
• Last activity: Apr 24, 2020, 10:06 AM
0
votes
1
answers
93
views
Is the relative inferior support of NVIDIA/Intel dual graphics on Linux compared to Windows more of a technical issue or licensing issue?
I have been using the Linux desktop since 2014, and now (2020) it seems that the official support for NVIDIA/Intel dual graphics on Linux is still relatively weak compared to Windows (please correct me if I am wrong). On Ubuntu there is a prime-select package that switches the graphics card (which still requires restarting the session), but on some other distros this package does not even exist. PRIME synchronization also needs manual setup and sometimes breaks the GUI system.
Is there a real fundamental technical problem that prevents seamless graphics switching on Linux desktop, or is it just a licensing issue that prevents some techniques from being deployed, and prevents both sides from being happy and making money?
Cloudy
(101 rep)
Apr 22, 2020, 10:29 AM
• Last activity: Apr 22, 2020, 11:30 AM
2
votes
0
answers
420
views
How to install GPU drivers in a ryzen/vega + nvidia laptop with Fedora?
I'm a newbie trying to get my machine working properly in Linux; my native language is not English, so I apologize in advance for any grammar errors. I installed Fedora on my laptop:
ASUS FX505D
CPU: AMD Ryzen 5 3550H with Radeon Vega Mobile Gfx
GPU: nvidia GTX 1650
I have tried several distros so far and attempted to install either Bumblebee or PRIME, but I'm seriously lost here.
How should I proceed about this?
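For what it's worth, the common Fedora route is the RPM Fusion packaged driver rather than Bumblebee, while the Ryzen/Vega side uses the in-kernel amdgpu driver with no extra install. A hedged sketch (the repository URLs and package names follow RPM Fusion's documentation and may change between releases):
# Enable the RPM Fusion free and nonfree repositories
sudo dnf install \
  https://mirrors.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm \
  https://mirrors.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm
# Install the packaged NVIDIA kernel module and userspace, then reboot
sudo dnf install akmod-nvidia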
user11269293
(21 rep)
Dec 9, 2019, 10:35 AM
• Last activity: Dec 10, 2019, 03:14 AM
0
votes
2
answers
1258
views
Hybrid GPU Black Screen LINUX MINT 19.2
I've installed **LINUX MINT 19.2 (Tina)**, but when I try to log in the screen goes black and only the cursor is shown; not even the login chime is heard, as reported in other sources. When I try to use the terminal before login, the screen simply "hangs" and I cannot even log in anymore.
With this in mind, I searched several forums for a solution and tried to reinstall several times, but always unsuccessfully. Unfortunately, I cannot send an error log because it is impossible to access a terminal.
Some facts that I consider important to be informed:
- My Config
- CPU
- Intel Core i7-8750H CPU @ 2.20GHz
- RAM
- 16GB
- GPU
- Intel® UHD Graphics 630 (integrated)
- NVIDIA GeForce GTX 1050 Ti (integrated)
- Storage:
- SSD Micron 1100 MTFDDAV256TBN 256GB
- Facts:
- After login the screen goes black and only the cursor is shown
- Any attempt to open a terminal ends in the screen hanging
- Live CD works
- After installation (remove media and restart) the screen hangs
- UEFI
- Already tried with flags nouveau.noaccel=1 and nomodeset
- Tried Ubuntu, Debian, Kubuntu...
- Used Balena Etcher and Rufus to create the bootable ISOs
I will be extremely grateful if someone with experience can help me solve this problem (using Mint)!
Leonardo Vinicius Maciel
(1 rep)
Nov 15, 2019, 03:30 PM
• Last activity: Nov 24, 2019, 07:03 AM
13
votes
2
answers
3544
views
Kernel modesetting hangs my boot, but the ATI driver requires it
I have a late 2011 MacBook Pro. It has an integrated Intel video card and a discrete ATI video card. Ideally, I'd like my Xorg to use the ATI card with the free driver (no Catalyst).
Here's the problem: kernel modesetting hangs my boot (verified by adding nomodeset to the kernel parameters), and I can't figure out why. However, the ATI driver _requires_ KMS, as does the Intel driver. What are my options for getting graphics with the desired setup as described above?
I'm on kernel 3.13.8, Arch GNU/Linux. I've also tried it with kernel 3.10.35, AKA the LTS kernel. No luck. As suggested in comments, I've tried to ping the affected machine after it locks up. I can't tell for sure, but it appears that it's completely frozen, not just the display.
I've also tried booting into Mac OS X and using gfxCardStatus to force using the Intel card. This did nothing.
In order to try to get more information, I've booted the MacBook with the following kernel parameters appended to my normal kernel line (the regular kernel, not the LTS kernel, and with quiet removed), and with gfxCardStatus set to on-the-fly switching (this seemed to revert automatically on a reboot of OS X):
rootwait ignore_loglevel debug debug_locks_verbose=1 sched_debug initcall_debug mminit_loglevel=4 udev.log_priority=8 loglevel=8 earlyprintk=vga,keep log_buf_len=10M print_fatal_signals=1 apm.debug=Y i8042.debug=Y drm.debug=1 scsi_logging_level=1 usbserial.debug=Y option.debug=Y pl2303.debug=Y firewire_ohci.debug=1 hid.debug=1 pci_hotplug.debug=Y pci_hotplug.debug_acpi=Y shpchp.shpchp_debug=Y apic=debug show_lapic=all hpet=verbose lmb=debug pause_on_oops=5 panic=10 sysrq_always_enabled
When I try to start GDM using either the ATI or Intel drivers, booted without KMS, Xorg fails with a message about not finding a suitable driver (expected, since the Intel/AMD drivers need KMS). I've also tried using the xf86-video-vesa package, but that fails with a message about having a suitable driver but not having a suitable configuration - something about the BIOS not being right.
I've tried using PRIME, but since I can't get Xorg to come up even without acceleration or anything fancy, xrandr doesn't work and I can't even get past the first step.
I've thought about using vgaswitcheroo or something related, but I don't think that will do anything, because the underlying issue is, I believe, that KMS is hanging.
The final thing I've tried is the proprietary Catalyst driver, since it has its own KMS implementation, but I couldn't get it to install due to an Xorg server version mismatch. And honestly, I have less than zero desire to use a proprietary driver if I can help it, so I didn't try very hard.
I've sent the Linux Kernel Mailing List an email about this, and hopefully someone will get back to me.
Is it possible that I've run into a kernel bug or an Xorg bug worth reporting?
I've Googled, but nothing helpful's come up.
strugee
(15371 rep)
Apr 5, 2014, 01:21 AM
• Last activity: Oct 13, 2019, 12:45 AM
0
votes
1
answers
58
views
Very stuttery laptop internal display when HDMI is not connected
I'm setting up a new Manjaro install on my laptop (Dell Inspiron 7567) with NVIDIA Optimus, and I've run into an issue with my monitor configuration.
When there is a monitor connected to my laptop's HDMI port, everything works fine. However, if I disable the HDMI port using xrandr --output HDMI-0 --off, or just physically disconnect the HDMI cable, the remaining internal display is unusably laggy.
All of my windows (I'm using Openbox) only update about once every 10 seconds. The mouse cursor remains perfectly smooth though, and the rest of the computer keeps working fine; I can execute commands and they execute immediately (though I can't see the results until the monitor next refreshes) and my Spotify music continues playing.
_(Update: The issue only seems to happen on Openbox. i3 works fine when disabling HDMI.)_
As soon as I reconnect the HDMI cable or re-enable that output, everything goes back to normal and is usable again.
Since it may well be relevant, my laptop's Optimus graphics are internally connected in such a way that the NVIDIA graphics card is directly wired to the HDMI port.
Why could this be happening?
My /etc/X11/xorg.conf (partially generated by nvidia-xconfig):
Section "Files"
EndSection
Section "InputDevice"
# generated from default
Identifier "Mouse0"
Driver "mouse"
Option "Protocol" "auto"
Option "Device" "/dev/psaux"
Option "Emulate3Buttons" "no"
Option "ZAxisMapping" "4 5"
EndSection
Section "InputDevice"
# generated from default
Identifier "Keyboard0"
Driver "kbd"
EndSection
Section "Device"
Identifier "nvidia"
Driver "nvidia"
BusID "PCI:1:0:0"
EndSection
Section "Screen"
Identifier "nvidia"
Device "nvidia"
Option "AllowEmptyInitialConfiguration"
EndSection
Section "Device"
Identifier "intel"
Driver "modesetting"
Option "AccelMethod" "none"
EndSection
Section "Screen"
Identifier "intel"
Device "intel"
EndSection
I'm using the 430.26 proprietary NVIDIA drivers (acquired by mhwd through the video-nvidia configuration). Note that I am **not** using Bumblebee or a PRIME switcher.
My NVIDIA X Server Settings monitor configuration (the AOC monitor is my external HDMI one, the PRIME one is the internal monitor):

Aaron Christiansen
(119 rep)
Aug 5, 2019, 10:45 PM
• Last activity: Aug 5, 2019, 11:54 PM
Showing page 1 of 20 total questions