Unix & Linux Stack Exchange
Q&A for users of Linux, FreeBSD and other Unix-like operating systems
Latest Questions
6
votes
1
answers
2901
views
Switchable graphics Intel + AMD Venus Pro
I'm using Manjaro Linux on my laptop with switchable graphic cards:
00:02.0 VGA compatible controller: Intel Corporation Haswell-ULT Integrated Graphics Controller (rev 0b)
03:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Venus PRO [Radeon HD 8850M / R9 M265X] (rev ff)
Here's a screenshot of the drivers section of Manjaro Settings Manager:
The above shows that bumblebee's drivers are installed, but the daemon fails:
[luke@manjaro ~]$ sudo systemctl status bumblebeed
● bumblebeed.service - Bumblebee C Daemon
Loaded: loaded (/usr/lib/systemd/system/bumblebeed.service; enabled; vendor preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Fri 2015-09-18 16:22:55 CEST; 29s ago
Process: 1192 ExecStart=/usr/bin/bumblebeed (code=exited, status=1/FAILURE)
Main PID: 1192 (code=exited, status=1/FAILURE)
Sep 18 16:22:55 manjaro systemd: bumblebeed.service: Unit entered failed state.
Sep 18 16:22:55 manjaro systemd: bumblebeed.service: Failed with result 'exit-code'.
During my trial-and-error I also tried to install the video-catalyst driver (in the window shown above). It ended miserably - after the initial startup the screen was completely white. I switched to another TTY and removed video-catalyst using mhwd.
My goal is to be able to play games on Steam. What can you recommend?
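A minimal first diagnostic, sketched here with the stock systemd and Manjaro mhwd tools rather than taken from the original post; the failing unit normally logs the reason it exits:
# Ask the failed unit why it exited, and see what mhwd actually installed
journalctl -u bumblebeed -b --no-pager | tail -n 30
mhwd -li            # list installed drivers (hybrid/bumblebee vs. video-catalyst)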
Luke
(151 rep)
Sep 18, 2015, 02:46 PM
• Last activity: Jun 29, 2025, 05:04 PM
1
votes
2
answers
1875
views
How to install Bumblebee in Linux Mint 17?
I have found that the Nvidia drivers freeze Cinnamon, or Cinnamon crashes and runs in fall-back mode.
As Bumblebee works better, how can I install Bumblebee in Linux Mint 17?
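For reference, a sketch of the usual install route for that era of Mint (Linux Mint 17 is based on Ubuntu 14.04, so the Bumblebee PPA applies; package names are the standard ones and may need adjusting):
sudo add-apt-repository ppa:bumblebee/stable
sudo apt-get update
sudo apt-get install bumblebee bumblebee-nvidia primus linux-headers-generic
optirun glxgears -info   # quick test that the discrete card is reachable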
Ashu_FalcoN
(583 rep)
Aug 24, 2014, 03:56 PM
• Last activity: Feb 12, 2025, 09:07 AM
2
votes
2
answers
1756
views
Broken Bumblebee on Debian Testing crashes with (EE) unw_get_proc_name failed: no unwind info found [-10]
I have an HP Spectre x360 with two graphics cards (Intel Corporation HD Graphics 620 rev 02 and an NVIDIA Corporation GM108M [GeForce 940MX] rev a2). I installed Debian Testing on it and eventually realized that I needed to use primusrun to let software access the Nvidia graphics card. This worked great for a month or two, and then my Bumblebee setup broke -- possibly during a Debian update or because I removed the wrong package, but I'm not sure.
I've since tried uninstalling and reinstalling different Bumblebee/Nvidia/Primus packages to try to solve the problem, but I keep seeing the following segmentation fault in my Xorg.8.log file when trying to run primusrun glxgears:
[ 57.202] (EE) Backtrace:
[ 57.205] (EE) 0: /usr/lib/xorg/Xorg (OsLookupColor+0x139) [0x55878e22b2c9]
[ 57.206] (EE) 1: /lib/x86_64-linux-gnu/libpthread.so.0 (funlockfile+0x50) [0x7f784993555f]
[ 57.206] (EE) 2: /lib/x86_64-linux-gnu/libc.so.6 (memcpy+0x1f) [0x7f7849803e4f]
[ 57.207] (EE) 3: /usr/lib/x86_64-linux-gnu/libnvidia-glcore.so.430.50 (_nv043glcore+0x27ff09) [0x7f7848425809]
[ 57.207] (EE) 4: /usr/lib/x86_64-linux-gnu/libnvidia-glcore.so.430.50 (_nv043glcore+0x28006d) [0x7f7848425b2d]
[ 57.208] (EE) 5: /usr/lib/x86_64-linux-gnu/libnvidia-glcore.so.430.50 (_nv015glcore+0x49ab8) [0x7f7847ed7fd8]
[ 57.208] (EE) unw_get_proc_name failed: no unwind info found [-10]
[ 57.208] (EE) 6: /usr/lib/nvidia/nvidia/libglxserver_nvidia.so (?+0x0) [0x7f7845e30d32]
[ 57.208] (EE)
[ 57.208] (EE) Segmentation fault at address 0x7f7845f2a000
[ 57.208] (EE)
Fatal server error:
[ 57.208] (EE) Caught signal 11 (Segmentation fault). Server aborting
I was hoping that this would sort itself out when my Nvidia drivers were next upgraded, but they have just been updated to 430.50 and the crash remains. Any suggestions on what is broken here or what I should look into would be hugely appreciated!
Here's my entire Xorg.8.log file:
[ 57.080]
X.Org X Server 1.20.4
X Protocol Version 11, Revision 0
[ 57.080] Build Operating System: Linux 4.9.0-8-amd64 x86_64 Debian
[ 57.080] Current Operating System: Linux diocletian 5.2.0-2-amd64 #1 SMP Debian 5.2.9-2 (2019-08-21) x86_64
[ 57.080] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-5.2.0-2-amd64 root=UUID=1749bf54-f5ee-4ff3-999c-ef940cd0fb27 ro quiet
[ 57.080] Build Date: 05 March 2019 08:11:12PM
[ 57.080] xorg-server 2:1.20.4-1 (https://www.debian.org/support)
[ 57.080] Current version of pixman: 0.36.0
[ 57.080] Before reporting problems, check http://wiki.x.org
to make sure that you have the latest version.
[ 57.080] Markers: (--) probed, (**) from config file, (==) default setting,
(++) from command line, (!!) notice, (II) informational,
(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[ 57.080] (==) Log file: "/var/log/Xorg.8.log", Time: Sat Sep 21 16:25:12 2019
[ 57.080] (++) Using config file: "/etc/bumblebee/xorg.conf.nvidia"
[ 57.080] (++) Using config directory: "/etc/bumblebee/xorg.conf.d"
[ 57.080] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
[ 57.080] (==) ServerLayout "Layout0"
[ 57.080] (==) No screen section available. Using defaults.
[ 57.080] (**) |-->Screen "Default Screen Section" (0)
[ 57.080] (**) | |-->Monitor ""
[ 57.080] (==) No device specified for screen "Default Screen Section".
Using the first device section listed.
[ 57.080] (**) | |-->Device "DiscreteNvidia"
[ 57.081] (==) No monitor specified for screen "Default Screen Section".
Using a default monitor configuration.
[ 57.081] (**) Option "AutoAddDevices" "false"
[ 57.081] (**) Option "AutoAddGPU" "false"
[ 57.081] (**) Not automatically adding devices
[ 57.081] (==) Automatically enabling devices
[ 57.081] (**) Not automatically adding GPU devices
[ 57.081] (==) Max clients allowed: 256, resource mask: 0x1fffff
[ 57.081] (WW) The directory "/usr/share/fonts/X11/cyrillic" does not exist.
[ 57.081] Entry deleted from font path.
[ 57.081] (==) FontPath set to:
/usr/share/fonts/X11/misc,
/usr/share/fonts/X11/100dpi/:unscaled,
/usr/share/fonts/X11/75dpi/:unscaled,
/usr/share/fonts/X11/Type1,
/usr/share/fonts/X11/100dpi,
/usr/share/fonts/X11/75dpi,
built-ins
[ 57.081] (++) ModulePath set to "/usr/lib/nvidia/nvidia,/usr/lib/xorg/modules"
[ 57.081] (==) |-->Input Device ""
[ 57.081] (==) |-->Input Device ""
[ 57.081] (==) The core pointer device wasn't specified explicitly in the layout.
Using the default mouse configuration.
[ 57.081] (==) The core keyboard device wasn't specified explicitly in the layout.
Using the default keyboard configuration.
[ 57.081] (II) Loader magic: 0x55878e2b8e20
[ 57.081] (II) Module ABI versions:
[ 57.081] X.Org ANSI C Emulation: 0.4
[ 57.081] X.Org Video Driver: 24.0
[ 57.081] X.Org XInput driver : 24.1
[ 57.081] X.Org Server Extension : 10.0
[ 57.081] (--) using VT number 2
[ 57.081] (II) systemd-logind: logind integration requires -keeptty and -keeptty was not provided, disabling logind integration
[ 57.082] (II) xfree86: Adding drm device (/dev/dri/card0)
[ 57.082] (EE) /dev/dri/card0: failed to set DRM interface version 1.4: Permission denied
[ 57.085] (--) PCI:*(1@0:0:0) 10de:134d:103c:82c1 rev 162, Mem @ 0xdc000000/16777216, 0xb0000000/268435456, 0xc0000000/33554432, I/O @ 0x0000e000/128, BIOS @ 0x????????/524288
[ 57.085] (II) LoadModule: "glx"
[ 57.085] (II) Loading /usr/lib/xorg/modules/extensions/libglx.so
[ 57.113] (II) Module glx: vendor="X.Org Foundation"
[ 57.113] compiled for 1.20.4, module version = 1.0.0
[ 57.113] ABI class: X.Org Server Extension, version 10.0
[ 57.113] (II) LoadModule: "nvidia"
[ 57.113] (II) Loading /usr/lib/nvidia/nvidia/nvidia_drv.so
[ 57.116] (II) Module nvidia: vendor="NVIDIA Corporation"
[ 57.116] compiled for 1.6.99.901, module version = 1.0.0
[ 57.116] Module class: X.Org Video Driver
[ 57.117] (II) LoadModule: "mouse"
[ 57.117] (II) Loading /usr/lib/xorg/modules/input/mouse_drv.so
[ 57.118] (II) Module mouse: vendor="X.Org Foundation"
[ 57.118] compiled for 1.20.0, module version = 1.9.3
[ 57.118] Module class: X.Org XInput Driver
[ 57.118] ABI class: X.Org XInput driver, version 24.1
[ 57.118] (II) LoadModule: "kbd"
[ 57.118] (WW) Warning, couldn't open module kbd
[ 57.118] (EE) Failed to load module "kbd" (module does not exist, 0)
[ 57.119] (II) NVIDIA dlloader X Driver 430.50 Thu Sep 5 22:43:53 CDT 2019
[ 57.119] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[ 57.119] (II) Loading sub module "fb"
[ 57.119] (II) LoadModule: "fb"
[ 57.120] (II) Loading /usr/lib/xorg/modules/libfb.so
[ 57.120] (II) Module fb: vendor="X.Org Foundation"
[ 57.120] compiled for 1.20.4, module version = 1.0.0
[ 57.120] ABI class: X.Org ANSI C Emulation, version 0.4
[ 57.120] (II) Loading sub module "wfb"
[ 57.120] (II) LoadModule: "wfb"
[ 57.120] (II) Loading /usr/lib/xorg/modules/libwfb.so
[ 57.121] (II) Module wfb: vendor="X.Org Foundation"
[ 57.121] compiled for 1.20.4, module version = 1.0.0
[ 57.121] ABI class: X.Org ANSI C Emulation, version 0.4
[ 57.121] (II) Loading sub module "ramdac"
[ 57.121] (II) LoadModule: "ramdac"
[ 57.122] (II) Module "ramdac" already built-in
[ 57.123] (II) NVIDIA(0): Creating default Display subsection in Screen section
"Default Screen Section" for depth/fbbpp 24/32
[ 57.123] (==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32
[ 57.123] (==) NVIDIA(0): RGB weight 888
[ 57.123] (==) NVIDIA(0): Default visual is TrueColor
[ 57.123] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
[ 57.123] (**) NVIDIA(0): Option "ProbeAllGpus" "false"
[ 57.123] (**) NVIDIA(0): Option "UseEDID" "false"
[ 57.123] (**) NVIDIA(0): Option "UseDisplayDevice" "none"
[ 57.123] (**) NVIDIA(0): Enabling 2D acceleration
[ 57.123] (**) NVIDIA(0): Ignoring EDIDs
[ 57.123] (**) NVIDIA(0): Option "UseDisplayDevice" set to "none"; enabling NoScanout
[ 57.123] (**) NVIDIA(0): mode
[ 57.123] (II) Loading sub module "glxserver_nvidia"
[ 57.123] (II) LoadModule: "glxserver_nvidia"
[ 57.123] (II) Loading /usr/lib/nvidia/nvidia/libglxserver_nvidia.so
[ 57.153] (II) Module glxserver_nvidia: vendor="NVIDIA Corporation"
[ 57.153] compiled for 1.6.99.901, module version = 1.0.0
[ 57.153] Module class: X.Org Server Extension
[ 57.153] (II) NVIDIA GLX Module 430.50 Thu Sep 5 22:41:46 CDT 2019
[ 57.156] (II) NVIDIA(0): NVIDIA GPU GeForce 940MX (GM108-A) at PCI:1:0:0 (GPU-0)
[ 57.156] (--) NVIDIA(0): Memory: 2097152 kBytes
[ 57.156] (--) NVIDIA(0): VideoBIOS: 82.08.62.00.15
[ 57.156] (II) NVIDIA(0): Detected PCI Express Link width: 4X
[ 57.156] (II) NVIDIA(0): Validated MetaModes:
[ 57.156] (II) NVIDIA(0): "NULL"
[ 57.156] (II) NVIDIA(0): Virtual screen size determined to be 640 x 480
[ 57.156] (WW) NVIDIA(0): Unable to get display device for DPI computation.
[ 57.156] (==) NVIDIA(0): DPI set to (75, 75); computed from built-in default
[ 57.156] (II) NVIDIA: Using 6144.00 MB of virtual memory for indirect memory
[ 57.156] (II) NVIDIA: access.
[ 57.185] (II) NVIDIA(0): Setting mode "NULL"
[ 57.189] (==) NVIDIA(0): Disabling shared memory pixmaps
[ 57.189] (==) NVIDIA(0): Backing store enabled
[ 57.189] (==) NVIDIA(0): Silken mouse enabled
[ 57.189] (==) NVIDIA(0): DPMS enabled
[ 57.189] (WW) NVIDIA(0): Option "NoLogo" is not used
[ 57.189] (II) Loading sub module "dri2"
[ 57.189] (II) LoadModule: "dri2"
[ 57.189] (II) Module "dri2" already built-in
[ 57.189] (II) NVIDIA(0): [DRI2] Setup complete
[ 57.189] (II) NVIDIA(0): [DRI2] VDPAU driver: nvidia
[ 57.189] (II) Initializing extension Generic Event Extension
[ 57.190] (II) Initializing extension SHAPE
[ 57.190] (II) Initializing extension MIT-SHM
[ 57.190] (II) Initializing extension XInputExtension
[ 57.190] (II) Initializing extension XTEST
[ 57.190] (II) Initializing extension BIG-REQUESTS
[ 57.190] (II) Initializing extension SYNC
[ 57.190] (II) Initializing extension XKEYBOARD
[ 57.190] (II) Initializing extension XC-MISC
[ 57.190] (II) Initializing extension SECURITY
[ 57.191] (II) Initializing extension XFIXES
[ 57.191] (II) Initializing extension RENDER
[ 57.191] (II) Initializing extension RANDR
[ 57.191] (II) Initializing extension COMPOSITE
[ 57.191] (II) Initializing extension DAMAGE
[ 57.191] (II) Initializing extension MIT-SCREEN-SAVER
[ 57.191] (II) Initializing extension DOUBLE-BUFFER
[ 57.191] (II) Initializing extension RECORD
[ 57.191] (II) Initializing extension DPMS
[ 57.192] (II) Initializing extension Present
[ 57.192] (II) Initializing extension DRI3
[ 57.192] (II) Initializing extension X-Resource
[ 57.192] (II) Initializing extension XVideo
[ 57.192] (II) Initializing extension XVideo-MotionCompensation
[ 57.192] (II) Initializing extension SELinux
[ 57.192] (II) SELinux: Disabled on system
[ 57.192] (II) Initializing extension GLX
[ 57.192] (II) Initializing extension GLX
[ 57.192] (II) Indirect GLX disabled.
[ 57.192] (II) GLX: Another vendor is already registered for screen 0
[ 57.192] (II) Initializing extension XFree86-VidModeExtension
[ 57.192] (II) Initializing extension XFree86-DGA
[ 57.193] (II) Initializing extension XFree86-DRI
[ 57.193] (II) Initializing extension DRI2
[ 57.193] (II) Initializing extension NV-GLX
[ 57.193] (II) Initializing extension NV-CONTROL
[ 57.202] (EE)
[ 57.202] (EE) Backtrace:
[ 57.205] (EE) 0: /usr/lib/xorg/Xorg (OsLookupColor+0x139) [0x55878e22b2c9]
[ 57.206] (EE) 1: /lib/x86_64-linux-gnu/libpthread.so.0 (funlockfile+0x50) [0x7f784993555f]
[ 57.206] (EE) 2: /lib/x86_64-linux-gnu/libc.so.6 (memcpy+0x1f) [0x7f7849803e4f]
[ 57.207] (EE) 3: /usr/lib/x86_64-linux-gnu/libnvidia-glcore.so.430.50 (_nv043glcore+0x27ff09) [0x7f7848425809]
[ 57.207] (EE) 4: /usr/lib/x86_64-linux-gnu/libnvidia-glcore.so.430.50 (_nv043glcore+0x28006d) [0x7f7848425b2d]
[ 57.208] (EE) 5: /usr/lib/x86_64-linux-gnu/libnvidia-glcore.so.430.50 (_nv015glcore+0x49ab8) [0x7f7847ed7fd8]
[ 57.208] (EE) unw_get_proc_name failed: no unwind info found [-10]
[ 57.208] (EE) 6: /usr/lib/nvidia/nvidia/libglxserver_nvidia.so (?+0x0) [0x7f7845e30d32]
[ 57.208] (EE)
[ 57.208] (EE) Segmentation fault at address 0x7f7845f2a000
[ 57.208] (EE)
Fatal server error:
[ 57.208] (EE) Caught signal 11 (Segmentation fault). Server aborting
[ 57.208] (EE)
[ 57.208] (EE)
Please consult the The X.Org Foundation support
at http://wiki.x.org
for help.
[ 57.208] (EE) Please also check the log file at "/var/log/Xorg.8.log" for additional information.
[ 57.208] (EE)
[ 57.210] (EE) Server terminated with error (1). Closing log file.
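A generic sanity check worth running on Debian Testing before digging further (a sketch, not taken from the post): a partially upgraded driver stack, where libnvidia-glcore and the server-side GLX module come from different uploads, is a common cause of crashes like this.
dpkg -l | grep -E 'nvidia|bumblebee|primus' | awk '{print $2, $3}'
# all nvidia-* and libnvidia-* entries should report the same upstream version (here 430.50)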
Gaurav
(1123 rep)
Sep 21, 2019, 08:41 PM
• Last activity: Jan 6, 2023, 12:25 PM
3
votes
1
answers
2550
views
What is the difference between optirun and primusrun (bumblebee)
I have installed the Lightworks video editor on Debian Jessie. For best performance it needs to run on a discrete video card with the proprietary driver, an Nvidia GTX 860M in my case. I have installed Bumblebee to switch between video cards as needed. With optirun or primusrun it is possible to run an application using the Nvidia card.
When I use optirun for Lightworks it crashes after startup. When I use primusrun it doesn't, and performance is okay. Why is that? What is the difference between the two?
This question has been asked before, but remains unanswered.
This answer on a different question does allude to a difference, but doesn't explain it.
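For context, a sketch of how the two bridges are invoked; the test program is arbitrary, and the bridge can also be selected explicitly with optirun's -b option:
optirun glxgears             # VirtualGL bridge by default; frames are copied via the transport set in bumblebee.conf
primusrun glxgears           # primus bridge: a libGL shim that copies frames back through shared memory
optirun -b primus glxgears   # same primus bridge, selected explicitly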
user200823
Nov 17, 2016, 11:07 AM
• Last activity: Sep 22, 2022, 09:29 PM
4
votes
2
answers
3194
views
Primusrun/Optirun allegedly cannot locate/open config directory: "/etc/bumblebee/xorg.conf.d"
I have recently switched from the open source drivers to nvidia, and then to bumblebee, as instructed by ubuntuforums.org users, to make better use of my two GPUs' capabilities.
It also so happens that it does not seem to work at all; I keep getting this error regardless of which command I try.
Doing ll /etc/bumblebee/xorg.conf.d I can see that this presumed config file is actually a directory. I am not too knowledgeable about this, but I think that is correct given the .d extension.
Full error:
optirun glxgears
[ 9546.928811] [ERROR]Cannot access secondary GPU - error: [XORG] (EE) Unable to locate/open config directory: "/etc/bumblebee/xorg.conf.d"
primusrun glxgears
primus: fatal: Bumblebee daemon reported: error: [XORG] (EE) Unable to locate/open config directory: "/etc/bumblebee/xorg.conf.d"
EDIT:
I didn't specify it, but yes, xorg.conf.d exists, and if I run ll on that directory my result is this:
ll /etc/bumblebee/xorg.conf.d
total 8
drwxr-xr-x 2 root root 4096 jan 2 14:54 ./
drwxr-xr-x 3 root root 4096 jun 18 22:55 ../
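A minimal sketch of what to check next, assuming the stock Bumblebee layout (the directory being empty is normally fine; the error usually points at how the daemon launches the secondary X server rather than at the directory itself):
ls -ld /etc/bumblebee /etc/bumblebee/xorg.conf.d       # both must be readable and traversable by root
grep -n 'XorgConfDir\|XorgConfFile' /etc/bumblebee/bumblebee.conf
sudo systemctl restart bumblebeed
journalctl -u bumblebeed -b --no-pager | tail -n 20    # the secondary Xorg's error lines end up here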
kz-n
(41 rep)
Jun 19, 2021, 05:42 PM
• Last activity: Mar 11, 2022, 02:11 PM
1
votes
1
answers
1675
views
X reads monitor and layout conf on boot but xrandr only shows Screen0
I've been setting up Arch Linux for a few days - mostly it's been easy to make progress, but I'm having a real problem with multihead support.
I have a Thinkpad X1 Extreme with an Intel and an NVIDIA video chip and hybrid graphics.
I've installed:
- xf86-video-intel
- xf86-video-vesa
- bumblebee
- bbswitch
- lightdm
- i3wm
This is the xorg conf I'm using:
Section "ServerLayout"
Identifier "Layout0"
Screen 0 "Screen0" 0 0
Screen 1 "Screen1" Above "Screen0"
InputDevice "Mouse0" "CorePointer"
InputDevice "Keyboard0" "CoreKeyboard"
EndSection
Section "Module"
Load "dbe"
Load "extmod"
Load "type1"
Load "freetype"
Load "glx"
EndSection
Section "InputDevice"
# generated from default
Identifier "Mouse0"
Driver "mouse"
Option "Protocol" "auto"
Option "Device" "/dev/psaux"
Option "Emulate3Buttons" "no"
Option "ZAxisMapping" "4 5"
EndSection
Section "InputDevice"
# generated from default
Identifier "Keyboard0"
Driver "keyboard"
EndSection
Section "Device"
Identifier "intelgpu0"
Driver "intel"
Screen 0
Option "XvMC" "true"
Option "UseEvents" "true"
Option "AccelMethod" "UXA"
BusID "PCI:00:02:0"
EndSection
Section "Device"
Identifier "intelgpu1"
Driver "intel"
Screen 1
Option "XvMC" "true"
Option "UseEvents" "true"
Option "AccelMethod" "UXA"
BusID "PCI:00:02:0"
EndSection
Section "Device"
Identifier "nvidiagpu0"
Driver "nvidia"
BusID "PCI:01:00:0"
EndSection
Section "Monitor"
Identifier "Monitor0"
Option "Enable" "true"
EndSection
Section "Monitor"
Identifier "Monitor1"
Option "Enable" "true"
EndSection
Section "Screen"
Identifier "Screen0"
Device "intelgpu0"
Monitor "Monitor0"
DefaultDepth 24
Option "TwinView" "0"
SubSection "Display"
Depth 24
EndSubSection
EndSection
Section "Screen"
Identifier "Screen1"
Device "intelgpu1"
Monitor "Monitor1"
DefaultDepth 24
Option "TwinView" "0"
SubSection "Display"
Depth 24
EndSubSection
EndSection
This is the lspci -v output of the two video devices:
00:02.0 VGA compatible controller: Intel Corporation Device 3e9b (prog-if 00 [VGA controller])
Subsystem: Lenovo Device 2267
Flags: bus master, fast devsel, latency 0, IRQ 171
Memory at 404a000000 (64-bit, non-prefetchable) [size=16M]
Memory at 80000000 (64-bit, prefetchable) [size=256M]
I/O ports at 6000 [size=64]
[virtual] Expansion ROM at 000c0000 [disabled] [size=128K]
Capabilities:
Kernel driver in use: i915
Kernel modules: i915
01:00.0 VGA compatible controller: NVIDIA Corporation GP107M [GeForce GTX 1050 Ti Mobile] (rev a1) (prog-if 00 [VGA controller])
Subsystem: Lenovo GP107M [GeForce GTX 1050 Ti Mobile]
Flags: bus master, fast devsel, latency 0, IRQ 16
Memory at bf000000 (32-bit, non-prefetchable) [size=16M]
Memory at 60000000 (64-bit, prefetchable) [size=256M]
Memory at 70000000 (64-bit, prefetchable) [size=32M]
I/O ports at 5000 [size=128]
Expansion ROM at c0000000 [disabled] [size=512K]
Capabilities:
Kernel driver in use: nvidia
Kernel modules: nouveau, nvidia_drm, nvidia
When I boot, X seems to pick up the layout and screens alright (previously I had a hard time getting it to read past the first screen section):
[ 2.312] (==) Using config directory: "/etc/X11/xorg.conf.d"
[ 2.312] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
[ 2.314] (==) ServerLayout "Layout0"
[ 2.314] (**) |-->Screen "Screen0" (0)
[ 2.314] (**) | |-->Monitor "Monitor0"
[ 2.315] (**) | |-->Device "intelgpu0"
[ 2.315] (**) |-->Screen "Screen1" (1)
[ 2.315] (**) | |-->Monitor "Monitor1"
[ 2.315] (**) | |-->Device "intelgpu1"
[ 2.315] (**) |-->Input Device "Mouse0"
[ 2.315] (**) |-->Input Device "Keyboard0"
[ 2.315] (==) Automatically adding devices
[ 2.315] (==) Automatically enabling devices
[ 2.315] (==) Automatically adding GPU devices
[ 2.315] (==) Automatically binding GPU devices
([pastebin of the whole Xorg.0.log](https://pastebin.com/dhpu7awL))
and bbswitch seems to recognize and manage both video devices, albeit with some kind of problem:
[ 2.272033] bbswitch: loading out-of-tree module taints kernel.
[ 2.272053] bbswitch: module verification failed: signature and/or required key missing - tainting kernel
[ 2.272220] bbswitch: version 0.8
[ 2.272238] bbswitch: Found integrated VGA device 0000:00:02.0: \_SB_.PCI0.GFX0
[ 2.272243] bbswitch: Found discrete VGA device 0000:01:00.0: \_SB_.PCI0.PEG0.PEGP
[ 2.272435] bbswitch: detected an Optimus _DSM function
[ 2.272526] bbswitch: Succesfully loaded. Discrete card 0000:01:00.0 is on
[ 2.276310] bbswitch: disabling discrete graphics
([pastebin of the entire output of dmesg](https://pastebin.com/unjfqJ9F))
but still, xrandr and xrandr --listproviders only show 1 screen and 1 provider:
$ xrandr
Screen 0: minimum 320 x 200, current 1920 x 1080, maximum 8192 x 8192
eDP1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 194mm
1920x1080 60.03*+ 59.96 59.93 47.99
1680x1050 59.95 59.88
...
$ xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x43 cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 3 outputs: 1 associated providers: 0 name:Intel
I can't use xrandr to turn on or move around displays because the output just isn't available after boot (whether a monitor is plugged in or not).
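With Bumblebee, the NVIDIA card is powered off by bbswitch on the main display, so it never appears as a second RandR provider; the usual workaround is intel-virtual-output from xf86-video-intel, which exposes the NVIDIA-driven heads as VIRTUAL outputs. A sketch, assuming the stock packages:
optirun true            # confirm the secondary X server on the NVIDIA GPU starts at all
intel-virtual-output    # bridges the NVIDIA-wired connectors into the Intel X screen
xrandr                  # VIRTUAL1, VIRTUAL2, ... should now be listed and usable as normal outputs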
Carson Myers
(111 rep)
Sep 26, 2018, 05:56 PM
• Last activity: Jan 17, 2022, 12:05 AM
3
votes
1
answers
4944
views
Optirun/Bumblebee: Cannot access secondary GPU
I use Arch Linux with the zen kernel. After doing system upgrades my dual GPU setup broke. I use bumblebee to handle dual GPUs.
When I run optirun glxgears I get the following output:
[ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.
[ERROR]Aborting because fallback start is disabled.
When I try opening the Nvidia-settings panel I get the following output:
ERROR: Unable to load info from any available system
My current Nvidia driver version is 495.44
And here is my bumblebee config if that helps:
/etc/bumblebee/bumblebee.conf
# Configuration file for Bumblebee. Values should **not** be put between quotes
## Server options. Any change made in this section will need a server restart
# to take effect.
[bumblebeed]
# The secondary Xorg server DISPLAY number
VirtualDisplay=:8
# Should the unused Xorg server be kept running? Set this to true if waiting
# for X to be ready is too long and don't need power management at all.
KeepUnusedXServer=false
# The name of the Bumbleblee server group name (GID name)
ServerGroup=bumblebee
# Card power state at exit. Set to false if the card shoud be ON when Bumblebee
# server exits.
TurnCardOffAtExit=false
# The default behavior of '-f' option on optirun. If set to "true", '-f' will
# be ignored.
NoEcoModeOverride=false
# The Driver used by Bumblebee server. If this value is not set (or empty),
# auto-detection is performed. The available drivers are nvidia and nouveau
# (See also the driver-specific sections below)
Driver=nvidia
# Directory with a dummy config file to pass as a -configdir to secondary X
XorgConfDir=/etc/bumblebee/xorg.conf.d
## Client options. Will take effect on the next optirun executed.
[optirun]
# Acceleration/ rendering bridge, possible values are auto, virtualgl and
# primus.
Bridge=auto
# The method used for VirtualGL to transport frames between X servers.
# Possible values are proxy, jpeg, rgb, xv and yuv.
VGLTransport=proxy
# List of paths which are searched for the primus libGL.so.1 when using
# the primus bridge
PrimusLibraryPath=/usr/lib/primus:/usr/lib32/primus
# Should the program run under optirun even if Bumblebee server or nvidia card
# is not available?
AllowFallbackToIGC=false
# Driver-specific settings are grouped under [driver-NAME]. The sections are
# parsed if the Driver setting in [bumblebeed] is set to NAME (or if auto-
# detection resolves to NAME).
# PMMethod: method to use for saving power by disabling the nvidia card, valid
# values are: auto - automatically detect which PM method to use
# bbswitch - new in BB 3, recommended if available
# switcheroo - vga_switcheroo method, use at your own risk
# none - disable PM completely
# https://github.com/Bumblebee-Project/Bumblebee/wiki/Comparison-of-PM-methods
## Section with nvidia driver specific options, only parsed if Driver=nvidia
[driver-nvidia]
# Module name to load, defaults to Driver if empty or unset
KernelDriver=nvidia
PMMethod=bbswitch
# colon-separated path to the nvidia libraries
LibraryPath=/usr/lib/nvidia:/usr/lib32/nvidia:/usr/lib:/usr/lib32
# comma-separated path of the directory containing nvidia_drv.so and the
# default Xorg modules path
XorgModulePath=/usr/lib/nvidia/xorg,/usr/lib/xorg/modules
XorgConfFile=/etc/bumblebee/xorg.conf.nvidia
## Section with nouveau driver specific options, only parsed if Driver=nouveau
[driver-nouveau]
KernelDriver=nouveau
PMMethod=auto
XorgConfFile=/etc/bumblebee/xorg.conf.nouveau
If I'm missing something please let me know!
Thanks!
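Two things worth checking after an upgrade on Arch, sketched below under the assumption of the stock packaging (neither is taken from the original post): the nvidia module may no longer match the running zen kernel, and the secondary X server sometimes needs an explicit BusID to find the card.
lspci -d 10de: -k               # "Kernel driver in use: nvidia" should appear for the discrete card
dmesg | grep -i nvidia          # look for version-mismatch or "failed to load" messages
lspci | grep -i nvidia          # note the address, e.g. 01:00.0
# then add  BusID "PCI:1:0:0"  to the Device section in /etc/bumblebee/xorg.conf.nvidia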
pro
(55 rep)
Nov 20, 2021, 04:57 AM
• Last activity: Nov 21, 2021, 06:12 PM
1
votes
2
answers
876
views
NvidiaSettings not working on Bumblebee Arch Linux
So, I have a fresh Arch install. I have Bumblebee, mesa, the Intel drivers, bbswitch, etc. The problem is that Nvidia Settings says there is no X configuration file when I launch it. I created one with nvidia-xconfig, but when I start the X server, it says it cannot find the display. I have no idea what to do to get it to work. All I need is for it either to be similar to Linux Mint, where I can click and switch between integrated and dedicated graphics, or to get Nvidia Settings to work so I can turn on v-sync because of all the tearing. Or I might just have to ditch it all and only install one set of drivers.
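One detail that often trips people up, sketched here rather than taken from the post: on a Bumblebee setup the NVIDIA X server lives on display :8, so nvidia-settings has to be pointed at it through optirun instead of writing an xorg.conf for the main display (this is the same invocation used in a later question on this page):
optirun nvidia-settings -c :8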
Kenneth Clark
(131 rep)
Apr 13, 2016, 10:44 AM
• Last activity: Nov 17, 2021, 08:33 PM
2
votes
1
answers
667
views
Xorg works only when bbswitch is ON?
I've got a weird problem.
I installed bumblebee, and when I start Xorg with startx it freezes the system completely (cannot switch tty or REISUB). But when I use bbswitch to turn my dedicated graphics card ON, startx **does** work and, seeing the differences in fps when running glxspheres64 and optirun glxspheres64, it looks like bumblebee does work correctly.
I thought this should work even when bbswitch is OFF, because bumblebee will start the dedicated graphics when necessary?
My system:
- Model: Asus N551VW
- OS: Arch
- CPU: Intel i7 6700HQ
- Dedicated: Nvidia 960m
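For reproducing the symptom deliberately, bbswitch's documented /proc interface can be toggled by hand (a generic sketch; this is standard bbswitch usage, not from the post):
cat /proc/acpi/bbswitch                  # current state of the discrete card: ON or OFF
sudo tee /proc/acpi/bbswitch <<< OFF     # power the card down, then try startx
sudo tee /proc/acpi/bbswitch <<< ON      # power it back up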
Wouter92
(131 rep)
Jan 17, 2016, 03:23 PM
• Last activity: Nov 17, 2021, 12:54 PM
8
votes
1
answers
6435
views
HDMI port doesn't work Nvidia/Intel Bumblebee Driver for Laptop with Manjaro Linux
## Informations
OS : Manjaro Linux 16.10 (Cinnamon Community Edition)
$ cat /etc/*-release
DISTRIB_ID=ManjaroLinux
DISTRIB_RELEASE=16.10
DISTRIB_CODENAME=Fringilla
DISTRIB_DESCRIPTION="Manjaro Linux"
Manjaro Linux
NAME="Manjaro Linux"
ID=manjaro
PRETTY_NAME="Manjaro Linux"
ANSI_COLOR="1;32"
HOME_URL="http://www.manjaro.org/ "
SUPPORT_URL="http://www.manjaro.org/ "
BUG_REPORT_URL="http://bugs.manjaro.org/ "
Type : Laptop
Kernel : 4.4.28-2-MANJARO
CPU : Intel(R) Core(TM) i7-3610QM CPU @ 2.30GHz
GPUs :
* Intel Corporation 3rd Gen Core processor Graphics Controller (rev 09)
* NVIDIA Corporation GF108M [GeForce GT 630M] (rev ff)
PCI Addresses :
* Intel iGPU : 00:02.0
* Nvidia GPU : 01:00.0
Driver from Bumblebee.
## Problem
I found (and read) the NVIDIA and NVIDIA Optimus ArchLinux wiki pages (because yes, Manjaro is ArchLinux based).
But when I installed Manjaro I installed it with NON-FREE drivers, so nvidia, nvidia-libgl and xorg-xrandr are already installed and up-to-date.
And my **HDMI port is not working**. I think that's because only the Intel iGPU is present in the X11 conf (the HDMI port is part of the Nvidia GPU).
While I was trying to find out why, I found that Manjaro does an [Automated Identification and Installation](https://wiki.manjaro.org/index.php/Configure_Graphics_Cards#Automated_Identification_and_Installation) of GPUs during the install.
$ sudo mhwd-gpu --check
[sudo] password for shark:
Using default
Default lib32 support: true
xorg configuration symlink valid...
libGl and libglx symlinks valid...
$ sudo mhwd-gpu --status
Using default
Default lib32 support: true
:: status
lib32-libGl: '/usr/lib32/mesa/libGL.so.1.2.0'
lib32-libGLESv1: '/usr/lib32/mesa/libGLESv1_CM.so.1.1.0'
lib32-libGLESv2: '/usr/lib32/mesa/libGLESv2.so.2.0.0'
lib32-libEGL: '/usr/lib32/mesa/libEGL.so.1.0.0'
libGl: '/usr/lib/mesa/libGL.so.1.2.0'
libGLESv1: '/usr/lib/mesa/libGLESv1_CM.so.1.1.0'
libGLESv2: '/usr/lib/mesa/libGLESv2.so.2.0.0'
libEGL: '/usr/lib/mesa/libEGL.so.1.0.0'
libglx: '/usr/lib/xorg/modules/extensions/libglx.xorg'
xorg configuration file: '/etc/X11/mhwd.d/intel.conf'
The Manjaro Settings Manager (Hardware configuration) is telling me that the hybrid bumblebee driver is installed in both cases (see screenshot).

The /etc/X11/xorg.conf.d folder only contains the symlink 90-mhwd.conf -> /etc/X11/mhwd.d/intel.conf.
/etc/X11/mhwd.d/intel.conf content:
##
## Generated by mhwd - Manjaro Hardware Detection
##
Section "Device"
Identifier "Device0"
Driver "intel"
BusID "PCI:0:2:0"
Option "AccelMethod" "sna"
Option "DRI" "true"
EndSection
Section "DRI"
Group "video"
Mode 0666
EndSection
Section "Extensions"
Option "Composite" "Enable"
Option "RENDER" "Enable"
EndSection
Section "InputClass"
Identifier "Keyboard Defaults"
MatchIsKeyboard "yes"
Option "XkbOptions" "terminate:ctrl_alt_bksp"
EndSection
So what do I have to do to make my HDMI port work? And how can I check whether my Nvidia GPU is working or not?
I don't think I need to install more drivers, but I can't figure out whether I need to use the mhwd tool provided by Manjaro to configure some more settings, whether I need to create a new /etc/X11/xorg.conf.d/20-nvidia.conf file, or even whether I need to replace the intel.conf with an nvidia.conf.
This may help too:
$ glxinfo | grep OpenGL
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Mobile
OpenGL core profile version string: 3.3 (Core Profile) Mesa 13.0.0-rc2
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 13.0.0-rc2
OpenGL shading language version string: 1.30
OpenGL context flags: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.0 Mesa 13.0.0-rc2
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.00
OpenGL ES profile extensions:
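A quick way to answer the "is my Nvidia GPU working at all" part and to reach an HDMI port that is wired to it, sketched with the standard Bumblebee and xf86-video-intel tools (not taken from the original post):
optirun glxinfo | grep "OpenGL renderer"    # should report the GeForce GT 630M instead of the Intel chip
intel-virtual-output                        # exposes the Nvidia-wired outputs as VIRTUAL heads on the Intel screen
xrandr                                      # the HDMI output should then appear as a VIRTUALx output and can be enabled as usual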
**Edit**: Is this post https://unix.stackexchange.com/questions/25120/xrandr-doesnt-detect-monitor-on-hdmi-port answering my question?
## Why I am questioning here
I apologize for asking this question here, but the ArchLinux forum says this:
> These boards are for the support of Arch Linux, and Arch ONLY If you
> have installed Archbang, Antegros, Chakra, Evo/Lution, Manjaro,
> Whatever, you are NOT running Arch Linux. Similarly, if you followed
> some random video on YouTube or used an automated script you found on
> a blog, you are NOT running Arch Linux, so do not expect any support,
> sympathy or anything but your thread being closed and told to move
> along. Arch is a DIY distro: if someone else has done it for you, then
> showing up here asking to have your hand held for more help is just
> help vampirism and is not welcome.
and I never received the confirmation mail for the Manjaro Linux forum.
I have made search and configuration efforts, so I hope this is not help vampirism as some call it. Plus I really don't want to trash my distro by installing the wrong drivers (it already happened to me when I installed Nvidia drivers from their website; now I know that's a very bad idea).
noraj
(425 rep)
Nov 2, 2016, 08:24 PM
• Last activity: Nov 17, 2021, 12:52 PM
2
votes
4
answers
3978
views
NVIDIA GTX 1650 not detected on Debian 10
# Solved
It was a hardware problem; my motherboard was broken. Fixed now.
# Problem
I can't figure out how to install the Nvidia driver on my laptop.
(I've been a Linux user for only 4-5 days, but I think I've tried hard enough.)
paraduxos@ASUSDOGE:/$ neofetch
_,met$$$$$gg. paraduxos@ASUSDOGE
,g$$$$$$$$$$$$$$$P. ------------------
,g$$P" """Y$$.". OS: Debian GNU/Linux 10 (buster) x86_64
,$$P' `$$$. Host: ROG Strix G531GT_G531GT 1.0
',$$P ,ggs. `$$b: Kernel: 4.19.0-8-amd64
`d$$' ,$P"' . $$$ Uptime: 1 hour, 42 mins
$$P d$' , $$P Packages: 2256 (dpkg)
$$: $$. - ,d$$' Shell: bash 5.0.3
$$; Y$b._ _,d$P' Resolution: 1920x1080
Y$$.
.
"Y$$$$P"' DE: Xfce
`$$b "-.__ WM: Xfwm4
`Y$$ WM Theme: Default
`Y$$. Theme: Xfce [GTK2], Adwaita [GTK3]
`$$b. Icons: Tango [GTK2], Adwaita [GTK3]
`Y$$b. Terminal: xfce4-terminal
`"Y$b._ Terminal Font: Monospace 12
`""" CPU: Intel i7-9750H (12) @ 4.500GHz
GPU: Intel UHD Graphics 630
Memory: 1434MiB / 7828MiB
I'm using laptop: ASUS ROG STRIX G 531GT (GPU: NVIDIA GeForce GTX 1650, Intel on-board)
paraduxos@ASUSDOGE:/$ lspci
00:00.0 Host bridge: Intel Corporation 8th Gen Core Processor Host Bridge/DRAM Registers (rev 07)
00:02.0 VGA compatible controller: Intel Corporation UHD Graphics 630 (Mobile)
00:04.0 Signal processing controller: Intel Corporation Skylake Processor Thermal Subsystem (rev 07)
00:08.0 System peripheral: Intel Corporation Skylake Gaussian Mixture Model
00:12.0 Signal processing controller: Intel Corporation Cannon Lake PCH Thermal Controller (rev 10)
00:14.0 USB controller: Intel Corporation Cannon Lake PCH USB 3.1 xHCI Host Controller (rev 10)
00:14.2 RAM memory: Intel Corporation Cannon Lake PCH Shared SRAM (rev 10)
00:14.3 Network controller: Intel Corporation Wireless-AC 9560 [Jefferson Peak] (rev 10)
00:15.0 Serial bus controller [0c80]: Intel Corporation Cannon Lake PCH Serial IO I2C Controller (rev 10)
00:15.1 Serial bus controller [0c80]: Intel Corporation Cannon Lake PCH Serial IO I2C Controller (rev 10)
00:16.0 Communication controller: Intel Corporation Cannon Lake PCH HECI Controller (rev 10)
00:17.0 SATA controller: Intel Corporation Cannon Lake Mobile PCH SATA AHCI Controller (rev 10)
00:1d.0 PCI bridge: Intel Corporation Cannon Lake PCH PCI Express Root Port (rev f0)
00:1d.6 PCI bridge: Intel Corporation Cannon Lake PCH PCI Express Root Port (rev f0)
00:1f.0 ISA bridge: Intel Corporation Device a30d (rev 10)
00:1f.3 Audio device: Intel Corporation Cannon Lake PCH cAVS (rev 10)
00:1f.4 SMBus: Intel Corporation Cannon Lake PCH SMBus Controller (rev 10)
00:1f.5 Serial bus controller [0c80]: Intel Corporation Cannon Lake PCH SPI Controller (rev 10)
01:00.0 Non-Volatile memory controller: Intel Corporation Device f1a8 (rev 03)
02:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168/8411 PCI Express Gigabit Ethernet Controller (rev 15)
**First**, my laptop can't find NVIDIA GPU
paraduxos@ASUSDOGE:/$ nvidia-detect
No NVIDIA GPU detected.
- I also tried with lspci (as shown above) and lshw (also with sudo); no NVIDIA found.
After doing some research (aka google.com):
- using lspci with grep *something* -> still not found
- installing nvidia-driver -> still not found (and had some problems)
- some said bumblebee is needed (linuxquestions.org)
- some said it's a BIOS problem (forums.developer.nvidia.com): I tried going into the BIOS setup (F2), no NVIDIA there as well (I can capture it, please tell me if you need it.)
I don't know how to configure the BIOS, so I went with the nvidia-driver and bumblebee option.
From Debian wiki, I found 3 wikis that might be related to my problems:
https://wiki.debian.org/NvidiaGraphicsDrivers :
> The NVIDIA graphics processing unit (GPU) series/codename of an installed video card can usually be identified using the lspci command.
>
> Note: if this lspci command returns more than one line of output, you
> have an Optimus (hybrid) graphics chipset, and **the instructions on
> this page do not apply to you.** Check the NVIDIA Optimus page instead.
Well, I got 0 lines of output. But I decided to go with Optimus and discontinue this wiki. (I think I'm right, maybe?)
(I actually came back to this later and installed Version 440.59 (via buster-backports), and after reboot nothing happened.)
I haven't tried the Configuration part, since it states that
So I came to the second wiki
https://wiki.debian.org/NvidiaGraphicsDrivers/Optimus
$ lspci | grep 3D (No output)
This wiki said that there are 2 ways.
> First: Dynamic Graphics Disabled - xrandr and Display Manager Scripts
- This method require BusID from lspci. so I can't go with this method.
> Second: Dynamic Graphics with Bumblebee
paraduxos@ASUSDOGE:/$ glxinfo | grep OpenGL
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) UHD Graphics 630 (Coffeelake 3x8 GT2)
OpenGL core profile version string: 4.5 (Core Profile) Mesa 18.3.6
OpenGL core profile shading language version string: 4.50
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 18.3.6
OpenGL shading language version string: 1.30
OpenGL context flags: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.2 Mesa 18.3.6
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
OpenGL ES profile extensions:
No hybrid GPU?? I don't quite understand the output, so I continued on to install Bumblebee.
https://wiki.debian.org/Bumblebee
Since I'm using Debian 10 (Buster) I followed the wiki, but ran into a problem.
paraduxos@ASUSDOGE:/$ sudo apt install bumblebee-nvidia primus libgl1-nvidia-tesla-glx
Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Unable to locate package libgl1-nvidia-tesla-glx
I tried googling this but nothing seemed to answer my question. So I tried
paraduxos@ASUSDOGE:/$ sudo dpkg --add-architecture i386 && sudo apt update && sudo apt install bumblebee-nvidia primus libgl1-nvidia-glx primus-libs:i386 libgl1-nvidia-glx:i386
Hit:1 http://security.debian.org/debian-security buster/updates InRelease
Hit:2 http://deb.debian.org/debian buster InRelease
Hit:3 http://deb.debian.org/debian buster-updates InRelease
Reading package lists... Done
Building dependency tree
Reading state information... Done
All packages are up to date.
Reading package lists... Done
Building dependency tree
Reading state information... Done
primus-libs:i386 is already the newest version (0~20150328-7).
Some packages could not be installed. This may mean that you have
requested an impossible situation or if you are using the unstable
distribution that some required packages have not yet been created
or been moved out of Incoming.
The following information may help to resolve the situation:
The following packages have unmet dependencies:
libgl1-nvidia-glx : Depends: libnvidia-glcore (= 418.74-1) but it is not going to be installed
Recommends: nvidia-driver-libs-nonglvnd (= 418.74-1) but it is not going to be installed
Recommends: nvidia-kernel-dkms (= 418.74-1) but it is not going to be installed or
nvidia-kernel-418.74
E: Unable to correct problems, you have held broken packages.
I don't know what to do next. Please help.
_____________________________________________________________
**UPDATE:**
- Since sudo apt install bumblebee-nvidia primus libgl1-nvidia-tesla-glx returned E: Unable to locate package libgl1-nvidia-tesla-glx, I removed that package and ran sudo apt install bumblebee-nvidia primus
- I did the same with sudo dpkg --add-architecture i386 && sudo apt update && sudo apt install bumblebee-nvidia primus libgl1-nvidia-glx primus-libs:i386, changing it to sudo dpkg --add-architecture i386 && sudo apt update && sudo apt install bumblebee-nvidia primus primus-libs:i386
After I run bumblebee, it returns
paraduxos@ASUSDOGE:~$ optirun glxgears -info
[ 1097.543100] [ERROR]The Bumblebee daemon has not been started yet or the socket path /var/run/bumblebee.socket was incorrect.
[ 1097.543133] [ERROR]Could not connect to bumblebee daemon - is it running?
_____________________________________________________________
This is my second attempt after re-installing Debian 10
(Live install non-free (XFCE) Debian non-free)
This is my sources.list
# See https://wiki.debian.org/SourcesList for more information.
deb http://deb.debian.org/debian buster main contrib non-free
deb-src http://deb.debian.org/debian buster main contrib non-free
deb http://deb.debian.org/debian buster-updates main contrib non-free
deb-src http://deb.debian.org/debian buster-updates main contrib non-free
deb http://security.debian.org/debian-security/ buster/updates main contrib non-free
deb-src http://security.debian.org/debian-security/ buster/updates main contrib non-free
# buster-backports
# deb http://deb.debian.org/debian buster-backports main contrib non-free
# deb-src http://deb.debian.org/debian buster-backports main contrib non-free
I tried toggling the comments on the backports section (and ran sudo apt update) but still got the same result.
I haven't done anything with my Xorg, .xinit, or anything else.
(I also read the related questions below, but I think I'd better ask here.)
- https://superuser.com/questions/1521457/debian-10-on-hp-desktop-with-geforce-gtx-1650-stuck-on-black-screen-and-cursor
- https://superuser.com/questions/1484109/debian-10-hybrid-graphics-how-to-use-nvidia-drivers-instead-of-nouveau
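In hindsight (the issue turned out to be a broken motherboard), the decisive check is whether the discrete GPU is visible on the PCI bus at all; if it is not, no driver package or Bumblebee configuration can help. A minimal sketch:
lspci -nn | grep -Ei 'vga|3d|display'      # a healthy Optimus laptop shows both the Intel and an NVIDIA entry
sudo dmesg | grep -iE 'nouveau|nvidia'     # kernel messages about the discrete card, if the bus sees it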
uchaosis
(31 rep)
Mar 29, 2020, 01:07 PM
• Last activity: Nov 2, 2020, 06:09 PM
0
votes
0
answers
177
views
Different nvidia-settings in ubuntu and debian
Why do I have all those entries in Ubuntu (on the top), while in Debian (below) I have a lot fewer? I installed the nvidia drivers in Ubuntu through the GUI and in Debian through apt install nvidia-driver.
In Debian I also have some problems, and maybe this has to do with nvidia-settings.
Indeed, after installing the drivers for my Nvidia GeForce 950M, the screen sporadically shows graphical glitches. These consist of flickering, until I move the mouse abruptly.
I also installed bumblebee to try to improve the situation, but nothing changes, and what I need doesn't even appear in nvidia-settings.
I also tried to use nvidia-xconfig, but nothing.

Teo7
(31 rep)
Oct 18, 2020, 12:56 PM
• Last activity: Oct 18, 2020, 11:35 PM
3
votes
1
answers
1413
views
Can't enable proprietary nVidia driver on debian 8 with bumblebee
I am running the Debian 8.0 64bit (Jessie, the stable release) on my HP laptop.
I am having a hard time installing the proprietary nVidia driver for my graphics card
01:00.0 VGA compatible controller [0300]: NVIDIA Corporation GF108M [GeForce GT 630M] [10de:0de9] (rev ff)
I have followed the guides on the Debian wiki step by step.
Bumblebee is required since my graphic card comes with nvidia optimus.
So far so good, I think I have all the right packages required on my machine.
Now I have to enable bumblebee.
It turns out the right command is
sudo optirun nvidia-settings -c :8
A graphical front-end appears; I leave all the default checks and just save the configuration file in the /etc/X11 folder as xorg.conf.
But at the next system restart I get a black screen and X can't start; it says that there are "No screens found".
Here is the log:
[ 74.012] (II) xfree86: Adding drm device (/dev/dri/card0)
[ 74.014] (--) PCI:*(0:0:2:0) 8086:0126:103c:181d rev 9, Mem @ 0xd4000000/4194304, 0xc0000000/268435456, I/O @ 0x000050$
[ 74.014] (II) LoadModule: "glx"
[ 74.015] (II) Loading /usr/lib/xorg/modules/extensions/libglx.so
[ 74.017] (II) Module glx: vendor="X.Org Foundation"
[ 74.017] compiled for 1.16.4, module version = 1.0.0
[ 74.017] ABI class: X.Org Server Extension, version 8.0
[ 74.017] (==) AIGLX enabled
[ 74.017] (II) LoadModule: "nvidia"
[ 74.017] (WW) Warning, couldn't open module nvidia
[ 74.017] (II) UnloadModule: "nvidia"
[ 74.017] (II) Unloading nvidia
[ 74.018] (EE) Failed to load module "nvidia" (module does not exist, 0)
[ 74.018] (EE) No drivers available.
[ 74.018] (EE)
Fatal server error:
[ 74.018] (EE) no screens found(EE)
[ 74.018] (EE)
Please consult the The X.Org Foundation support
at ....
for help.
[ 74.018] (EE) Please also check the log file at "/home/zarathushtra/.local/share/xorg/Xorg.0.log" for additional infor$
[ 74.018] (EE)
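What the log above shows is the main X server (running on the Intel GPU) being forced to load the nvidia driver, which on a Bumblebee setup is only available to the secondary :8 server. A sketch of the usual way out, assuming the saved xorg.conf is the only change that was made:
sudo rm /etc/X11/xorg.conf          # the generated file applies to display :0, not to Bumblebee's :8 server
# settings can instead be applied in a running session with the command from above:
optirun nvidia-settings -c :8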
Hysoka
(31 rep)
Jan 28, 2016, 06:06 PM
• Last activity: Aug 24, 2020, 03:16 PM
0
votes
1
answers
311
views
Debian unable to locate package primus-nvidia
I am attempting to install three packages through the following command:
apt install bumblebee-nvidia primus-nvidia nvidia-smi
The bumblebee-nvidia and nvidia-smi packages install individually; however, primus-nvidia can't be found. I think I might be missing a line in my sources.list. Here is my current sources.list:
deb http://deb.debian.org/debian/ buster main non-free contrib
deb-src http://deb.debian.org/debian/ buster main non-free contrib
deb http://security.debian.org/debian-security buster/updates main contrib non-free
deb-src http://security.debian.org/debian-security buster/updates main contrib non-free
# buster-updates, previously known as 'volatile'
deb http://deb.debian.org/debian/ buster-updates main contrib non-free
deb-src http://deb.debian.org/debian/ buster-updates main contrib non-free
What do I need to do to install primus-nvidia? Any help is appreciated.
For reference, I installed debian through a non-free iso
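A quick way to see which primus/bumblebee packages the configured buster repositories actually provide (a generic sketch; package availability differs between Debian releases, and primus-nvidia may simply not exist in buster):
apt-cache search primus
apt-cache search bumblebee
apt-cache policy primus-nvidia      # shows candidate versions, or nothing if no configured repo carries it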
Myth
(1 rep)
Jun 12, 2020, 11:20 AM
• Last activity: Jun 12, 2020, 12:01 PM
0
votes
1
answers
184
views
Installing CUDA 7.5 on Debian with packaged bumblebee
My laptop, which runs Debian Jessie, has switchable graphics cards (Intel + NVIDIA GT 520M), and for this I have installed Bumblebee & Primus, which depends on the NVIDIA driver in the package manager. Now I need to install CUDA 7.5, and therefore need a more recent driver (even the backports one is too old). However, to install the one bundled with CUDA, I need to remove the one from the package manager, which is not allowed due to the dependency.
How can I make the packaged bumblebee work while using a non packaged driver?
kalj
(111 rep)
Mar 1, 2016, 03:43 PM
• Last activity: May 2, 2020, 09:59 AM
1
votes
1
answers
4457
views
How do I permanently disable a specific display in gdm?
My laptop has an unknown display always active in the background. It is called "unknown display" in the system settings on Ubuntu GNOME, and it was the same when I was using Unity. I have once installed Enlightenment before, and I also had to explicitly switch off that display in the settings. This...
My laptop has an unknown display always active in the background. It is called "unknown display" in the system settings on Ubuntu GNOME, and it was the same when I was using Unity. I have once installed Enlightenment before, and I also had to explicitly switch off that display in the settings. This was no problem for me since I only need to switch them off once.
Now that I'm using Ubuntu GNOME, gdm seems to behave as if there is another display to the right, beyond the built-in display. I can easily switch it off by using Ctrl+P, but this solution is only temporary. This problem currently causes some rendering faults whenever I move my mouse before logging in, and it causes the background image to be misplaced to the right as the display manager tries to find the center of the two displays.
My laptop has a VGA port and an HDMI port. I have once tried connecting my laptop to a TV through HDMI, and the result was that the "Unknown Display" disappeared and was replaced by "HDMI *something*".
I would like to permanently disable the "Unknown Display" in gdm. How can I do that?
**Update:** xrandr output:
Screen 0: minimum 8 x 8, current 1366 x 768, maximum 32767 x 32767
LVDS1 connected primary 1366x768+0+0 (normal left inverted right x axis y axis) 309mm x 174mm
1366x768 60.0*+
1360x768 59.8 60.0
1024x768 60.0
800x600 60.3 56.2
640x480 59.9
VGA1 disconnected (normal left inverted right x axis y axis)
HDMI1 disconnected (normal left inverted right x axis y axis)
DP1 disconnected (normal left inverted right x axis y axis)
VIRTUAL1 disconnected (normal left inverted right x axis y axis)
LVDS-1-2 disconnected
VGA-1-2 connected
1024x768 60.0
800x600 60.3 56.2
848x480 60.0
640x480 59.9
HDMI-1-2 disconnected
1024x768 (0x45) 65.0MHz
h: width 1024 start 1048 end 1184 total 1344 skew 0 clock 48.4KHz
v: height 768 start 771 end 777 total 806 clock 60.0Hz
800x600 (0x46) 40.0MHz
h: width 800 start 840 end 968 total 1056 skew 0 clock 37.9KHz
v: height 600 start 601 end 605 total 628 clock 60.3Hz
800x600 (0x47) 36.0MHz
h: width 800 start 824 end 896 total 1024 skew 0 clock 35.2KHz
v: height 600 start 601 end 603 total 625 clock 56.2Hz
Update:
Output of lshw -C video
WARNING: you should run this program as super-user.
*-display
description: VGA compatible controller
product: GF108M [GeForce 610M]
vendor: NVIDIA Corporation
physical id: 0
bus info: pci@0000:01:00.0
version: a1
width: 64 bits
clock: 33MHz
capabilities: vga_controller bus_master cap_list rom
configuration: driver=nouveau latency=0
resources: irq:50 memory:a2000000-a2ffffff memory:90000000-9fffffff memory:a0000000-a1ffffff ioport:2000(size=128) memory:a3000000-a307ffff
*-display
description: VGA compatible controller
product: 2nd Generation Core Processor Family Integrated Graphics Controller
vendor: Intel Corporation
physical id: 2
bus info: pci@0000:00:02.0
version: 09
width: 64 bits
clock: 33MHz
capabilities: vga_controller bus_master cap_list rom
configuration: driver=i915 latency=0
resources: irq:48 memory:a3400000-a37fffff memory:80000000-8fffffff ioport:3000(size=64)
WARNING: output may be incomplete or inaccurate, you should run this program as super-user.
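Assuming the phantom output is the VGA-1-2 head shown by xrandr above, one approach is to switch it off with xrandr and have the display manager run that command before the greeter appears; the exact hook depends on the gdm packaging, so the path below is an assumption to adapt:
xrandr --output VGA-1-2 --off
# to apply it to the greeter as well, put the same command in gdm's session
# startup hook (e.g. /etc/gdm3/Init/Default on Debian/Ubuntu-style packaging)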
busukxuan
(213 rep)
Dec 22, 2014, 05:06 PM
• Last activity: Jan 7, 2020, 06:00 PM
1
votes
1
answers
1263
views
Installing Manjaro on my ASUS TUF GAMING FX504 SERIES
The first problem I ran into is booting from the live image.
But after googling a lot I could figure something out, and now I am booting as follows:
drivers=nonfree
and adding
systemd.mask=mhwd-live.service
to the boot options.
After doing this I can boot into the live image.
But then, if I try to install it, it gets stuck.
So I edit /usr/lib/calamares/modules/mhwdcfg/main.py
From:
def run(self):
for b in self.bus:
for id in self.identifier['net']:
self.configure(b, id)
for id in self.identifier['video']:
self.configure(b, id)
return None
To:
def run(self):
return None
After doing this I am able to install it with the following partitions:
drive 1: ssd
/dev/nvme0n1p3 at /boot/efi with fat32 and 250mb
/dev/nvme0n1p1 at /swap with linuxswap and 8GiB
/dev/nvme0n1p2 at / with ext4 and 111GiB
drive 2: hdd
/dev/sda1 at /home with ntfs and 931.5GiB (I had some errors with ext4...)
So now it is installed.
After the first startup I do an update:
sudo pacman -Syyu
After this is done I reboot; sometimes it gets stuck, but it starts after a hard power off.
Next I tried to install bumblebee, without luck, and had to retry...
So I tried adding this, but I think it is not required; in my newest installation I am not using it. /etc/default/grub:
acpi_osi=! acpi_osi='Windows 2009'
Then I did:
sudo update-grub
and rebooted.
Then I tried to install my dual-graphics nvidia drivers:
sudo mhwd -a pci nonfree 0300
But I am stuck at installing the nvidia bumblebee drivers...
Every time I install them, no matter what, I am stuck at a completely black screen after reboot and I can't switch away with Ctrl+Alt+F1...
With the free drivers it runs properly, but with really bad performance.
I tried one more thing, which is described here:
howto-set-up-prime-with-nvidia
But after I did everything I am stuck at a black screen again, though this time I can access different terminals via Ctrl+Alt+F2...
Zubty
(111 rep)
May 12, 2019, 03:29 PM
• Last activity: Nov 15, 2019, 05:40 PM
19
votes
1
answers
33781
views
Unable to activate HDMI on a laptop (that has Optimus / Bumblebee)
I am trying to use the HDMI output on a PC (HP ZBook) with Debian (stretch). I have configured Bumblebee and it works well (glxinfo and optirun glxinfo report the expected information, and I tested complicated GLSL shaders that also work as expected).
Now I would like to be able to plug a video projector into the HDMI port. I have read here that intel-virtual-output can be used to configure it when the HDMI is connected to the NVidia board (using a VIRTUAL output that can be manipulated by xrandr). However, intel-virtual-output says:
no VIRTUAL outputs on ":0"
When I do xrandr -q, there is no VIRTUAL output listed; I only have:
Screen 0: minimum 320 x 200, current 1920 x 1080, maximum 8192 x 8192
eDP-1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 345mm x 194mm
1920x1080 60.02*+ 59.93
1680x1050 59.95 59.88
1600x1024 60.17
... other video modes ...
400x300 60.32 56.34
320x240 60.05
DP-1 disconnected (normal left inverted right x axis y axis)
HDMI-1 disconnected (normal left inverted right x axis y axis)
DP-2 disconnected (normal left inverted right x axis y axis)
HDMI-2 disconnected (normal left inverted right x axis y axis)
My installed version of xserver-xorg-video-intel is: xserver-xorg-video-intel_2.99.917+git20160706-1_amd64.deb
**Update (Sat. Dec. 09 2016)** I have updated Debian, and now X crashes when the second monitor is active and I start some applications (for instance xemacs). Sat. Dec. 17 2016: Yes, found it! (updated the answer).
**Update (Wed Sep 27 2017)** The method works in 99% of the cases, but last week I tried a beamer that only accepts 50Hz modes, and could not get anything other than 60Hz (so it did not work). Does anybody know how to force 50Hz modes?
**Update (Tue 01 Oct 2019)** Argh! Broken again: after updating X and the NVidia driver, optirun now crashes (/var/log/Xorg.8.log says crash in Xorg, OsLookupColor+0x139). **Update (07 Oct 2019)** Found a temporary fix (updated answer).
https://github.com/Bumblebee-Project/Bumblebee/wiki/Multi-monitor-setup
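For reference, the Bumblebee multi-monitor wiki linked above boils down to letting the secondary X server start without a connected display so that intel-virtual-output can find it; a sketch of the relevant /etc/bumblebee/xorg.conf.nvidia fragment (option names may vary between driver versions):
Section "Device"
    Identifier  "DiscreteNvidia"
    Driver      "nvidia"
    Option      "AllowEmptyInitialConfiguration"
#   Option      "UseDisplayDevice" "none"     # comment this out so the HDMI head is scanned
EndSection
After restarting bumblebeed, running optirun true followed by intel-virtual-output should then expose VIRTUAL outputs on :0.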
BrunoLevy
(551 rep)
Nov 4, 2016, 08:07 PM
• Last activity: Nov 15, 2019, 01:25 PM
1
votes
0
answers
157
views
Laptop with NVidia and Intel GPUs, choose the graphic board connected to a USB-C adapter
I am using a laptop that has both an Intel and an NVidia GPU, with a second monitor plugged into a USB-C adapter and bumblebee enabled. By default, the USB-C adapter connects to the NVidia board, and then I am able to use it with bumblebee + intel-virtual-output.
Now I have noticed that sometimes, randomly, the USB-C adapter connects to the Intel board instead, which I prefer (because it is more direct and the display is more fluid than using intel-virtual-output, which copies the framebuffer in the background). Since it sometimes happens, it must at least be possible. Is there a way of controlling this, that is, specifying which graphics board the USB-C adapter talks to?
Note: if I deactivate the discrete GPU in the BIOS, then the USB-C connects to the integrated Intel board. If the discrete GPU is active, then 9 times out of 10 it is connected to the USB-C, and 1 time out of 10 it is the Intel GPU.
BrunoLevy
(551 rep)
Oct 26, 2019, 04:37 PM
• Last activity: Oct 29, 2019, 03:18 PM
2
votes
1
answers
784
views
Docker with Bumblebee on Fedora
I have a Notebook (Xiaomi Mi Notebook Pro) with a Nvidia MX150. So it's utilizing the Nvidia's Optimus technology. As a distribution, I use Fedora 28.
## Bumblebee
Therefore I installed Bumblebee to take advantage of this technology.
It should be installed correctly since I can start glmark2 via optirun or primusrun. Also, running cat /proc/acpi/bbswitch outputs ON.
So the Nvidia GPU should indeed be running.
## Docker
To install docker, I followed the instructions on https://docs.docker.com/install/linux/docker-ce/fedora/#install-docker-ce
Running docker run hello-world outputs what it should, so docker also works.
## nvidia-docker2
I got nvidia-docker2 installed on Fedora with these commands:
curl -s -L https://nvidia.github.io/nvidia-docker/centos7/nvidia-docker.repo | \
sudo tee /etc/yum.repos.d/nvidia-docker.repo
sudo dnf install nvidia-docker2
sudo pkill -SIGHUP dockerd
## Installed nvidia packages
To check which nvidia packages are installed, I run this command:
rpm -qa '*nvidia*'
Output:
- akmod-nvidia-396.51-1.fc28.x86_64
- nvidia-container-runtime-2.0.0-1.docker18.06.1.x86_64
- nvidia-driver-396.51-1.fc28.x86_64
- kmod-nvidia-4.17.9-200.fc28.x86_64-396.45-1.fc28.x86_64
- kmod-nvidia-4.17.14-202.fc28.x86_64-396.51-1.fc28.x86_64
- nvidia-docker2-2.0.3-1.docker18.06.1.ce.noarch
- nvidia-driver-libs-396.51-1.fc28.x86_64
- nvidia-container-runtime-hook-1.4.0-1.x86_64
- libnvidia-container1-1.0.0-0.1.rc.2.x86_64
- kmod-nvidia-4.17.12-200.fc28.x86_64-396.45-1.fc28.x86_64
- libnvidia-container-tools-1.0.0-0.1.rc.2.x86_64
## Test docker is running with Nvidia GPU
Unfortunately, docker doesn't currently run with the Nvidia GPU:
optirun docker run --runtime=nvidia --rm nvidia/cuda nvidia-smi
I get this error:
docker: Error response from daemon: OCI runtime create failed: container_linux.go:348: starting container process caused "process_linux.go:402: container init caused \"process_linux.go:385: running prestart hook 1 caused \\\"error running hook: exit status 1, stdout: , stderr: exec command: [/usr/bin/nvidia-container-cli --load-kmods configure --ldconfig=@/sbin/ldconfig --device=all --compute --utility --require=cuda>=9.0 --pid=26115 /var/lib/docker/overlay2/c00aa7855e42deee545cb07531a571538e0d051d38f45e36584a1c850dd47680/merged]\\\\nnvidia-container-cli: initialization error: driver error: failed to process request\\\\n\\\"\"": unknown.
## What am I missing?
For now, I am clueless about where the error is. I guess it could be a problem with the CUDA version.
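One thing worth ruling out, sketched here as an assumption rather than a diagnosis: nvidia-container-cli talks to the driver directly and is started by the docker daemon, not by the optirun shell, so the card has to be powered on and the kernel modules loaded when the container starts:
sudo tee /proc/acpi/bbswitch <<< ON        # make sure bbswitch has not powered the MX150 down
sudo modprobe nvidia && sudo modprobe nvidia_uvm
nvidia-container-cli --load-kmods info     # should list the MX150 if the runtime can reach the driver
docker run --runtime=nvidia --rm nvidia/cuda nvidia-smi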
Roman
(133 rep)
Aug 29, 2018, 09:05 AM
• Last activity: Oct 15, 2019, 10:11 AM
Showing page 1 of 20 total questions