2 GPUs: how to choose one?


#1

I managed to do a fresh install of Trident RELEASE into a ZFS partition. I was amazed at how flawlessly it went - it even recognized both of my GPUs: Intel and NVIDIA!

However, I have issues with display artifacts when using Java applications.

I recall having seen these artifacts before. In previous distributions I had to prefer the Intel GPU, and there was an option to select which GPU to use at startup. I cannot find such an option in Trident.

Is there a way to select which GPU to use - or to disable one of them?


#2
Post your /usr/local/etc/X11/xorg.conf
and the output of "about".
John
groenveld@acm.org

#3
Section "ServerLayout"
        Identifier     "XFree86 Configured"
        Screen      0  "Screen0" 0 0
EndSection

Section "Screen"
        Identifier "Screen0"
        Device     "Card0"
        Monitor    "Monitor0"
EndSection

  
Section "Device"
  Identifier      "Card0"
  Driver          "modesetting"
  BusID           "0:2:0"
  Option   "AccelMethod"   "none"
EndSection

Section "Device"
  Identifier      "Card1"
  Driver          "nvidia"
  BusID           "1:0:0"
  
EndSection

Project Trident Information
--------------------------------------
OS Version: 18.12-RELEASE
Build Date: 20190113173435
--------------------------------------
TrueOS Sources Used:
  Base Repository: https://github.com/trueos/trueos
  Base Commit Branch or Tag: trueos-stable-18.12
  Ports Repository: https://github.com/trueos/trueos-ports
  Ports Commit Branch or Tag: b42445dec15979dabc50c64073c0fe08dd870a9a
--------------------------------------
System Specs:
  Boot Method: UEFI
  Intel(R) Core(TM) i5-7200U CPU @ 2.50GHz
  Physical CPUs: 4
  Physical Memory: 7.87 GB
--------------------------------------
GPU Information:
  vgapci0@pci0:0:2:0:   class=0x030000 card=0xc784144d chip=0x59168086 rev=0x02 hdr=0x00
    vendor     = 'Intel Corporation'
    device     = 'HD Graphics 620'
    class      = display
    subclass   = VGA
  vgapci1@pci0:1:0:0:   class=0x030200 card=0xc784144d chip=0x134f10de rev=0xa2 hdr=0x00
    vendor     = 'NVIDIA Corporation'
    device     = 'GM108M [GeForce 920MX]'
    class      = display
    subclass   = 3D
--------------------------------------
Network Device Information
  re0@pci0:2:0:0:       class=0x020000 card=0xc784144d chip=0x813610ec rev=0x07 hdr=0x00
    vendor     = 'Realtek Semiconductor Co., Ltd.'
    device     = 'RTL810xE PCI Express Fast Ethernet controller'
    class      = network
    subclass   = ethernet

#4
X is using your Intel VGA.
You might test with Option "AccelMethod" "none" commented out.
And you might also test with Driver "intel":

# cp /usr/local/etc/X11/xorg.conf /etc/X11
# pkg install xf86-video-intel
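If it helps, with xf86-video-intel installed the Device section would look roughly like this (a sketch; the BusID is the one from the pciconf output above, and the AccelMethod option is simply dropped):

```
Section "Device"
  Identifier      "Card0"
  Driver          "intel"      # from the xf86-video-intel package, instead of "modesetting"
  BusID           "0:2:0"      # Intel HD Graphics 620, per the GPU information above
EndSection
```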

Happy testing and report back!
John
groenveld@acm.org

#5

I tested all those options, and the results are always the same: artifacts in Java GUIs…

I removed AccelMethod…
I installed xf86-video-intel…
I changed the driver to "intel"…
I reverted the changes and tried it all again…
Rebooting each time.

Also, I tried commenting out kldload_nvidia="nvidia-modeset nvidia" in /etc/rc.conf and adding kld_list="/boot/modules/i915kms.ko", but it yielded error messages on boot.
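For clarity, the /etc/rc.conf change I tried looked like this (a sketch of what I attempted; with nvidia-driver still installed it produced the boot errors mentioned):

```
# /etc/rc.conf
#kldload_nvidia="nvidia-modeset nvidia"    # NVIDIA modules, commented out
kld_list="/boot/modules/i915kms.ko"        # load the Intel DRM/KMS module instead
```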

I even pointed the Screen section at Device "Card1" to see what would happen, and I got a black screen with a stuck mouse cursor.

So I'm back to the beginning, except that I removed the AccelMethod line.

For your information, glxinfo is printing:

Xlib:  extension "GLX" missing on display ":0".

Maybe that helps.

Thank you very much.


#6

Well… it seems I can be sure I'm using the Intel video card, but I'm not sure it's running optimally. Is acceleration working? Should glxinfo work? xscreensaver is missing GLX as well.
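For reference, here is the kind of check I would expect to work once GLX is available (the printf below only simulates healthy output so the check itself can be seen; the renderer string is an example, not my exact hardware):

```shell
# Once GLX works, acceleration can be confirmed with (glxinfo ships with mesa-demos):
#   glxinfo | grep -E "direct rendering|OpenGL renderer"
# Simulated healthy output for an Intel GPU (example values only):
printf 'direct rendering: Yes\nOpenGL renderer string: Mesa DRI Intel(R) HD Graphics 620\n' |
  grep -E "direct rendering|OpenGL renderer"
```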

So, I’d like to know what’s missing in order to get the best out of this Intel video card…


#7
# pkg delete nvidia-driver
# shutdown -r now

At first glance it appears that Trident loads the nvidia drivers regardless
of your /etc/X11/xorg.conf Card device selection, which causes OpenGL
to try to use the nVidia GLX.

John
groenveld@acm.org

#8

It shouldn't, since there are 3 different nvidia drivers - unless he "selected" them at boot time.


#9
Thanks Rod, yes, you're correct.
I was force loading the nvidia and nvidia-modeset drivers
in my rc.conf.
John
groenveld@acm.org

#10

After removing nvidia-driver, it's finally loading the intel driver, with acceleration - though it's still not fully optimized.

I disabled kldload_nvidia="nvidia-modeset nvidia" and enabled kld_list="/boot/modules/i915kms.ko" in /etc/rc.conf.

I've tried both the modesetting and intel drivers. No matter what, running glxgears as an ordinary user gives me this:

libGL error: failed to open drm device: Permission denied
libGL error: failed to load driver: i965
9673 frames in 5.0 seconds = 1934.509 FPS

But as superuser, it’s working optimally!

Running synchronized to the vertical refresh.  The framerate should be
approximately the same as the monitor refresh rate.
302 frames in 5.0 seconds = 60.339 FPS

So, there’s one setting still missing. Permissions? Can somebody help me, please?


#11
# pw groupmod video -m sergio
<URL:https://github.com/project-trident/trident-installer/issues/44>
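After logging in again, membership can be verified with id -Gn. A small sketch of that check (the group list here is hard-coded for illustration; in practice you would pass $(id -Gn)):

```shell
# Return success if the first argument appears among the remaining ones,
# mimicking a check like: in_group video $(id -Gn)
in_group() {
  needle=$1; shift
  for g in "$@"; do
    [ "$g" = "$needle" ] && return 0
  done
  return 1
}

# Hard-coded example list standing in for the real output of `id -Gn`:
if in_group video wheel operator video; then
  echo "video: yes"     # user may open the /dev/dri render nodes
else
  echo "video: no"      # fix with: pw groupmod video -m <user>
fi
# → video: yes
```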


John
groenveld@acm.org

#12

Thank you. It's working now, using the 3D-accelerated Intel video card.

Summing up: I removed nvidia-driver and installed xf86-video-intel. I removed the related references from /etc/rc.conf (it seems they are not needed) and removed Option "AccelMethod" "none" from /etc/X11/xorg.conf. It works with both Driver settings, "intel" and "modesetting". glxgears runs optimally.

The issue with Java GUIs is reported elsewhere and is not related to Trident:

https://bugs.freebsd.org/bugzilla/show_bug.cgi?id=234594

Thanks!


#13

One piece of information and one question:

  1. I had to choose the "intel" driver, which works better. "modesetting" displays artifacts in video players and Java GUIs.

  2. Now that I can use my Intel GPU just fine, is there a way to set my Trident installation to use the NVIDIA GPU instead, if I want to? Or should I rather forget about it?


#14
ISTR success stories using the nVidia GPU with Optimus
but only when the BIOS provided an option to set the 
default GPU.
John
groenveld@acm.org
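For completeness, if a BIOS did allow the nVidia GPU to drive the display, selecting it would just mean pointing the Screen section at the Card1 device already defined earlier in this thread (a sketch, untested; on this Optimus setup it produced a black screen):

```
Section "Screen"
        Identifier "Screen0"
        Device     "Card1"      # nvidia driver, BusID "1:0:0"
        Monitor    "Monitor0"
EndSection
```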