
#1 2020-07-18 11:15:12

k1-801
Member
Registered: 2016-03-22
Posts: 7

Using HDMI output on hybrid graphics AMD+nvidia laptop

Hello.
I have an ASUS TUF FX505DU laptop with a weird hybrid graphics configuration: an AMD Vega integrated into the CPU as the iGPU and an NVIDIA GTX 1660 Ti as the dGPU. As far as I know, the laptop's screen is wired to the iGPU and the HDMI port is wired to the NVIDIA card. I'm trying to make the HDMI port work. Currently, if I plug a TV into it, the output is recognized correctly by xrandr and KDE's system settings app, and I can change the resolution (the TV reports the change), but no matter what I do, it stays a black screen. I did use PRIME to run some applications on the dGPU and they work fine, so at least the NVIDIA card itself must be working.
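(For reference, this is roughly how I run things on the dGPU; the two environment variables are the standard NVIDIA render offload ones, so treat the exact command as an illustrative sketch rather than my literal setup:)

$ __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"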

$ xrandr
Screen 0: minimum 320 x 200, current 1920 x 1080, maximum 16384 x 16384
eDP connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 194mm
   1920x1080     59.93*+  39.99  
   1680x1050     59.93  
   1280x1024     59.93  
   1440x900      59.93  
   1280x800      59.93  
   1280x720      59.93  
   1024x768      59.93  
   800x600       59.93  
   640x480       59.93  
HDMI-1-0 connected 1920x1080+0+0 0mm x 0mm
   1366x768      59.96 +
   1920x1080     59.94*   50.00    23.98  
   1280x768      59.99  
   1280x720      60.00    59.94    50.00  
   1024x768      60.00  
   800x600       60.32    56.25  
   720x576       50.00  
   720x480       59.94  
   640x480       59.94    59.93

Also, when I set the TV's resolution to 640x480 or 800x600, sddm crashes and resets the session.

$ xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x54 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 4 outputs: 1 associated providers: 1 name:Unknown AMD Radeon GPU @ pci:0000:05:00.0
Provider 1: id: 0x1f7 cap: 0x2, Sink Output crtcs: 4 outputs: 1 associated providers: 1 name:NVIDIA-G0

About a month ago, when I started my attempts, this command did NOT show the NVIDIA provider's single output; its output count was 0, making the total number of outputs 1, but my laptop must have at least two (its own screen and the HDMI port). It was around the same time I noticed that bumblebee stopped working, so I had to switch to PRIME. These two events might be unrelated, as I never attempted to use the HDMI output prior to the update that broke bumblebee.
Side note: I tried checking providers from a live Ubuntu back then, just to see if it detects the second output, and it did, but the TV screen behaved weirdly: there was a working mouse cursor, but the screen behind it was filled with static glitches that stayed in the same pattern until the resolution was changed, did not react to dragging a window into that area, etc. I will not be digging into that here unless it's actually a hardware problem.
About a week ago, when I updated Arch, the second output showed up, and my TV at least started reacting to the cable being connected, but it stays black, as mentioned above.

To make things worse, this article https://us.download.nvidia.com/XFree86/ … ndr14.html claims that

The NVIDIA driver currently cannot be used as an output sink when the output source driver is xf86-video-amdgpu.

If I am reading that correctly, it means that I can't do "Reverse PRIME" on my hardware.
I am fine with always running the dGPU and having the session always on it, but I don't know how to achieve that. There is another topic on switching the primary GPU, but the poster there seems to have no luck either: https://bbs.archlinux.org/viewtopic.php?id=255980.

My questions are:
Is there a way to have a working HDMI port on my laptop? If so, how do I achieve it? Will it require the X session to always run on the NVIDIA card? If so, how do I set that up?

I also have to mention that I am bad with Xorg configs and currently there are no custom config files for Xorg on my machine.
The system is up to date and I will provide any additional info about my machine if needed.

Last edited by k1-801 (2020-07-18 11:29:37)


#2 2020-07-18 12:31:10

Lone_Wolf
Member
From: Netherlands, Europe
Registered: 2005-10-04
Posts: 11,911

Re: Using HDMI output on hybrid graphics AMD+nvidia laptop

While not ideal, AMD cards can work* with the X modesetting driver.

It comes with the xorg-server package, so install that if you don't have it.
Then remove xf86-video-amdgpu and restart X.
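Something along these lines (standard repo package names; double-check what is actually installed before removing anything):

# modesetting is part of xorg-server
sudo pacman -S --needed xorg-server
# drop the amdgpu DDX so X falls back to modesetting for the AMD card
sudo pacman -R xf86-video-amdgpu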

Reverse Prime should now work.

If that fails, a custom config file to make the nvidia card the primary one would probably be the best option.

Post lspci -k, the Xorg log from the current setup and the one from the setup with modesetting for the AMD card.
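For example (ix.io uploads like the ones commonly used on this board; the log path depends on whether X runs as root or as your user):

lspci -k | curl -F 'f:1=<-' ix.io
cat /var/log/Xorg.0.log | curl -F 'f:1=<-' ix.io    # or ~/.local/share/xorg/Xorg.0.log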





* not all features will be supported and performance of the AMD card may not be as good as with xf86-video-amdgpu


Disliking systemd intensely, but not satisfied with alternatives so focusing on taming systemd.


(A works at time B)  && (time C > time B ) ≠  (A works at time C)


#3 2020-07-18 17:06:41

k1-801
Member
Registered: 2016-03-22
Posts: 7

Re: Using HDMI output on hybrid graphics AMD+nvidia laptop

Lone_Wolf wrote:

Reverse Prime should now work.
If that fails, a custom config file to make the nvidia card the primary one would probably be the best option.

Done. xrandr shows it switched to modesetting, but the TV still has a black screen, so I have to assume it either didn't work or there is something else I need to do to see the results.
lspci -k: http://ix.io/2rYU
Xorg.0.log with amdgpu: http://ix.io/2rYW
Xorg.0.log with modesetting: http://ix.io/2rYV

As I assume it didn't work, I am putting the amdgpu driver back and awaiting further instructions on how to put the NVIDIA card in charge.
I forgot to mention that I use the proprietary nvidia driver because my card was listed as completely unsupported by nouveau.

Last edited by k1-801 (2020-07-27 10:36:26)


#4 2020-07-19 13:30:00

Lone_Wolf
Member
From: Netherlands, Europe
Registered: 2005-10-04
Posts: 11,911

Re: Using HDMI output on hybrid graphics AMD+nvidia laptop

Both Xorg logs show messages like [1] that make me wonder if something else may be causing the issue.

HDMI screens tend to switch to standby when they don't get a signal fast enough, and video cards sometimes have trouble waking them up.
Does it make a difference if you keep the TV powered off until X is up and running, then power on the TV?
Is the nvidia card/driver able to get an EDID this way?
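One way to check, as a sketch (output names and /sys paths differ per machine, and the /sys nodes only exist for outputs handled by a kernel KMS driver):

# per-output properties, including the EDID as X sees it
xrandr --verbose | grep -A9 EDID
# raw EDID nodes exposed by the kernel
ls -l /sys/class/drm/*/edid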



Here's a config file for /etc/X11/xorg.conf.d/10-nvidia-primary-gpu.conf . It's adapted from https://us.download.nvidia.com/XFree86/ … ndr14.html .
Keep in mind that this file is completely untested and may break X, potentially requiring you to boot to a console or from an installation disc/USB.

Get familiar with https://wiki.archlinux.org/index.php/Sy … ent_target and https://wiki.archlinux.org/index.php/Chroot
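If X does break, recovery is roughly this (the kernel parameter comes from the systemd targets page above; the file name is just the one suggested for this config):

# at the bootloader, append to the kernel command line:
#   systemd.unit=multi-user.target
# then log in on the console and remove the config again:
rm /etc/X11/xorg.conf.d/10-nvidia-primary-gpu.conf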


Section "ServerLayout"
    Identifier "layout"
    Screen 0 "nvidia"
    Inactive "amd"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    BusID "1@0:0:0"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
    Option "AllowEmptyInitialConfiguration"
EndSection

Section "Device"
    Identifier "amd"
    Driver "modesetting"
EndSection

Section "Screen"
    Identifier "amd"
    Device "amd"
EndSection

[1]

[  7570.327] (--) NVIDIA(GPU-0): HKC-TV (DFP-0): connected
[  7570.327] (--) NVIDIA(GPU-0): HKC-TV (DFP-0): Internal TMDS
[  7570.327] (--) NVIDIA(GPU-0): HKC-TV (DFP-0): 600.0 MHz maximum pixel clock
[  7570.327] (--) NVIDIA(GPU-0): 
[  7570.350] (II) NVIDIA(G0): Validated MetaModes:
[  7570.350] (II) NVIDIA(G0):     "NULL"
[  7570.350] (II) NVIDIA(G0): Virtual screen size determined to be 640 x 480
[  7570.391] (WW) NVIDIA(G0): HKC-TV (DFP-0) does not have an EDID, or its EDID does not
[  7570.391] (WW) NVIDIA(G0):     contain a maximum image size; cannot compute DPI from
[  7570.391] (WW) NVIDIA(G0):     HKC-TV (DFP-0)'s EDID.

Disliking systemd intensely, but not satisfied with alternatives so focusing on taming systemd.


(A works at time B)  && (time C > time B ) ≠  (A works at time C)


#5 2020-07-27 10:36:05

k1-801
Member
Registered: 2016-03-22
Posts: 7

Re: Using HDMI output on hybrid graphics AMD+nvidia laptop

Lone_Wolf wrote:

Here's a config file for /etc/X11/xorg.conf.d/10-nvidia-primary-gpu.conf . It's adapted from https://us.download.nvidia.com/XFree86/ … ndr14.html .
Keep in mind that this file is completely untested and may break X, potentially requiring you to boot to a console or from an installation disc/USB.

Long story short: nothing's changed.
As far as I understand, it should have changed the provider name in "xrandr --listproviders" to "amd" (which it didn't) and its driver to "modesetting" (which it didn't). glxinfo also still reports the OpenGL renderer as AMD RAVEN:

$ xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x54 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 4 outputs: 1 associated providers: 1 name:Unknown AMD Radeon GPU @ pci:0000:05:00.0
Provider 1: id: 0x1f7 cap: 0x2, Sink Output crtcs: 4 outputs: 1 associated providers: 1 name:NVIDIA-G0

It just stays the same.

Will update the post with Xorg logs soon enough.

Last edited by k1-801 (2020-07-27 22:58:21)


#6 2020-07-27 13:47:59

V1del
Forum Moderator
Registered: 2012-10-16
Posts: 21,645

Re: Using HDMI output on hybrid graphics AMD+nvidia laptop

Not really. The identifier is completely irrelevant outside of the Xorg configuration, and the modesetting driver might be listed differently (depending on which "driver" you are talking about: lspci will always list amdgpu; the kernel driver and the Xorg driver in use are two distinct and unrelated concepts). What does your --listproviders show with the modesetting driver / the configuration above?

Given the providers output in your first post,

xrandr --setprovideroutputsource 1 0
xrandr --auto

in some early initialization file like .xinitrc should still have the desired effect regardless.
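i.e. something along these lines (provider numbers taken from that --listproviders output; the session command is only an example):

# ~/.xinitrc (sketch)
xrandr --setprovideroutputsource 1 0    # the NVIDIA sink (provider 1) takes its images from the AMD source (provider 0)
xrandr --auto
exec startplasma-x11                    # or whatever normally starts your session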


#7 2020-07-27 23:02:27

k1-801
Member
Registered: 2016-03-22
Posts: 7

Re: Using HDMI output on hybrid graphics AMD+nvidia laptop

I just realized how dumb I am. I apologize for providing the wrong info; I had accidentally put my Xorg config file in the wrong location.
The config above works. It did make the NVIDIA card primary. However, there are several smaller issues that I am sure are easy enough to solve.
1) Incorrect DPI (it must have taken the TV's DPI, but I'm not sure), which leads to all KDE applications having super-tiny fonts. Temporarily set the DPI to 96 in the KDE config (see the sketch after this list); that seems to work, but there might be other side effects I haven't found yet (GTK applications?).
2) Black laptop screen on the SDDM login screen. SDDM is only displayed on the TV now, even if it is not connected. This is the one I don't know how to solve and the one I need help with.
3) After a reboot kwin crashes when set to use OpenGL and reports that compositing doesn't work anymore, and the two leftmost columns of the systemsettings5 window (out of 3) are now black. KDE struggles to start and requires a manual plasmashell restart. This could be caused by the most recent update or by the Xorg config, or it could be unrelated (it seemed to work fine the first time, before the update and reboot).
UPD: A follow-up update fixed the black columns in systemsettings5 and KDE now starts the session properly with the XRender backend. Setting the backend to OpenGL still crashes kwin.
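(The DPI workaround, as a sketch: 96 is just the value I picked, and as far as I know the nvidia driver also accepts an Option "DPI" in its Screen section that should do the same thing.)

xrandr --dpi 96    # force the session DPI; the KDE font-DPI setting has roughly the same effect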

Using AMD as the primary card fixes everything but the HDMI port; its output is still black, with or without the config and commands. I think there might still be a solution somewhere to make this work as well.

Last edited by k1-801 (2020-07-30 11:44:48)


#8 2020-09-07 08:42:31

Awsim
Member
Registered: 2020-09-07
Posts: 3

Re: Using HDMI output on hybrid graphics AMD+nvidia laptop

Hello, I don't know if you have solved this already, but if you haven't, I might have a solution for you. I'm running a similar combo to your laptop: an Acer Nitro AN515-44-R6ZW with an AMD Ryzen 5 4600H and its onboard AMD Renoir graphics combined with an NVIDIA GTX 1650 Ti dGPU.

The HDMI port is wired to the NVIDIA GPU on my laptop, and it seems like it is on yours too, as you have mentioned.

To solve the HDMI black screen problem I followed this section of the Arch Wiki to start X with the NVIDIA GPU only. The wiki also suggests enabling NVIDIA DRM kernel mode setting to fix screen tearing; I haven't experienced any tearing on my own dwm build, so I didn't enable it, but this might be different for other setups, so keep that in mind.
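If you do want the kernel mode setting part, it is roughly this (the file name is arbitrary; the option itself is the usual nvidia-drm one):

# /etc/modprobe.d/nvidia-drm.conf
options nvidia_drm modeset=1
# alternatively, add nvidia-drm.modeset=1 to the kernel command line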

So since this will start X using the NVIDIA GPU, you might not want to use it all the time, to save power. I suggest adding these aliases to your .bashrc to enable or disable the xorg.conf file you have created:

alias nvidia-enable="sudo mv /etc/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf.dis /etc/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf"
alias nvidia-disable="sudo mv /etc/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf /etc/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf.dis"

I manually edit my .xinitrc to comment the added lines out. You might come up with a better way, but this is what I use to keep things a little simpler.
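The lines I toggle are basically the ones from that wiki section (provider names as the proprietary driver reports them here; check xrandr --listproviders on your side, they may differ):

xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto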

From this point on you can use xrandr to manage your displays (or arandr if you want something graphical).

Last edited by Awsim (2020-09-07 08:50:44)

