
#1 2016-11-28 08:07:40

RhiobeT
Member
Registered: 2016-11-28
Posts: 5

Multi-monitor using a HDMI->VGA adapter

Hello,

I've been using Arch on my laptop for about a year now, and for the first time I can't solve an issue, even after consulting the wiki and some intense googling hmm

I sometimes need to use a VGA video projector, but my laptop only has an HDMI port, so I bought an HDMI -> VGA adapter.
The adapter itself works, but I always run into an issue when I want to use it with my Arch laptop:

My HDMI port is shown as disconnected in xrandr, so I have to manually add the resolution I want (1280x960) before using it:

xrandr --addmode HDMI-2 1280x960
xrandr --output HDMI-2 --mode 1280x960 --right-of eDP-1

But, then, I get the following error:

xrandr: Configure crtc 1 failed
X Error of failed request:  BadMatch (invalid parameter attributes)
  Major opcode of failed request:  140 (RANDR)
  Minor opcode of failed request:  7 (RRSetScreenSize)
  Serial number of failed request:  42
  Current serial number in output stream:  43

However, if I first connect my laptop to a real HDMI display (a TV, for example) and then connect the adapter, I don't get any error and everything works fine out of the box (I can also add my resolution and use it).

So, I guess that something needs to 'activate' my HDMI port before I can use it, but I have no idea what that is or how to do it...
I don't always have an HDMI display at hand, so I'd really like to find another solution to my issue.
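For reference, one documented way to 'activate' a connector that the kernel reports as disconnected is to force its state at boot with the DRM video= kernel parameter. A sketch, under the assumption that the kernel names this port HDMI-A-2 (the kernel's connector names can differ from the HDMI-2 name xrandr shows; ls /sys/class/drm/ lists them):

```
# Appended to the kernel command line in the bootloader entry.
# The trailing 'e' forces the connector to be treated as enabled
# even when no display is detected:
video=HDMI-A-2:1280x960@60e
```

This only changes the kernel's view of the connector; the X server still has to pick the change up on its next probe.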

Can someone explain why this happens, and whether there is something I can do?
Thanks in advance~~

P.S.: Sorry if my English is a little bad, I'm not a native speaker ^^'


#2 2016-11-28 21:46:49

wolfdogg
Member
From: Portland, OR, USA
Registered: 2011-05-21
Posts: 545

Re: Multi-monitor using a HDMI->VGA adapter

This is interesting, in that I think you're onto something when you mention that something might need to be activated first; I was thinking along those lines too.

First, on your suggestion:
1) try lsmod and look for a display-port-related module?

However, I was also thinking:
1) it appears it's simply telling you that you're passing out-of-range parameters for your indicated display settings. So either you may need to enroll the output first, or you could look up a list of other possible parameters and paste a resource URL for it here, so we can look through that list for something that might help initiate it;

Or, what I was actually thinking: what if it just doesn't like the resolution? You could delay setting the resolution until later by putting a sleep in there, or try a lower, safe resolution first and switch afterwards; possibly even 0x0, since it may not be enrolled yet.

If you hack through those few things, maybe we can get a slightly better idea of what's going on here.

Last edited by wolfdogg (2016-11-28 21:51:14)


Node.js, PHP Software Architect and Engineer (Full-Stack/DevOps)
GitHub  | LinkedIn


#3 2016-11-29 14:31:16

neymarsabin
Member
Registered: 2016-05-02
Posts: 12

Re: Multi-monitor using a HDMI->VGA adapter

Could you post your

 xrandr --query 

output before and after using the HDMI->VGA adapter?
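To make the before/after comparison easier to read, each dump can be saved to a file and diffed. A minimal sketch; the heredoc lines below are stand-in samples so the snippet is self-contained, whereas in practice they would be captured with xrandr --query > before.txt (and after.txt):

```shell
# Stand-in samples; in practice: xrandr --query > before.txt / after.txt
cat > before.txt <<'EOF'
HDMI-2 disconnected (normal left inverted right x axis y axis)
EOF
cat > after.txt <<'EOF'
HDMI-2 disconnected 1280x960+1920+0 (normal left inverted right x axis y axis) 0mm x 0mm
EOF
# diff exits with status 1 when the files differ, so mask it for scripting
diff -u before.txt after.txt || true
```

The diff then shows only the lines that actually changed between the two states.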


#4 2016-11-29 15:21:26

Tom B
Member
Registered: 2014-01-15
Posts: 187
Website

Re: Multi-monitor using a HDMI->VGA adapter

Can you post the outputs of:

lspci | grep VGA

Then

glxinfo | grep OpenGL

My hunch is that your laptop has hybrid graphics (see https://wiki.archlinux.org/index.php/hybrid_graphics ) and the HDMI output is wired to the discrete GPU, but you're booting using the processor's integrated GPU. When a powered device is connected via HDMI, the discrete card turns on and becomes available.

If that's the case, there are a few options depending on the hardware. Worst-case scenario, you may need to use vgaswitcheroo to run the discrete card all the time.
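For checking which GPU is currently active under vga_switcheroo, the kernel exposes a debugfs file (readable only as root, and only when a vga_switcheroo handler is active); a small sketch with a fallback for systems where it isn't available:

```shell
# IGD = integrated GPU, DIS = discrete GPU; the '+' marks the active one
SWITCH=/sys/kernel/debug/vgaswitcheroo/switch
if [ -r "$SWITCH" ]; then
    cat "$SWITCH"
else
    echo "vga_switcheroo not active (or debugfs not readable)"
fi
```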

Last edited by Tom B (2016-11-29 15:22:01)


#5 2016-11-29 23:21:12

RhiobeT
Member
Registered: 2016-11-28
Posts: 5

Re: Multi-monitor using a HDMI->VGA adapter

Thank you guys for the answers!

Following your advice, I tried a bunch of things, and here are the results:

First of all, I noticed thanks to lsmod that after I plug in an HDMI display, nvidia_drm becomes used by one more module (that doesn't seem to be random, since it happened every time I checked).
That could be a lead, but I don't know how, or even if, I can get more information on this.

I also tried a bunch of different parameters (even a resolution of 8x1), and nothing worked.


Here are the different outputs of xrandr --query, but I don't think there is much to see:
Whether I connect the adapter or not, HDMI-2 is shown as disconnected.

Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 16384 x 16384
eDP-1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 193mm
   1920x1080     60.00*+
   1400x1050     59.98  
   1280x1024     60.02  
   1280x960      60.00  
   1024x768      60.04    60.00  
   960x720       60.00  
   928x696       60.05  
   896x672       60.01  
   800x600       60.00    60.32    56.25  
   700x525       59.98  
   640x512       60.02  
   640x480       60.00    59.94  
   512x384       60.00  
   400x300       60.32    56.34  
   320x240       60.05  
DP-1 disconnected (normal left inverted right x axis y axis)
HDMI-1 disconnected (normal left inverted right x axis y axis)
HDMI-2 disconnected (normal left inverted right x axis y axis)
DP-2 disconnected (normal left inverted right x axis y axis)
HDMI-3 disconnected (normal left inverted right x axis y axis)

And even when the adapter is working, HDMI-2 is still shown as disconnected

Screen 0: minimum 8 x 8, current 3200 x 1080, maximum 16384 x 16384
eDP-1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 193mm
   1920x1080     60.00*+
   1400x1050     59.98  
   1280x1024     60.02  
   1280x960      60.00  
   1024x768      60.04    60.00  
   960x720       60.00  
   928x696       60.05  
   896x672       60.01  
   800x600       60.00    60.32    56.25  
   700x525       59.98  
   640x512       60.02  
   640x480       60.00    59.94  
   512x384       60.00  
   400x300       60.32    56.34  
   320x240       60.05  
DP-1 disconnected (normal left inverted right x axis y axis)
HDMI-1 disconnected (normal left inverted right x axis y axis)
HDMI-2 disconnected 1280x960+1920+0 (normal left inverted right x axis y axis) 0mm x 0mm
   1280x960      60.00* 
DP-2 disconnected (normal left inverted right x axis y axis)
HDMI-3 disconnected (normal left inverted right x axis y axis)

Also, it's true that my laptop has hybrid graphics, and in the past I had a hard time using my HDMI port because I was using Bumblebee (even "real" HDMI displays didn't work, and I needed to do some voodoo stuff to use them).
But now I have switched to the full proprietary NVIDIA drivers, so that shouldn't be an issue anymore.

Still, here are the outputs (in case I overlooked something):

00:02.0 VGA compatible controller: Intel Corporation HD Graphics 530 (rev 06)

(That's probably not relevant, since I don't have any VGA port on my laptop.)

OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GTX 960M/PCIe/SSE2
OpenGL core profile version string: 4.5.0 NVIDIA 375.20
OpenGL core profile shading language version string: 4.50 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 4.5.0 NVIDIA 375.20
OpenGL shading language version string: 4.50 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 375.20
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
OpenGL ES profile extensions:

Finally, I noticed that just plugging in an HDMI display wasn't enough to make my adapter work: I also need to run xrandr (without any arguments) while the display is plugged in.
So now I'm wondering whether the issue could come from xrandr itself (does it have any side effects when used without arguments? Like updating something somewhere?)
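For what it's worth, a bare xrandr call does have a side effect: querying the screen resources forces the X server to re-probe its outputs, which is likely what makes the adapter usable afterwards. A hypothetical helper script built on that behaviour (the output and mode names are the ones from this thread, and it skips gracefully when no X session is available):

```shell
#!/bin/sh
# Hypothetical helper based on the behaviour described in this thread:
# re-probe the outputs, then add and enable the projector mode.
bring_up_projector() {
    OUTPUT=HDMI-2
    MODE=1280x960
    if [ -z "${DISPLAY:-}" ] || ! command -v xrandr >/dev/null 2>&1; then
        echo "no X display available; skipping"
        return 0
    fi
    xrandr >/dev/null    # a bare query makes the server re-poll its outputs
    xrandr --addmode "$OUTPUT" "$MODE"
    xrandr --output "$OUTPUT" --mode "$MODE" --right-of eDP-1
}

bring_up_projector
```

This is only a sketch of the workaround, not a fix for the underlying detection problem.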


#6 2016-11-30 18:54:38

Tom B
Member
Registered: 2014-01-15
Posts: 187
Website

Re: Multi-monitor using a HDMI->VGA adapter

Ah, it's not hybrid graphics that's the issue then. What if you try setting the resolution?

xrandr --output HDMI-2 --mode 1280x960 --right-of eDP-1

edit: never mind, you already tried this in the first post!

edit2: I have a similar laptop with hybrid NVIDIA/Intel graphics and never tried HDMI -> VGA, but DisplayPort -> VGA worked flawlessly. Not an ideal fix, but it would be interesting to know whether DP behaves any differently from HDMI.

Last edited by Tom B (2016-11-30 20:18:39)

