I got a new monitor recently and noticed that its display size isn't being detected properly. With the desktop up, this is what I get:
$ xdpyinfo | grep -B2 resolution
dimensions: 1920x1080 pixels (508x286 millimeters)
resolution: 96x96 dots per inch
This is a 21.5" monitor, and that information works out to a 22.95" monitor, so I know it's not right. The correct measurements should be 480x270 millimeters. So I put this configuration in the xorg.conf.d directory (as described in the wiki):
$ cat 90-monitor-external-disp-size.conf
Section "Monitor"
Identifier "VGA-0"
ModelName "SA300/SA350"
DisplaySize 480 270
EndSection
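As a quick sanity check on those numbers (just arithmetic, nothing X-specific), converting the reported 508x286 mm into a diagonal confirms the mismatch:

```shell
#!/bin/sh
# Convert the millimeter dimensions xdpyinfo reported into a diagonal
# in inches; 508x286 mm comes out near 23", not the panel's 21.5".
awk 'BEGIN {
    w = 508; h = 286                       # mm, as reported by xdpyinfo
    printf "diagonal: %.2f in\n", sqrt(w*w + h*h) / 25.4
}'
```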
I added ModelName because this is an external monitor, so I hoped it would uniquely identify it. However, after restarting the Xorg server several times, no luck.
The monitor gets set up by the Xorg server's automatic detection, and then I believe the settings are handled by Gnome 3's Display control panel. I have it set in the control panel so that the laptop monitor is turned off when the desktop loads and the external monitor is activated. What I found interesting here is that the detected DPI is 96. If I remember right (and this is probably just a guess), Gnome 2 set the DPI to 96 by default, ignoring the Xorg server settings and thereby skewing the DisplaySize dimensions.
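If that guess is right, the bogus size follows directly: at an assumed 96 DPI the server back-computes the physical dimensions from the resolution, which lands exactly on the numbers above:

```shell
#!/bin/sh
# If the server assumes 96 DPI, the physical size is back-computed as
# pixels / 96 * 25.4 mm -- which reproduces the bogus 508x286 mm exactly.
awk 'BEGIN {
    printf "%.0f x %.0f mm\n", 1920 / 96 * 25.4, 1080 / 96 * 25.4
}'
```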
Anybody know anything about this?
Last edited by Gen2ly (2011-12-23 07:22:13)
Setting Up a Scripting Environment | Proud donor to wikipedia - link
Offline
What's the output of 'xrandr --prop'?
You can keep an eye on a similar thread https://bbs.archlinux.org/viewtopic.php?id=132192 but about the DPI.
Offline
What's the output of 'xrandr --prop'?
Thanks karol, here it is:
$ xrandr --prop
Screen 0: minimum 320 x 200, current 1920 x 1080, maximum 4096 x 4096
VGA-0 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 477mm x 268mm
EDID:
00ffffffffffff004c2dd1073831335a
2a1501030e301b782a90c1a259559c27
0e5054bfef80714f8100814081809500
a940b300950f023a801871382d40582c
4500dd0c1100001e000000fd00384b1e
5111000a202020202020000000fc0053
413330302f53413335300a20000000ff
0048434c424130323937370a2020007c
load detection: 1 (0x00000001) range: (0,1)
1920x1080 60.0*+
1600x1200 60.0
1680x1050 60.0
1280x1024 75.0 60.0
1440x900 75.0 59.9
1280x960 60.0
1280x800 59.8
1152x864 75.0
1024x768 75.1 70.1 60.0
832x624 74.6
800x600 72.2 75.0 60.3 56.2
640x480 72.8 75.0 66.7 60.0
720x400 70.1
LVDS connected (normal left inverted right x axis y axis)
EDID:
00ffffffffffff0006af060fe2000000
2c0d0102801e17780a12e59152528927
24505400000001010101010101010101
01010101010164190040410026301888
360030e6100000180000000f00067708
ff010f052e2dff053f01000000fe0041
554f0a202020202020202020000000fe
00423135305847303156320a202000cf
scaling mode: Full
supported: None Full Center Full aspect
1024x768 60.0 +
800x600 59.9
848x480 59.7
720x480 59.7
640x480 59.4
S-video disconnected (normal left inverted right x axis y axis)
tv standard: ntsc
supported: ntsc pal pal-m pal-60
ntsc-j scart-pal pal-cn secam
load detection: 1 (0x00000001) range: (0,1)
You can keep an eye on a similar thread https://bbs.archlinux.org/viewtopic.php?id=132192 but about the DPI.
Ah, hadn't seen that. Thank you.
Offline
xdpyinfo & xrandr give the same values on my monitor so something is not right in your case.
From the link cybertorture posted in the other thread:
Display size: 18.74" × 10.54" (47.6cm × 26.77cm) = 102.46 PPI, 0.2479mm dot pitch, 10498 PPI²
Seems that you should use 102 DPI <shrugs>
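Working backwards from the EDID size xrandr reported (477x268 mm) gives roughly the same figure, so the ~102 DPI suggestion checks out:

```shell
#!/bin/sh
# DPI implied by the EDID-reported panel size: pixels / (mm / 25.4).
awk 'BEGIN {
    printf "%.2f x %.2f dpi\n", 1920 / (477 / 25.4), 1080 / (268 / 25.4)
}'
```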
Offline
Yeah, what to do, what to do. Well, I did discover that I can set the DPI via xrandr, which in turn gives the correct display size! So it's a start. I tried creating /etc/X11/Xsession.d/45custom-xrandr-settings to have it set automatically:
# 45custom-xrandr-settings - sets the external monitor on login if connected,
# including the DPI setting (Gnome hands the Xorg server 96 DPI by default?)
# Laptop monitor and external monitor detected names
ext_monitor="VGA-0" # Samsung SA350 1920x1080
int_monitor="LVDS" # Pavilion ze5570 1280x1024
# Enable the external monitor and disable the laptop monitor if the
# external monitor is connected; otherwise just set the laptop monitor
if xrandr -q | grep "$ext_monitor" | grep -q " connected "; then
    xrandr --output "$int_monitor" --off --output "$ext_monitor" --auto --dpi "102.46"
else
    xrandr --output "$int_monitor" --auto --output "$ext_monitor" --off
fi
But no luck. Will have to think about this some more.
Offline
I decided to try and provide a more thorough xorg.conf.d configuration:
$ cat 90-monitor-disp-size.conf
Section "Monitor"
Identifier "Internal - Pavilion Laptop"
DisplaySize 304.5 228.6
EndSection
Section "Monitor"
Identifier "External - Samsung Syncmaster SA350"
#ModelName "SA300/SA350"
DisplaySize 476 267.7
EndSection
Section "Device"
Identifier "ATi Radeon Mobility IGP 330M"
Option "Monitor-VGA-0" "External - Samsung Syncmaster SA350"
Option "Monitor-LVDS" "Internal - Pavilion Laptop"
EndSection
Section "Screen"
Identifier "Default Screen"
Monitor "Internal - Pavilion Laptop"
EndSection
Section "ServerLayout"
Identifier "Default Layout"
Screen "Default Screen"
EndSection
Unfortunately, no luck. The log shows that it's using that section though:
$ grep External /var/log/Xorg.0.log
[ 9342.472] (II) RADEON(0): Output VGA-0 using monitor section External - Samsung Syncmaster SA350
So at this point I'm not sure what to think; it's still detecting the wrong display size.
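One more thing worth checking in that same log is the DPI line the server prints after it picks a size. The sample line piped in below is fabricated for illustration; on the real machine you'd grep /var/log/Xorg.0.log directly:

```shell
#!/bin/sh
# On the real machine: grep -i "DPI set to" /var/log/Xorg.0.log
# The echoed line is a made-up sample of the server's usual message; if
# DisplaySize were being honored it should read ~102, not 96.
echo '[  9342.480] (--) RADEON(0): DPI set to (96, 96)' | grep -i "DPI set to"
```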
Last edited by Gen2ly (2011-12-21 09:26:55)
Offline
I was struggling with Radeon/Xorg incorrectly detecting my TV's size/resolution and read that it was necessary to use "IgnoreEDID" and/or "NoDDC" in xorg.conf to ensure the (incorrect) data provided by the TV/monitor was not used during configuration. (Note that I never successfully configured my TV, but this may still be helpful.)
M*cr*s*ft: Who needs quality when you have marketing?
Offline
I was struggling with Radeon/Xorg incorrectly detecting my TV's size/resolution and read that it was necessary to use "IgnoreEDID" and/or "NoDDC" in xorg.conf to ensure the (incorrect) data provided by the TV/monitor was not used during configuration. (Note that I never successfully configured my TV, but this may still be helpful.)
Good thinking, pointone. Unfortunately it didn't work. I was never able to figure out how to get the Xorg server configuration to work, so I just gave up on it. I ended up using xrandr to define the values and wrote a script around it. It can be found here. Marking topic as Workaround.
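For anyone landing here later, the workaround boils down to forcing the measured DPI with xrandr at login. A dry-run sketch (monitor name and DPI value taken from earlier in the thread; remove the echo to actually apply it):

```shell
#!/bin/sh
# Dry-run sketch of the xrandr workaround: print the command that forces
# the measured ~102 DPI on the external output (remove echo to apply).
ext_monitor="VGA-0"     # output name as shown by xrandr --prop above
dpi="102.46"            # from the PPI calculation earlier in the thread
echo xrandr --output "$ext_monitor" --auto --dpi "$dpi"
```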
Offline