I've bought a new monitor, an LG L204WT. It has two ways of connecting to the video card: VGA and DVI. My video card also supports both interfaces.
When I connect the devices with the VGA cable everything works fine, but over DVI I get problems: the screen resolution comes out wrong (the panel's native 1680x1050 is not available).
A friend told me that the picture over DVI is much better than over VGA (is this true?).
Does anyone know what might be the reason for this behaviour?
Some details:
Monitor: LG L204WT
Video: Inno3D GeForce4 MX440
My xorg.conf:
Section "ServerLayout"
Identifier "Simple Layout"
Screen "Screen 1" 0 0
InputDevice "Mouse1" "CorePointer"
InputDevice "Keyboard1" "CoreKeyboard"
EndSection
Section "Files"
RgbPath "/usr/share/X11/rgb"
ModulePath "/usr/lib/xorg/modules"
FontPath "/usr/share/fonts/misc/"
FontPath "/usr/share/fonts/TTF/"
FontPath "/usr/share/fonts/Type1/"
FontPath "/usr/share/fonts/CID/"
FontPath "/usr/share/fonts/75dpi/"
FontPath "/usr/share/fonts/100dpi/"
FontPath "/usr/share/fonts/local/"
EndSection
Section "Module"
Load "dbe" # Double buffer extension
SubSection "extmod"
Option "omit xfree86-dga" # don't initialise the DGA extension
EndSubSection
Load "type1"
Load "speedo"
Load "freetype"
Load "xtt"
Load "glx"
Load "bitmap"
Load "int10"
Load "vbe"
EndSection
Section "InputDevice"
Identifier "Keyboard1"
Driver "kbd"
Option "XkbRules" "xorg"
Option "XkbModel" "pc105"
Option "XkbLayout" "us,ru(winkeys)"
Option "XkbVariant" "us"
Option "XkbOptions" "grp:ctrl_shift_toggle"
EndSection
Section "InputDevice"
Identifier "Mouse1"
Driver "mouse"
Option "CorePointer"
Option "Protocol" "ImPS/2"
Option "Device" "/dev/input/mice"
Option "Emulate3Buttons" "true"
Option "ZAxisMapping" "4 5"
# Mouse-speed setting for PS/2 mouse.
EndSection
Section "Monitor"
Identifier "SyncMaster"
Option "DPMS"
VendorName "Generic"
ModelName "Flat Panel 1680x1050"
#HorizSync 31.5-90
#VertRefresh 60
#ModeLine "1680x1050" 146.2 1680 1784 1960 2240 1050 1053 1059 1089 +HSync -VSync
# modeline generated by gtf(1) [handled by XFdrake]
#ModeLine "1680x1050_60" 147.14 1680 1784 1968 2256 1050 1051 1054 1087 -HSync +VSync
EndSection
Section "Device"
#Driver "nv"
# Insert Clocks lines here if appropriate
Identifier "Nvidia"
Driver "nvidia"
Option "RenderAccel" "True"
#Option "UseEDID" "False"
#Option "IgnoreEDID" "True"
#Option "ModeValidation" "NoEdidModes"
EndSection
Section "Screen"
Identifier "Screen 1"
Device "Nvidia"
Monitor "SyncMaster"
DefaultDepth 24
Option "RenderAccel" "true"
#Option "AllowGLXWithComposite" "true"
SubSection "Display"
#Virtual 1600 1280
Viewport 0 0
Virtual 1680 1050
Depth 8
Modes "1680x1050" "1280x1024" "1024x768" "800x600" "640x480"
EndSubSection
SubSection "Display"
#Virtual 1280 1024
#Virtual 1680 1050
Viewport 0 0
Depth 16
Modes "1680x1050" "1280x1024" "1024x768" "800x600" "640x480"
EndSubSection
SubSection "Display"
#Virtual 1280 1024
Viewport 0 0
Virtual 1680 1050
Depth 24
Modes "1680x1050" "1280x1024" "1024x768" "800x600" "640x480"
EndSubSection
EndSection
The relevant part of Xorg.0.log:
(**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
(==) NVIDIA(0): RGB weight 888
(==) NVIDIA(0): Default visual is TrueColor
(==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
(**) NVIDIA(0): Option "RenderAccel" "true"
(**) NVIDIA(0): Enabling RENDER acceleration
(II) NVIDIA(0): NVIDIA GPU GeForce4 MX 440 with AGP8X at PCI:1:0:0
(--) NVIDIA(0): VideoRAM: 65536 kBytes
(--) NVIDIA(0): VideoBIOS: 04.18.20.13.00
(II) NVIDIA(0): Detected AGP rate: 4X
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): Connected display device(s) on GeForce4 MX 440 with AGP8X at
(--) NVIDIA(0): PCI:1:0:0:
(--) NVIDIA(0): LG L204WT (DFP-0)
(--) NVIDIA(0): LG L204WT (DFP-0): 135.0 MHz maximum pixel clock
(--) NVIDIA(0): LG L204WT (DFP-0): Internal Single Link TMDS
(WW) NVIDIA(0): Mode "1400x1050" is too large for LG L204WT (DFP-0);
(WW) NVIDIA(0): discarding.
(WW) NVIDIA(0): Mode "1600x1024" is too large for LG L204WT (DFP-0);
(WW) NVIDIA(0): discarding.
(II) NVIDIA(0): Assigned Display Device: DFP-0
(WW) NVIDIA(0): No valid modes for "1680x1050"; removing.
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0): "1280x1024"
(II) NVIDIA(0): "1024x768"
(II) NVIDIA(0): "800x600"
(II) NVIDIA(0): "640x480"
(**) NVIDIA(0): Virtual screen size configured to be 1680 x 1050
(--) NVIDIA(0): DPI set to (99, 98); computed from "UseEdidDpi" X config option
Any help would be greatly appreciated. Thanks
Offline
The resolution you want to use is not "standard" (per se) and thus not supported by default. You need to create a modeline for that resolution. This site, http://xtiming.sourceforge.net/cgi-bin/xtiming.pl, is very useful: just enter your monitor's specs and get a modeline, then add it to your xorg.conf.
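For example, a generated modeline for 1680x1050 at 60 Hz looks like the one below (the numbers here are just for illustration, use exactly what the generator gives you for your panel), and the mode name has to match an entry in the Modes lines of your Screen section:
Section "Monitor"
Identifier "SyncMaster"
Option "DPMS"
# the generated modeline goes inside the Monitor section
ModeLine "1680x1050" 147.14 1680 1784 1968 2256 1050 1051 1054 1087 -HSync +VSync
EndSection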
Offline
Thank you very much. I'll try this tomorrow (it's 00:35 here :)). But I see that these modelines are written for XFree86... will they also work with Xorg?
Offline
They're compatible w/ each other.
Offline
"gtf 1680 1050 60" in a terminal as you may already know. Also i suggest you remove the "_60" because as of now, it doesn't apply to any of the modes in the configuration. Also, it is commented out...(why!)
#ModeLine "1680x1050_60" 147.14 1680 1784 1968 2256 1050 1051 1054 1087 -HSync +VSync
So this is what it should look like:
ModeLine "1680x1050" 147.14 1680 1784 1968 2256 1050 1051 1054 1087 -HSync +VSync
"Your beliefs can be like fences that surround you.
You must first see them or you will not even realize that you are not free, simply because you will not see beyond the fences.
They will represent the boundaries of your experience."
SETH / Jane Roberts
Offline
I've tried all of that... Different modelines from different sources (gtf and that Perl script on the web) didn't help.
I also tried with "_60" and without it... nothing. VGA works correctly. And the line is commented out because I had done many experiments before posting here.
Offline
You must set HorizSync and VertRefresh in your xorg.conf to get 1680x1050.
These values are different for each monitor; you can find them in the monitor's manual or specifications.
You can also set some options in xorg.conf.
Here's mine (you must use your own HorizSync and VertRefresh values):
Section "Monitor"
Identifier "Monitor0"
VendorName "BenQ"
ModelName "FP202W"
HorizSync 30.0 - 81.0
VertRefresh 56.0 - 76.0
Option "DPMS"
Modeline "1680x1050" 119.00 1680 1728 1760 1840 1050 1053 1059 1080 +hsync -vsync
EndSection
Section "Device"
Identifier "Card0"
Driver "nvidia"
VendorName "Nvidia"
BoardName "FX 5600"
Option "RenderAccel" "false"
# Option "CursorShadow" "true"
# Option "NoLogo" "true"
Option "UseDisplayDevice" "DFP-0"
Option "UseEDIDFreqs" "false"
Option "UseEdidDPI" "FALSE"
Option "DPI" "99 x 98"
EndSection
Section "Screen"
Identifier "Screen0"
Device "Card0"
Monitor "Monitor0"
DefaultDepth 24
SubSection "Display"
Viewport 0 0
Depth 24
Modes "1680x1050"
EndSubSection
EndSection
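One more thing worth checking: your log says "LG L204WT (DFP-0): 135.0 MHz maximum pixel clock" and "Internal Single Link TMDS", while the GTF modeline needs a 147.14 MHz pixel clock, so that could be exactly why the driver discards 1680x1050 over DVI. The modeline in my Monitor section above is a reduced-blanking one and only needs 119.00 MHz, which fits under that limit. A sketch of how it could look with the identifiers from your config (the sync ranges below are placeholders, take the real ones from the L204WT manual):
Section "Monitor"
Identifier "SyncMaster"
VendorName "LG"
ModelName "L204WT"
# placeholder ranges, check the L204WT manual for the real values
HorizSync 30.0 - 83.0
VertRefresh 56.0 - 75.0
Option "DPMS"
# reduced-blanking modeline, 119.00 MHz pixel clock (under the 135 MHz single-link limit)
Modeline "1680x1050" 119.00 1680 1728 1760 1840 1050 1053 1059 1080 +hsync -vsync
EndSection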
Good luck!
Shaika-Dzari
http://www.4nakama.net
Offline
I own a Samsung 205BW, which is 1680x1050 too (a pretty common resolution nowadays). Some people say the GeForce FX series has a broken DVI port, others say the DVI cable bundled with the monitor is bad, but I use VGA too.
IRC: Stalwart @ FreeNode
Skype ID: thestalwart
WeeChat-devel nightly packages for i686
Offline
Stalwart, hm... the cable that came with the monitor is a single-link DVI cable. I then got a new dual-link DVI cable and tried it, but got no picture at all instead of just a broken resolution. As for the FX cards: mine is an MX-series card (MX 440).
It's pretty funny that we're speaking English when both of us speak Russian, isn't it? I remember you from the Russian Ubuntu forum.
Offline
I run 1680x1050 through DVI with no problems. I used to have to specify a modeline, but that was always tricky... some worked, some didn't, and I never knew which one was the "correct" mode. Anyway, with the newer releases of Xorg (starting probably around 5-6 months ago), I don't have to specify a modeline at all.
As Shaika-Dzari mentioned, you do have to specify the proper horizontal sync and vertical refresh though. As an example, here is the relevant portion of my xorg.conf file (and my graphics card stuff too, for clarification):
Section "Monitor"
Identifier "Monitor0"
VendorName "Dell"
ModelName "UltraSharp 2005FPW"
HorizSync 30.0 - 83.0
VertRefresh 56.0 - 75.0
Option "DPMS" "true"
EndSection
Section "Device"
Identifier "Card0"
Driver "nvidia"
VendorName "NVIDIA Corporation"
BoardName "GeForce 6200"
Option "HWCursor" "off"
Option "IgnoreDisplayDevices" "CRT, TV"
Option "NoLogo" "true"
Option "NvAgp" "1"
Option "RenderAccel" "true"
Option "AllowGLXWithComposite" "true"
EndSection
Section "Extensions"
Option "Composite" "Enable"
EndSection
Section "Screen"
Identifier "Screen0"
Device "Card0"
Monitor "Monitor0"
DefaultDepth 24
SubSection "Display"
Viewport 0 0
Depth 24
Modes "1680x1050"
EndSubSection
EndSection
Offline
There is a known bug with NVIDIA cards and EDID.
Try adding
Option "IgnoreEDID" "true"
to your 'Device' section in xorg.conf
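In context, using the identifiers already in your xorg.conf, the Device section would read something like:
Section "Device"
Identifier "Nvidia"
Driver "nvidia"
Option "RenderAccel" "True"
# tell the driver to ignore the monitor's EDID data
Option "IgnoreEDID" "true"
EndSection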
Evil #archlinux@libera.chat channel op and general support dude.
. files on github, Screenshots, Random pics and the rest
Offline
There is a known bug with NVIDIA cards and EDID.
Try adding Option "IgnoreEDID" "true"
to your 'Device' section in xorg.conf
It doesn't help
To Elasticdog:
You have a 6xxx-series card, which differs from the older GeForces.
IRC: Stalwart @ FreeNode
Skype ID: thestalwart
WeeChat-devel nightly packages for i686
Offline
I've tried both "IgnoreEDID" (please take a look at my xorg.conf at the top) and specifying the appropriate HorizSync and VertRefresh (from my monitor's spec). Nothing helped. As far as I understand from our discussion, it seems to be a bug in the NVIDIA cards, yes?
Next week I'll have a newer NVIDIA card for testing... we'll see if it helps. I'll post my results here.
Offline
My (crappy, but working) xorg.conf, just for comparison:
http://pastebin.ca/221094
By the way, you might want to try:
Option "IgnoreEDID" "True"
Option "ModeValidation" "NoEdidModes"
Evil #archlinux@libera.chat channel op and general support dude.
. files on github, Screenshots, Random pics and the rest
Offline