Hello,
I want to connect an LCD display to my graphics card. It's an NVIDIA GT 635M and the display is 2560x1440 pixels.
I tried different configurations using the Intel integrated chipset as the renderer and nouveau with PRIME, but I got poor performance.
So I decided to use the dedicated card as the primary GPU. But when I set the resolution to 2560x1440, there are vertical white lines on the screen and I can't see anything.
I tried setting the DPI manually and reducing the color depth, but it doesn't help. When I set the refresh rate to 30 Hz the image is fine, but 30 Hz is too low to use the mouse comfortably.
I use the same hardware with MS Windows for 3D modeling and it works well. The NVIDIA driver there uses CVT to calculate the modeline (which is how I calculated the modeline on Arch), and there are no vertical lines on the screen.
This is my Xorg 10-nvidia.conf file:
Section "ServerLayout"
    Identifier "layout"
    Screen 0 "Screen0"
    Inactive "intel"
EndSection

Section "Module"
    Load "glx"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BoardName "GeForce GT 635M"
    BusID "PCI:1:0:0"
EndSection

Section "Monitor"
    # HorizSync source: edid, VertRefresh source: edid
    Identifier "Monitor0"
    VendorName "Unknown"
    ModelName "ViewSonic VX2778 Series"
    # HorizSync 15.0 - 100.0
    # VertRefresh 24.0 - 75.0
    Option "DPMS"
    # Modeline "2560x1440_60.00" 311.83 2560 2744 3024 3488 1440 1441 1444 1490 -HSync +Vsync
    Modeline "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
    Modeline "1280x720_60.00" 74.50 1280 1344 1472 1664 720 723 728 748 -hsync +vsync
    Option "ModeValidation" "NoMaxPClkCheck, NoEdidMaxPClkCheck, NoHorizSyncCheck, NoVertRefreshCheck"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device "nvidia"
    Monitor "Monitor0"
    DefaultDepth 24
    Option "AllowEmptyInitialConfiguration" "Yes"
    Option "Stereo" "0"
    # Option "modes" "1280x720_60.00" "1920x1080_60.00" "2560x1440_60.00"
    Option "SLI" "Off"
    Option "MultiGPU" "Off"
    Option "BaseMosaic" "off"
    SubSection "Display"
        Depth 24
    EndSubSection
EndSection

Section "Device"
    Identifier "intel"
    Driver "intel"
    BusID "PCI:0:2:0"
    Option "AccelMethod" "sna"
EndSection

Section "Screen"
    Identifier "intel"
    Device "intel"
EndSection
Thank you for reading.
Please post your Xorg log [1] and dmesg output.
They may be huge, so better use a pastebin client.
[1]
The Xorg log can be in two locations: the /var/log/ and ~/.local/share/xorg/ folders.
If there's more than one, use the most recent.
Last edited by Lone_Wolf (2016-11-13 12:10:29)
Disliking systemd intensely, but not satisfied with alternatives so focusing on taming systemd.
(A works at time B) && (time C > time B ) ≠ (A works at time C)
Here are several different modelines, to compare the pixel clock of each mode:
$ cvt 2018 1152 60
# 2024x1152 59.97 Hz (CVT) hsync: 71.66 kHz; pclk: 195.50 MHz
Modeline "2024x1152_60.00" 195.50 2024 2160 2376 2728 1152 1155 1165 1195 -hsync +vsync
$ cvt 1920 1080 60
# 1920x1080 59.96 Hz (CVT 2.07M9) hsync: 67.16 kHz; pclk: 173.00 MHz
Modeline "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
$ cvt 2560 1440 30
# 2560x1440 29.94 Hz (CVT) hsync: 43.95 kHz; pclk: 146.25 MHz
Modeline "2560x1440_30.00" 146.25 2560 2680 2944 3328 1440 1443 1448 1468 -hsync +vsync
$ cvt 2560 1440 60
# 2560x1440 59.96 Hz (CVT 3.69M9) hsync: 89.52 kHz; pclk: 312.25 MHz
Modeline "2560x1440_60.00" 312.25 2560 2752 3024 3488 1440 1443 1448 1493 -hsync +vsync
$
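If you just want to test one of these modes without editing xorg.conf, the modelines above can be loaded at runtime with xrandr. A sketch, assuming the output is called HDMI-0 (that name is a guess; check `xrandr -q` for the real one):

```shell
# Create a new mode from the cvt output (copy everything after the word "Modeline")
xrandr --newmode "2560x1440_30.00" 146.25 2560 2680 2944 3328 1440 1443 1448 1468 -hsync +vsync

# Attach the mode to the output (replace HDMI-0 with the name shown by `xrandr -q`)
xrandr --addmode HDMI-0 "2560x1440_30.00"

# Switch to it
xrandr --output HDMI-0 --mode "2560x1440_30.00"
```

This only lasts until the X server restarts, which makes it handy for testing before committing a modeline to the config file.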
The problem is that the 2560x1440 mode written in the monitor's EDID is ignored or invalid over HDMI.
The mode I want to use has a 312.25 MHz pixel clock.
In the following log, we can read: "(--) NVIDIA(GPU-0): ViewSonic VX2778 Series (DFP-0): 225.0 MHz maximum pixel clock".
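That 225 MHz limit matches the observed behaviour exactly: a modeline's pixel clock is just horizontal total × vertical total × refresh rate. A quick check with the numbers from the cvt output above (sketch, not part of any driver):

```python
# Pixel clock of a modeline: horizontal total * vertical total * refresh rate.
def pixel_clock_mhz(htotal: int, vtotal: int, refresh_hz: float) -> float:
    return htotal * vtotal * refresh_hz / 1e6

# 2560x1440 @ 60 Hz (CVT): htotal=3488, vtotal=1493 -> ~312 MHz, over the 225 MHz limit
print(round(pixel_clock_mhz(3488, 1493, 59.96), 2))

# 2560x1440 @ 30 Hz (CVT): htotal=3328, vtotal=1468 -> ~146 MHz, under the limit,
# which is why the 30 Hz mode displays cleanly
print(round(pixel_clock_mhz(3328, 1468, 29.94), 2))
```

So any 2560x1440 mode at 60 Hz with standard CVT blanking is simply beyond what this HDMI link will validate.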
This is my xorg log
I searched around a bit, and it does seem possible that this has to do with HDMI limitations.
Can you try with a DisplayPort connection?
Another cause could be the video card; check http://www.geforce.com/hardware/noteboo … ifications .
It claims 2560x1600 as the highest mode, but lists nothing about the refresh rate at that resolution.
I tried my monitor on a DisplayPort connection with a more recent and (more or less) powerful graphics card: the mini-DP output of a GTX 960M, on MS Windows. The NVIDIA driver found the right resolution directly from the EDID, but I forgot to try that card's HDMI output. HDMI itself does allow this mode (2560x1440_60) on my graphics card, because I use it with nouveau on Arch and with the NVIDIA driver on MS Windows. It's a good HDMI cable and there are no strange artefacts on screen; the image is wonderful.
I have been switching between nouveau and nvidia since I installed Arch. I have just reinstalled nouveau because it's closer to the user and because I'm able to set the correct resolution with it.
So I chose to use my dedicated card as the primary GPU following the wiki example, but now xrandr no longer finds the Intel integrated chipset as a provider, and I can no longer use my laptop's built-in monitor. I didn't mention before that I had the same issue with the nvidia driver.
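For the missing-provider problem, the usual first step is to see what X actually detects. A sketch, with the provider names taken from the Arch wiki's PRIME examples (yours may well differ, so use the names printed by the first command):

```shell
# List the render/output providers X knows about. With PRIME set up you should
# see both the NVIDIA card and the Intel chipset (often named "modesetting").
xrandr --listproviders

# If both appear, route the NVIDIA renderer's output through the Intel
# provider. "modesetting" and "NVIDIA-0" are example names, not guaranteed.
xrandr --setprovideroutputsource modesetting NVIDIA-0
```

If the Intel provider does not show up at all, the problem is earlier in the stack (driver not loaded, or the iGPU disabled), which is worth checking in the Xorg log before touching xrandr.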
I don't know if I should open another thread.
Last edited by quentoush (2016-11-15 12:34:48)
It's better to start a new thread for that.
In the opening post, include the full "lspci -k" output, dmesg, and an Xorg log from a start without any configuration files.
(PRIME mode is picky about small details and works best if the Xorg conf files include only absolutely necessary, 100% correct settings.)
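For reference, that information can be gathered like this (a sketch; the log path depends on whether X runs rootless, as noted earlier in the thread):

```shell
# Which kernel driver is bound to each PCI device (shows whether
# nvidia/nouveau and i915 actually loaded)
lspci -k > lspci.txt

# Kernel messages
dmesg > dmesg.txt

# Find the most recent Xorg log, whichever location it is in
ls -t /var/log/Xorg.*.log ~/.local/share/xorg/Xorg.*.log 2>/dev/null | head -n 1
```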