#1 2014-04-09 23:06:59

ThePacman
Member
From: Classified
Registered: 2013-09-19
Posts: 127

Problems with a Radeon card, an HDMI adapter, and an unusual TV

I have a Radeon HD 4350 attached to a JVC TV (model # is currently unknown) via a DVI cable and a DVI-HDMI adapter.

The TV is supposed to support resolutions up to 1080p (and it does, technically), but the panel's native resolution is 1366x768 px. Anything higher than that gets scaled down to fit.
In practice, this means that anything higher than 720x480 gets scaled down (which produces ugly results).
1280x720 doesn't get scaled, but does have the sides cropped. Anything larger gets scaling AND cropping.

Right now, I have a custom script saved for whenever I connect to the TV:

#!/bin/sh
if xrandr --current | grep '\*' | grep -q 1920x1080; then
	xrandr --output DVI-0 \
		--mode 1280x720 \
		--set "underscan hborder" 35 \
		--set "underscan vborder" 20 \
		--set underscan on
	exit $?
else
	exit 255
fi

Unfortunately, this scales the result, so it's a little ugly.

The output of xrandr is:

Screen 0: minimum 320 x 200, current 1280 x 720, maximum 8192 x 8192
VGA-0 disconnected (normal left inverted right x axis y axis)
DIN disconnected (normal left inverted right x axis y axis)
DVI-0 connected 1280x720+0+0 (normal left inverted right x axis y axis) 16mm x 9mm
   1920x1080i    60.00 +  59.94  
   1920x1080     60.00    59.94  
   1280x720      60.00*   59.94  
   1440x480i     60.00    59.94    59.94  
   720x480       60.00    59.94  
   640x480       60.00    59.94  

How can I get something better than 720x480 without scaling?

I've followed the instructions in the Wiki page on xrandr for adding undetected resolutions, but cvt 1366 768 gives

# 1368x768 59.88 Hz (CVT) hsync: 47.79 kHz; pclk: 85.25 MHz
Modeline "1368x768_60.00"   85.25  1368 1440 1576 1784  768 771 781 798 -hsync +vsync

The mode it generates is 1368x768 rather than 1366x768 (CVT rounds the width to a multiple of 8), but I tried it anyway and got a black screen.
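
For reference, the wiki's method for adding an undetected resolution boils down to a sequence like this, using the modeline from the cvt output above and the DVI-0 output name from my xrandr listing (a sketch; the values would change for other modes):

```shell
# Register the CVT-generated modeline with the X server...
xrandr --newmode "1368x768_60.00" 85.25 1368 1440 1576 1784 768 771 781 798 -hsync +vsync
# ...attach it to the output...
xrandr --addmode DVI-0 "1368x768_60.00"
# ...and switch to it.
xrandr --output DVI-0 --mode "1368x768_60.00"
```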

I then saw this thread, and tried the recommendation there of 1360x768. This almost worked: a good picture with no scaling, but green bars on the left of the screen and orange static obscuring the right third.

I've also tried cvt 1360 760 with no success.

Last edited by ThePacman (2014-04-09 23:10:52)


Fedora believes in "software freedom" - that is, restricting user software choices to those deemed appropriately licensed by The Powers That Be.
Arch believes in "freedom", as well - the user has control over his or her system and can do what he wants with it.
https://fedoraproject.org/wiki/Forbidden_items | https://wiki.archlinux.org/index.php/The_Arch_Way


#2 2014-04-09 23:44:27

ThePacman
Member
From: Classified
Registered: 2013-09-19
Posts: 127

Re: Problems with a Radeon card, an HDMI adapter, and an unusual TV

When using the 1360x768 method (the one with green bars and orange static), I found that the refresh rate for that entry was off, despite running cvt with a specified refresh rate.

$ cvt 1360 768
# 1360x768 59.80 Hz (CVT) hsync: 47.72 kHz; pclk: 84.75 MHz
Modeline "1360x768_60.00"   84.75  1360 1432 1568 1776  768 771 781 798 -hsync +vsync
$ cvt 1360 768 60
# 1360x768 59.80 Hz (CVT) hsync: 47.72 kHz; pclk: 84.75 MHz
Modeline "1360x768_60.00"   84.75  1360 1432 1568 1776  768 771 781 798 -hsync +vsync
$ gtf 1360 768 59.80

  # 1360x768 @ 59.80 Hz (GTF) hsync: 47.54 kHz; pclk: 84.43 MHz
  Modeline "1360x768_59.80"  84.43  1360 1424 1568 1776  768 769 772 795  -HSync +Vsync

$ gtf 1360 768 60 

  # 1360x768 @ 60.00 Hz (GTF) hsync: 47.70 kHz; pclk: 84.72 MHz
  Modeline "1360x768_60.00"  84.72  1360 1424 1568 1776  768 769 772 795  -HSync +Vsync

As you can see, CVT ignores the specified refresh rate, while GTF does not.

I added the 59.80 Hz mode as "1360x768" and the GTF 60 Hz one as "1360x768_60.00" and here's the output of xrandr:

Screen 0: minimum 320 x 200, current 1280 x 720, maximum 8192 x 8192
VGA-0 disconnected (normal left inverted right x axis y axis)
DIN disconnected (normal left inverted right x axis y axis)
DVI-0 connected 1280x720+0+0 (normal left inverted right x axis y axis) 16mm x 9mm
   1920x1080i    60.00 +  59.94  
   1920x1080     60.00    59.94  
   1280x720      60.00*   59.94  
   1440x480i     60.00    59.94    59.94  
   720x480       60.00    59.94  
   640x480       60.00    59.94  
   1360x768      59.80  
   1360x768_60.00  60.00

Both modes give green bars and orange static, but I thought this information might be important/useful.
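
For reference, adding the two modes looks roughly like this (modeline values taken from the gtf output above; this is a reconstruction, not a paste of my shell history):

```shell
# 59.80 Hz GTF modeline, added under the plain name "1360x768"
xrandr --newmode "1360x768" 84.43 1360 1424 1568 1776 768 769 772 795 -hsync +vsync
xrandr --addmode DVI-0 "1360x768"

# 60.00 Hz GTF modeline, kept under its generated name
xrandr --newmode "1360x768_60.00" 84.72 1360 1424 1568 1776 768 769 772 795 -hsync +vsync
xrandr --addmode DVI-0 "1360x768_60.00"
```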



#3 2014-04-10 04:47:22

Max-P
Member
Registered: 2011-06-11
Posts: 164

Re: Problems with a Radeon card, an HDMI adapter, and an unusual TV

	 --set "underscan hborder" 35\
	 --set "underscan vborder" 20\
	 --set underscan on

You will likely always have some issues if you use underscan to compensate for overscan, because it means the TV is always rescaling the image anyway. What you describe as not scaled is probably a case where the pixels happen to land in the right places, causing only a little blur instead of the usual obvious scaling blur.

Have you tried checking the TV's menus (especially the advanced settings) to see if you can switch the picture mode to Native/Original/Source, or whatever it might be called, instead of 16:9 or auto? Pretty much all TVs apply overscan by default, but on every TV I have ever seen there's an option to disable the overscanning and get the full native resolution. It's usually in the picture settings or advanced settings, and sometimes also on the remote (P.SIZE on mine). If you can find it, you can get rid of the entire underscan configuration, and the native resolution will probably be one of the defaults listed by xrandr. 1366x768 seems rather odd for a TV; they are usually either HD-ready (1280x720) or Full HD (1920x1080), while 1366x768 is typical for laptops and small widescreen computer monitors. How do you know it really is 1366x768? Chances are the TV is actually 1920x1080 and the blurriness is caused by your underscan setting. I have never seen a TV/monitor advertise above its native resolution, especially since every source attached via HDMI/DVI supports 720p for regular HD-ready TVs.


If the option does not exist and I understand right, you might want to try 1436x808: hopefully, when the TV overscans it, the result will happen to be exactly the native 1366x768 resolution (+2x35 horizontally and +2x20 vertically), and the scaling will cancel itself out. But I wouldn't bet too much on that. Also, the noise you are getting on the custom modes might be due to timing issues. Some TVs will successfully stabilize the image (analog inputs pretty much always have some degree of noise/timing problems), but failing that you get noise similar to what you described. Or it could just be that the scaler gets confused by the unusual mode; I don't really know, these are all suppositions.
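
To make the arithmetic explicit, the padded mode is just the native size plus twice each underscan border from your script (a quick sketch; the variable names are mine):

```shell
# Native panel size and the underscan borders from the earlier script
native_w=1366; native_h=768
hborder=35;    vborder=20

# Pad by one border on each side, so the TV's overscan crops back to native
padded_w=$((native_w + 2 * hborder))
padded_h=$((native_h + 2 * vborder))

echo "${padded_w}x${padded_h}"   # prints 1436x808
```

You could then feed that size to cvt or gtf and add the resulting modeline with xrandr --newmode/--addmode as usual.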

Good luck!


#4 2014-04-10 20:23:59

ThePacman
Member
From: Classified
Registered: 2013-09-19
Posts: 127

Re: Problems with a Radeon card, an HDMI adapter, and an unusual TV

Max-P wrote:
	 --set "underscan hborder" 35\
	 --set "underscan vborder" 20\
	 --set underscan on

You will likely always have some issues if you use underscan to compensate for overscan, because it means the TV is always rescaling the image anyway. What you describe as not scaled is probably a case where the pixels happen to land in the right places, causing only a little blur instead of the usual obvious scaling blur.

Have you tried checking the TV's menus (especially the advanced settings) to see if you can switch the picture mode to Native/Original/Source, or whatever it might be called, instead of 16:9 or auto? Pretty much all TVs apply overscan by default, but on every TV I have ever seen there's an option to disable the overscanning and get the full native resolution. It's usually in the picture settings or advanced settings, and sometimes also on the remote (P.SIZE on mine). If you can find it, you can get rid of the entire underscan configuration, and the native resolution will probably be one of the defaults listed by xrandr. 1366x768 seems rather odd for a TV; they are usually either HD-ready (1280x720) or Full HD (1920x1080), while 1366x768 is typical for laptops and small widescreen computer monitors. How do you know it really is 1366x768? Chances are the TV is actually 1920x1080 and the blurriness is caused by your underscan setting. I have never seen a TV/monitor advertise above its native resolution, especially since every source attached via HDMI/DVI supports 720p for regular HD-ready TVs.

Yes, I am completely sure the TV is 1366x768, for a couple of reasons: first, the numbers add up when I measure the overscan amounts at 1280x720 (which the TV crops but does not scale).
Second, the TV has an auto demo feature that explicitly says the TV is 1366x768 pixels, accompanied by an unmistakable diagram.

Max-P wrote:

If the option does not exist and I understand right, you might want to try 1436x808: hopefully, when the TV overscans it, the result will happen to be exactly the native 1366x768 resolution (+2x35 horizontally and +2x20 vertically), and the scaling will cancel itself out. But I wouldn't bet too much on that. Also, the noise you are getting on the custom modes might be due to timing issues. Some TVs will successfully stabilize the image (analog inputs pretty much always have some degree of noise/timing problems), but failing that you get noise similar to what you described. Or it could just be that the scaler gets confused by the unusual mode; I don't really know, these are all suppositions.

Good luck!

If I use a resolution higher than 1366x768, the TV will crop it (down *to* 1366x768). If it is much higher, the TV will scale and crop it. But at 1280x720, the TV itself does no scaling.
As for the TV settings, I'm already using Full aspect, which is the most leave-the-picture-alone option of them all; it doesn't crop anything except computers connected via an HDMI adapter.

Last edited by ThePacman (2014-04-10 20:26:01)


