I know there is something I need to change in the "Monitor" section of /etc/X11/xorg.conf, but the way hwd did it is kind of confusing. Here is my xorg.conf: http://pastebin.archlinux.org/14696
By the way, if anyone sees anything in my xorg.conf that looks wrong, please PM me.
Last edited by theringmaster (2007-09-18 03:20:19)
Offline
This is my "Device" section. It successfully sets the DPI to 96x96. I hope this is what you need.
Section "Device"
Identifier "device1"
Driver "nvidia"
VendorName "nVidia"
BoardName "NVIDIA GeForce FX (generic)"
Option "UseEdidDpi" "false"
Option "DPI" "96 x 96"
EndSection
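After restarting X you can confirm the override actually took effect; xdpyinfo reports what the server is using (a quick check, assuming xdpyinfo is installed), and it should now print something like:
> xdpyinfo | grep resolution
resolution: 96x96 dots per inch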
Last edited by skottish (2007-09-18 03:33:35)
Offline
What skottish said. As far as errors in Xorg are concerned, it's best to look in /var/log/Xorg.0.log for the lines flagged (WW) for warnings or (EE) for errors.
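To pull out just those lines, something like this works:
> grep -E '\((WW|EE)\)' /var/log/Xorg.0.log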
Offline
This is my "Device" section. It successfully sets the DPI to 96x96. I hope this is what you need.
Section "Device" Identifier "device1" Driver "nvidia" VendorName "nVidia" BoardName "NVIDIA GeForce FX (generic)" Option "UseEdidDpi" "false" Option "DPI" "96 x 96" EndSection
FYI, I installed the 100 DPI font package and would like to be able to use those fonts properly.
I guess I would use Option "DPI" "100 x 100" then, right?
Offline
I guess I would use Option "DPI" "100 x 100" then, right?
Yes, as long as you have 100 DPI fonts installed it should work.
Offline
Yes, I do; these, to be exact: http://www.archlinux.org/packages/13788/
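If you ever need to double-check from the command line that they're really installed, pacman can confirm it (assuming the linked package is named xorg-fonts-100dpi); it prints the package's metadata if installed and an error otherwise:
> pacman -Qi xorg-fonts-100dpi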
Offline
There are some pretty good xorg.conf examples in the wiki that might help you sort out any (WW) or (EE) lines you find in your Xorg log file. I remember it took a little tweaking before my log file was warning- and error-free.
-- archlinux is an excellent linux.
Offline
I have made a new thread to handle my warnings. I didn't want to hijack my own thread, hehe.
Offline
By changing my DPI to 100, I won't hurt anything, will I? Like, I won't fry my monitor, or will I?
Offline
What I do to set DPI is:
DisplaySize = (resolution * 25.4) / DPI
which gives the physical size in millimeters. So if I want a DPI of 96 and my resolution is 1280x800:
1280 * 25.4 / 96 = 338.7 ≈ 339
800 * 25.4 / 96 = 211.7 ≈ 212
Then in xorg.conf I put:
DisplaySize 339 212
in Section "Monitor".
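The same arithmetic as a one-liner, if you'd rather not do it by hand; a sketch assuming a 1280x800 mode and a 96 DPI target (adjust the three numbers to taste):
> awk 'BEGIN { w = 1280; h = 800; dpi = 96; printf "DisplaySize %.0f %.0f\n", w * 25.4 / dpi, h * 25.4 / dpi }'
DisplaySize 339 212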
Last edited by AlmaMater (2007-09-18 18:59:27)
Offline
By changing my DPI to 100, I won't hurt anything, will I? Like, I won't fry my monitor, or will I?
No, it's purely a matter of math when drawing fonts and such. No monitor settings are changed. One would theoretically set it equal to the actual number of pixels on their screen that fit in an inch, though people tend not to do this.
Offline
Pauldonnelly mentioned that, ideally, people would set their resolution so that the PPI (the digital equivalent of DPI) matches the actual number of pixels a monitor contains per inch. How do I figure out the number of pixels my monitor contains per inch?
It measures exactly 333 mm x 270 mm (width x height), but I would think some monitor specifications would be required to calculate the number of pixels the monitor contains per inch.
Offline
Pauldonnelly mentioned that, ideally, people would set their resolution so that the PPI (the digital equivalent of DPI) matches the actual number of pixels a monitor contains per inch. How do I figure out the number of pixels my monitor contains per inch?
It measures exactly 333 mm x 270 mm (width x height), but I would think some monitor specifications would be required to calculate the number of pixels the monitor contains per inch.
Why?
If your resolution is X x Y, just do X / 333 * 25.4 and Y / 270 * 25.4.
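For example, assuming a 1280x1024 mode on that panel, that works out to 1280 / 333 * 25.4 ≈ 97.6 DPI horizontally and 1024 / 270 * 25.4 ≈ 96.3 DPI vertically.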
AFAIK, X tries to get the physical dimensions from your monitor using DDC. If that works correctly, it will already use these values by default.
Anyway, on my desktop with a 19" CRT, I get this by default:
> xdpyinfo | grep -A1 dim
dimensions: 1280x1024 pixels (361x271 millimeters)
resolution: 90x96 dots per inch
And on my laptop with a 15.4" LCD:
> xdpyinfo | grep -A1 dim
dimensions: 1280x800 pixels (331x207 millimeters)
resolution: 98x98 dots per inch
I don't have any problems with these.
pacman roulette : pacman -S $(pacman -Slq | LANG=C sort -R | head -n $((RANDOM % 10)))
Offline
For a majority of people, setting the resolution to match the real number of pixels a monitor contains would be worthless. However, it seems to me that setting up a monitor this way would make maximum use of the monitor's capabilities without trying to do more than it can render, with the highest possible display accuracy. Sure, there is probably no visual difference. However, for digital artistry I am an accuracy pedant, so I really am interested in pauldonnelly's proposition. I fully understand those equations, but I am interested in how I could figure out the real number of pixels my monitor has.
Offline
The "real number of pixels my monitor" has? A Liquid Crystal Display (LCD) has a fixed number of pixels by design. Driving the display at any resolution below the actual causes scaling (usually implemented in hardware) to be done. The LCD's specifications describe the actual number of pixels, but it's usually pretty obvious when scaling is occurring -- the output looks blurry. If you take a close look at a normal LCD, you can see the individual pixel cells with the unaided eye rather easily.
Cathode ray tube (CRT) displays have no actual number of pixels (though they do have a maximum drivable resolution). The technology works by firing a stream of electrons at a phosphor-coated surface, exciting the phosphors to emit light. Depending on how the electron gun is actually used, you can get an extremely wide number of possible resolutions.
Plasma-based displays used gas sealed in discrete containers and thus has fixed, identifiable pixels just like LCDs.
Personally, I've found 96 dots-per-inch (which is generally the default for font rendering everywhere) to be readable on many displays of widely varying resolution. Manually setting your DPI to be higher is often of no particular benefit.
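If you don't have the LCD's spec sheet handy, X can usually report the panel's native mode itself; a sketch, assuming RandR support (xrandr 1.2 or newer):
> xrandr
In the mode list it prints, the panel's preferred (native) mode is marked with a "+" and the currently active one with a "*".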
Offline
Thank you very much! I do have an LCD, so I will see about specifications for it.
Offline
I fully understand those equations, but I am interested in how I could figure out the real number of pixels my monitor has.
It has however many pixels your monitor is set to. If you're running 640x480 and your monitor is 10 inches wide, then you have 64 dpi horizontally. Most monitors tend to clock in at around 100 dpi when they're using what you might call a "comfortable" resolution, I've noticed, so blindly selecting 100 dpi works out pretty well.
Offline
Wow, that made things so much easier to read at 96x96 DPI. The fonts look a lot better, too. I think X was setting them to 82x86; fonts looked smallish and sometimes hard to read. Sadly, I have some vision problems, so many thanks for the tip. Glad I looked into this thread.
~jnengland77
Last edited by jnengland77 (2007-10-01 20:18:17)
Offline