
#1 2006-09-09 16:22:18

chrismortimore
Member
From: Edinburgh, UK
Registered: 2006-07-15
Posts: 655

I feel like I really missed something somewhere...

So I was watching Family Guy last night using Xine (I like menus), and I noticed the picture quality was really bad. I cranked up the deinterlacing, and once my CPU hit 100%, the quality was "acceptable".

But then I started cooking and the menus annoyed me (the disc doesn't have a "play all" button), so I used mplayer with this command: "for i in `seq 2 9`; do mplayer dvd://${i} -dvd-device /dev/hdc -vo xv; done", and the picture quality was perfect (note the lack of any deinterlacing filters).
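For anyone curious, here's that same loop spelled out as a small script (just a sketch, assuming the drive is /dev/hdc and the episodes are titles 2 through 9, as in my one-liner):

    #!/bin/bash
    # Play DVD titles 2-9 back to back, skipping the menus.
    # Assumes the drive is /dev/hdc and the Xv output driver, as above.
    for i in $(seq 2 9); do
        mplayer dvd://${i} -dvd-device /dev/hdc -vo xv
    done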

So I wondered why Xine was giving me such crap quality, and I noticed that "Video Device" was set to "Auto" (which I always thought meant it picked xv over all the others). I changed it to "xv", and now I get the same quality as mplayer without any deinterlacing.
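If anyone wants to check what Xv support their X server actually exposes, and which output drivers their mplayer build has, something like this should do it (just a sketch; it needs the xvinfo utility installed):

    # List the Xv adaptors the X server exposes.
    xvinfo | grep -i "adaptor"
    # List the video output drivers this mplayer build knows about.
    mplayer -vo help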

Now, I swear a few years ago when I was fiddling with this kind of thing, I had to use deinterlacers with "xv" to get the same quality I get now (using just xv alone).  So, does xv now use my graphics card's onboard MPEG-2 decoding, or does xv have its own deinterlacer, or does something else happen?  I really feel like I missed something somewhere...
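For comparison, it's easy to force a software deinterlacer in mplayer and see whether it actually changes anything (pp=lb is the linear-blend postprocessing filter; the title number and device are just the ones from my loop above):

    # Same title, but run through mplayer's linear-blend deinterlacer for comparison.
    mplayer dvd://2 -dvd-device /dev/hdc -vo xv -vf pp=lb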


Desktop: AMD Athlon64 3800+ Venice Core, 2GB PC3200, 2x160GB Maxtor DiamondMax 10, 2x320GB WD Caviar RE, Nvidia 6600GT 256MB
Laptop: Intel Pentium M, 512MB PC2700, 60GB IBM TravelStar, Nvidia 5200Go 64MB

