Subject: Re: DVD movies.
To: None <jonl@yubyub.net>
From: Richard Rauch <rauch@rice.edu>
List: netbsd-help
Date: 07/22/2002 12:02:41
> > At least under XFree86 3.3.6 (no Xv extension, I believe), the quality is
> > something of a trade-off against what mplayer gives.  The picture appears
> > smoother in ogle, but the contrast is lower (and/or the value of each color
> > is lower).  As a result, black comes out as somewhat washed-out as dark grey,
> > rather than black.
> >
> >
> > mplayer has decidedly better colors, and I think a more correct aspect ratio.
> >  (Correcting the aspect may be why mplayer's playback looks less smooth.)
>
> [snip]
>
> > Lastly, mplayer (at least) seems to only want to use 12 bits per pixel for
> > me.  So, playing in 24 bits doesn't make its life any easier than in 16 bits
> > (and, in fact, requirs 50% more data to be pumped out)
>
> This could be what is making mplayer look brighter or "better."  If it is

No, I don't think so.  The difference in color between what mplayer and
ogle produce is visually substantial---more than rounding/truncating to 12
bits can explain.  However, such truncation may explain mplayer's slightly
less smooth output.  (On the other hand, 12bpp *is* fairly capable,
especially if you don't have 20/20 vision and don't press your nose to the
monitor...(^&)
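To put a number on that, here is a minimal Python sketch (hypothetical, not
either player's actual code) of truncating a 24-bit pixel to 12bpp.  Note
that black survives untouched, which is why truncation alone can't explain
a washed-out black:

```python
def truncate_to_12bpp(r, g, b):
    """Drop the low 4 bits of each 8-bit channel (24bpp -> 12bpp)."""
    return (r >> 4, g >> 4, b >> 4)

def expand_to_24bpp(r4, g4, b4):
    """Stretch 4-bit channels back to 8 bits by replicating the nibble."""
    return tuple(c << 4 | c for c in (r4, g4, b4))

# Black stays black: truncation cannot brighten (0, 0, 0)...
print(expand_to_24bpp(*truncate_to_12bpp(0, 0, 0)))           # (0, 0, 0)
# ...while a mid grey merely loses its low 4 bits per channel.
print(expand_to_24bpp(*truncate_to_12bpp(0x87, 0x87, 0x87)))  # (136, 136, 136)
```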

(But I still favor the aspect-scaling theory to explain the fuzziness.)

My present theory is that Ogle is doing some kind of simplistic color
correction that washes out the low end---or isn't doing any color
correction at all, while mplayer either isn't doing correction (thus
leaving "black" as "black"), or is doing more intelligent correction.


If you care about a smooth per-frame image, ogle is better.  If you care
about color quality, mplayer is better.  Both skip frames in order to keep
up, and mplayer is more work to figure out.  (Some, or all, of the
playback problems may go away if you have an AGP card and the Xv extension
from XFree86 v4.x.)


> indeed only throwing out 12bpp, you're looking at a severely posterized image
> (i.e. 4 bits per color channel, essentially).  Some DVD players (and ATI

Remember that 4 bits per channel != 4 levels, and 12bpp != 12 colors.

Calling 12bpp "severely posterized", I think, is a bit extreme.  (^&
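The arithmetic, as a quick sketch:

```python
def palette_size(bits_per_channel):
    """Levels per channel and total colors for an RGB pixel format."""
    levels = 2 ** bits_per_channel
    return levels, levels ** 3

print(palette_size(4))   # 12bpp: (16, 4096) -- 4096 colors, not 12
print(palette_size(8))   # 24bpp: (256, 16777216)
print(palette_size(10))  # 30bpp: (1024, 1073741824)
```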


> cards, when using hardware decoding, BTW) use 10 bits per color channel for
> 30bpp effective color depth.  12bpp, like a well known detergent, makes your
> "whites look whiter" and your "blacks look blacker" due to the smaller
> available pallette.

I suspect you've not actually used it, or do not appreciate how
significant the variation between mplayer and ogle is---or overestimate
the difference between 12bpp and 16bpp (or even 12bpp and 24bpp).  If
represented in 12bpp, Ogle's "black" would be about 222 on my system, I
think.  mplayer's black is black (000).

mplayer claims to have software controls for brightness, etc., but these
apparently are disabled for me (maybe if I were using Xv, they would be
available?).  I haven't found any corresponding runtime controls for ogle.
(If such controls were available, turning down Ogle's brightness and
raising the contrast might help.)
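As an illustration of what such a control could do (a hypothetical sketch,
not code from either player): a simple per-channel linear transform, where
lowering brightness and raising contrast pulls a washed-out dark grey back
down to true black without crushing the highlights.

```python
def adjust(value, contrast=1.0, brightness=0):
    """Linear brightness/contrast on one 8-bit channel value,
    pivoting the contrast change around mid-grey (128)."""
    out = (value - 128) * contrast + 128 + brightness
    return max(0, min(255, int(round(out))))

# A washed-out "black" of 34 (0x22) clamps back to true black...
print(adjust(34, contrast=1.3, brightness=-30))   # 0
# ...while full white survives the same settings.
print(adjust(255, contrast=1.3, brightness=-30))  # 255
```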


> > The short of it: It works.  No need to upgrade OS, throw out my perfectly
> > good PCI video card, or switch to XFree86 4.x.
>
> Again, the video card will most likely have to be upgraded in the future, if
> you get serious about it and/or you start noticing video artifacts.  Until

If the server is running in 24bpp, there are artifacts: visible tearing,
most noticeable when the image changes rapidly (as when a beam of light
sweeps quickly over an object).  At 16bpp, the images don't degrade (since
they are 12bpp anyway), and the tearing goes away, replaced by smoother
animation.

It may be that mplayer (or the X server) is turning off double-buffering
in 24bpp, due to limited memory on the card, or it may be that mplayer is
just pushing as hard as it can with 24bpp and can't keep the tearing from
happening.  (I haven't tried ogle on a 24bpp display, so I don't know
how it fares there.)
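For a sense of the data rates involved, a rough sketch assuming a 720x480
DVD frame at 30 frames per second (the exact figures depend on the title
and on server overhead):

```python
def mb_per_sec(width, height, fps, bpp):
    """Raw pixel data pushed to the display, in megabytes per second."""
    return width * height * fps * bpp / 8 / 1e6

for bpp in (16, 24):
    print(f"{bpp}bpp: {mb_per_sec(720, 480, 30, bpp):.1f} MB/s")
# 24bpp moves 50% more data than 16bpp for the same frames.
```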


I'm not sure what constitutes "getting serious".  A movie *is*
entertainment, after all.  (^&  (And I don't plan on charging admission to
let people watch movies on my computer.)  However, I'm satisfied enough
that I'll probably buy a movie from time to time.


> then, enjoy!  If it isn't broken, don't fix it (you won't hear me say that
> again ;-).
> WRT ATI vs. others, there won't be a huge difference unless you start getting
> into hardware acceleration/overlay channels.  The big advantage of ATI vs.
> Geforce2, for example, is the afore mentioned 10 bit/color channel decoding
> capability.  Believe me, 10 bit vs. 8 bit per channel makes a _huge_

But how much is actually coming out of your card?  If your card claims to
be a 32bpp card, realize that 32bpp usually means 8bpp are taken up by an
alpha channel.  If your card is "really" only doing 24bpp of actual color
output, then those extra 2 bits per channel per pixel are just going to
help with antialiasing.  (Not a bad idea, but not fundamentally impossible
with a software decoder, either.  (^&)
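Concretely, assuming a typical ARGB packing (layouts vary by card and
server), a 32-bit pixel still carries only 8 bits per color channel:

```python
def unpack_argb(pixel):
    """Split a 32-bit ARGB word into 8 bits of alpha + 24 bits of color."""
    alpha = (pixel >> 24) & 0xFF
    red   = (pixel >> 16) & 0xFF
    green = (pixel >> 8) & 0xFF
    blue  = pixel & 0xFF
    return alpha, red, green, blue

print(unpack_argb(0xFF336699))  # (255, 51, 102, 153)
```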


> difference.  But, then again, I'm picky about my video.

Heh.  Most are, I think.  The question is the relative weight that <X>
enjoys versus <Y>.  I'm a grad student with very low income at the moment,
so a $400 or $500 video card just to watch a handful of movies on is
a huge waste.  A $40 or $60 card that I'll "have" to replace in time with
a "real" card for DRI support is also a waste.  A $100 or $150 card when
DRI is available is almost defensible, though, so I'm waiting...the longer
I wait, the more leverage I get for that $100 or so.


> In the end, I'm glad to hear you've got everything running.  Enjoy!

As am I.  (^&


  ``I probably don't know what I'm talking about.'' --rauch@math.rice.edu