Subject: Re: radeon driver design (was Re: generic virtual consoles)
To: der Mouse <mouse@Rodents.Montreal.QC.CA>
From: Kurt J. Lidl <email@example.com>
Date: 12/29/2005 12:23:47
On Wed, Dec 28, 2005 at 07:09:13PM -0500, der Mouse wrote:
> > We aren't the only OS which has to worry about graphics performance.
> True, but...
Well, if you look at what the protected-mode versions of Windows
have done, it gets pretty complicated:
NT 3.5 -- user-mode graphics drivers -- users complain it is "slow"
NT 4.0 -- kernel-mode graphics drivers -- much faster, less secure,
more vendor-supplied (buggy) drivers
W2000 -- still kernel-mode
Vista -- (not yet released) -- back to user-mode graphics drivers
> > Windows and MacOS X have to have addressed this problem already, and
> > from looking at the performance of games on either platform, they
> > have done it well.
> ...there is nothing to say they have done it cleanly, elegantly, or
> extensibly, all three of which I would hope would be true of any
> solution NetBSD adopts along these lines.
> Even if they have, certainly Windows and probably OSX[%] do not
> publicize the ways they have solved these problems, so the "existing
> art" is not really available for inspection.
> [%] Darwin is open-source, but Aqua - the snazzy graphics layer -
> isn't. I don't know how much the underlying Darwin drivers could
> tell us; it might be worth looking, but I don't expect anything
> terribly useful. For all I know there are no Darwin drivers
> involved, with the graphics drivers being considered part of Aqua
> and dynamically loaded a la our LKMs....
Well, Aqua is the user-interface layer on the Mac. Quartz and Quartz
Extreme are the layers that actually interface to the hardware. There
was a fascinating article at ArsTechnica about this, which goes into
more detail than most about the evolution of Quartz (and its morphing
into Quartz Extreme). Quartz and Quartz Extreme basically push
a huge amount of the rendering complexity onto the GPU on the display
card. The figure that accompanies page 14 of the article tells the
story -- a small amount of PCI/AGP bandwidth is used for sending
commands to the GPU, and the GPU's huge bandwidth to the display
memory is exploited to actually do rendering and drawing.
The article (starting on page 14) is here:
The linked-to graphic that shows the Quartz components' use of the
GPU (across various MacOS releases) is here:
All this arguing about "just gimme a flat frame buffer and I'll
twiddle the bits myself" flies in the face of modern high-performance
graphics hardware. While it certainly was fashionable to whack
directly on framebuffers in 1988, it is just not a sane way of
extracting good performance. Anytime one can exploit a graphics
accelerator that has more-or-less direct access to the framebuffer
memory, it is almost certainly going to be a performance win. The
problem arises in a cross-platform system like NetBSD that wants
to support the full range of hardware that crosses multiple
generational boundaries -- older hardware may just be a dumb
framebuffer (e.g. cg3) and requires a full software implementation
to do anything. Newer hardware (e.g. cg6) might support some
acceleration features (line drawing, polygon fill, etc.) and that
helps a lot. Semi-modern hardware (e.g. FFB) can do all that and
much, much more.
In the Apple world, QuickDraw was the original high-level interface:
you could tell it to draw this line, fill that rectangle, etc., and
it would magically make it happen on your display hardware -- most
often by having the main CPU twiddle the bits and then update the
framebuffer. It is now deprecated in OS X, and someday (in the
distant future) will be removed.
NetBSD's various hardware drivers have a graphics model that is
far, far below the functionality of even Quickdraw. Part of this,
of course, is due to a need to support low-level graphics acceleration
on old hardware (console text acceleration), and part of this is
due to supporting X11 as the "portable" bitmap display standard.
Anyhow, I don't have a solution, just some pointers to other data
points for others to think about.