Subject: Re: 'unusual' resolutions on X
To: None <current-users@netbsd.org>
From: Martijn van Buul <pino@dohd.org>
List: current-users
Date: 11/10/2005 08:37:20
["Followup-To:" header set to gmane.os.netbsd.current.]
> If you also use the DVI input, you get precisely 1 screen pixel for each
> graphics pixel with no illumination of the 'wrong' phosphors due to beam
> focus and/or colour mask misalignment.

I know from my own experience that you really don't need a DVI input for that.
My display at work (a 21" Samsung SyncMaster 213T) works just great off an
analog VGA input. Apparently my employer felt the need to replace my 21" CRT
with a TFT, even though I was perfectly happy with the CRT, yet couldn't
afford to shell out another 40 euro for a budget video card with DVI output.

> Don't believe the 'auto-adjust' for analogue input.  You need to display
> a full width text window with a small font (6 pixel/char width) and
> enter a long line of 'm' characters, then adjust the pixel clock and phase
> in a vague attempt to get every vertical line aligned with a column of
> pixels.

*shrug* Works For Me. I even throw in sub-pixel rendering for the fun and
readability of it. My own TFT panel at home (a meager Samsung 17" one, about
two years old now) doesn't even have a DVI input, and I really don't see any
justification for replacing it - not to mention that I'd have to replace my
KVM switch as well if I did, and that I'd have to buy replacement video
cards for systems that only run in text mode anyway.
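For anyone who wants to try the test pattern quoted above, here's a trivial
sketch that prints a row of 'm' glyphs across the terminal. It's nothing
official - it reads the width from the COLUMNS environment variable if set,
and the fallback of 200 columns is just my guess at a full-width window:

    #include <stdio.h>
    #include <stdlib.h>

    /* Print one full-width row of 'm' glyphs to use as a test pattern
     * while tuning the pixel clock and phase on an analog input.
     * Width comes from $COLUMNS if set, otherwise defaults to 200
     * (an assumption, not anything prescribed above). */
    int main(void)
    {
        const char *cols = getenv("COLUMNS");
        int n = cols ? atoi(cols) : 200;
        int i;

        if (n <= 0)
            n = 200;
        for (i = 0; i < n; i++)
            putchar('m');
        putchar('\n');
        return 0;
    }

Run it in a maximized terminal with a small font and fiddle with the
monitor's clock/phase until every vertical stroke sits on a single pixel
column.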

Mar "DVI considered redundant ;)" tijn

-- 
    Martijn van Buul - pino@dohd.org - http://www.stack.nl/~martijnb/
	 Geek code: G--  - Visit OuterSpace: mud.stack.nl 3333
 The most exciting phrase to hear in science, the one that heralds new
discoveries, is not 'Eureka!' (I found it!) but 'That's funny ...' -- Isaac Asimov