Subject: Re: C Language Standard(s)
To: None <current-users@NetBSD.ORG>
From: der Mouse <mouse@Collatz.McRCIM.McGill.EDU>
List: current-users
Date: 01/08/1996 10:56:54
>> I don't quite understand why we need pre-ANSI C for bootstrapping
> The current kernel code is _not_ pre ANSI.

It's pre-ANSI in that it uses only the restricted subset of ANSI C
that is also compatible with pre-ANSI C.

I'd say it _is_ pre-ANSI.  What it isn't is _non_-ANSI.
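
For instance (a toy sketch of mine, not anything out of the tree),
this definition is accepted by both dialects:

    /*
     * Old-style ("K&R") definition: the only form a pre-ANSI
     * compiler understands, and still legal ANSI C through the
     * standard's grandfather clause for old-style definitions.
     */
    int
    add(a, b)
            int a;
            int b;
    {
            return (a + b);
    }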

> What does anyone think it is costing - using the old style
> definitions?  Why change something that is not broken?

Personally?  I'd say it's costing debugging time.  When I started
routinely using -Wstrict-prototypes and -Wmissing-prototypes for all
my new code, a whole class of bugs simply stopped happening; and when
I recoded old programs in the new style, I found at least one
long-standing and extremely irritating bug I had otherwise been
unable to track down.

Now, the kernel isn't yet at the point where it can use those two
options.  But as long as we stay away from full prototypes, it never
will be.
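
To illustrate the bug class (a contrived example, mine, not from the
tree): without a prototype in scope the compiler has nothing to check
a call against, so this compiles silently and the callee reads
garbage on most calling conventions.

    /* half.c */
    int
    half(n)
            int n;
    {
            return (n / 2);
    }

    /* caller.c: no prototype for half() in scope */
    extern int half();      /* old-style declaration, no arg checking */

    int
    caller()
    {
            /*
             * Passes a double where an int is expected.  With
             * "int half(int);" in scope this would be a silent
             * conversion; with -Wstrict-prototypes and
             * -Wmissing-prototypes, both the declaration above
             * and the definition in half.c get flagged.
             */
            return (half(10.0));
    }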

> The only code where new vs old style definitions makes a difference
> is code which uses stdargs and functions that want to pass sub-int
> types.

That's true at present, on most architectures, but it is not
necessarily so.  I think the calling-compatibility rules are such
that, in the presence of prototypes, the compiler may choose to tune
the argument-passing method to the particular argument pattern under
consideration, latitude it doesn't have for code that depends on the
old-style-compatibility grandfather clauses.
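
Sub-int types show the divergence concretely even today (made-up
names, of course):

    /*
     * Old style: the "char c" notwithstanding, the default
     * argument promotions mean the caller really passes an int
     * and the callee narrows it back down to char.
     */
    void
    putbyte_old(c)
            char c;
    {
            (void)c;
    }

    /*
     * Prototyped: the compiler may pass a genuine char however
     * suits it, in a byte register, say.  Declaring the old-style
     * definition above with a "void putbyte_old(char)" prototype
     * is exactly the undefined combination.
     */
    void
    putbyte_new(char c)
    {
            (void)c;
    }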

Unless, of course, you're talking about the gcc language instead of
the C language, in which case you can use old-style definitions
provided you have a modern prototype in scope.  But I thought the
whole point of this discussion was the potential for compiling under
non-gcc compilers; if we're assuming gcc, we might as well just stop
blabbering and write prototypes.
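
For the record, the gcc-dialect compromise looks like this
(hypothetical names):

    /* Modern prototype in scope... */
    int xfer(char *src, char *dst, int len);

    /*
     * ...old-style definition.  gcc checks callers against the
     * prototype anyway.  A strictly pre-ANSI compiler would choke
     * on the prototype itself, and the pairing is well-defined
     * only when the old-style parameter types match the prototype
     * after promotion, as they do here.
     */
    int
    xfer(src, dst, len)
            char *src;
            char *dst;
            int len;
    {
            (void)src;
            (void)dst;
            return (len);
    }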

					der Mouse

			    mouse@collatz.mcrcim.mcgill.edu