Subject: Re: example for k&r being bad (was: Re: Bluetooth protocol code)
To: None <>
From: der Mouse <mouse@Rodents.Montreal.QC.CA>
List: tech-userlevel
Date: 12/20/2005 00:40:38
>> The problem is that K&R prototypes

Um, when discussing C, "prototype" is a technical term, and it refers
to something K&R doesn't have.  So "K&R prototypes" is an oxymoron.  Do
you mean "K&R declarations", or what?

>> won't work right if we pass in a parameter that's larger than an
>> int.  Like a long or a pointer on an LP64 system.

>> In such a case, the caller will turn the parameter into an int,
>> truncating it.  Worse yet, the caller will then pack other
>> parameters in, not leaving enough space for what the caller will
>> expect the parameter to be.

This is a rather confused description.

In K&R C, you have to declare function formal parameters with types
that, when promoted according to the argument-promotion rules, match
the types of the actual arguments - or you get implementation-specific
(and formally undefined) behaviour.

In particular, declaring a formal argument as a long and then passing a
long works.  Same for pointer types, whether or not pointers are wider
than ints.

What won't work is passing an int where the function expects a long, or
vice versa.  (Unless int and long are really the same on the
implementation in question, of course.)  In this case most
implementations will end up skewing the argument list, somewhat as
sketched above (but even this is not guaranteed; I think doing such
things means it's actually nasal demon time).

Now, of course, our situation is somewhat different.  We don't use a
K&R compiler, and ANSI/ISO C has fairly precise descriptions of what
happens to old-style ("K&R") code.  When there is no prototype
declaration in scope (ie, an old-style declaration, or no declaration
at all), and the definition is an old-style definition, it's
approximately as above.  I think calling functions with no prototype in
scope but with a new-style definition "works right" provided the
function takes a fixed number of arguments and their formal types match
the actual argument types after the default promotion rules are
applied.  Anything else is, I think, undefined or at best
implementation-specific behaviour.

In particular, mixing prototype declarations with old-style definitions
is not permitted in general (I think it's permitted under some
circumstances, but I'm not entirely clear what they are - something
like, if all the formal argument types match the corresponding
prototype argument types, and they are all types that are unchanged
under the default promotion rules).  gcc extends this in a rather
bizarre way, allowing a prototype declaration to override(!) the formal
parameter types specified by an old-style definition - an invitation to
write grossly nonportable code in the name of portability(!!).

> I think it would be useful to have a smallish piece of code that we
> can point people at and say "look, differences between architectures
> if you use K&R"...

#include <stdio.h>

int pr(x,y)
long int x;
int y;
{
 printf("%ld %d\n",x,y);
}

int main(ac,av)
int ac;
char **av;
{
 pr(1,2);
 return(0);
}

Under IL32 or IL64 (actually, when int and long are the same as far as
argument passing goes), this "works"; when long occupies more
argument-list space than int, you get brokenness - probably either
4294967298 or 8589934593 for x (depending on word order), followed by
a "random" value for y that generally comes from uninitialized stack
trash.

Convert pr to "int pr(long int x, int y)" and the misbehaviour
disappears, because the arguments get promoted properly (a new-style
definition also provides a prototype declaration).  Write the call as
pr(1L,2) and you're OK too, even with the old definition, because then
the arguments match the definition.  Making x ordinary int (not long)
and passing 1L produces brokenness in the other direction when
prototypes aren't in use.

/~\ The ASCII				der Mouse
\ / Ribbon Campaign
 X  Against HTML
/ \ Email!	     7D C8 61 52 5D E7 2D 39  4E F1 31 3E E8 B3 27 4B