Subject: Re: C Language Standard(s)
To: None <current-users@NetBSD.ORG>
From: der Mouse <mouse@Collatz.McRCIM.McGill.EDU>
List: current-users
Date: 01/10/1996 03:06:53
>> The K&R style definition breaks things.
> No, the incorrect prototype breaks things.
> Or [...] the programmer broke things
Or the compiler broke things, by not implementing gcc's
prototype-overrides-definition semantics. Depending on how you want to
look at it, any of the pieces can be thought of as breaking things. I
don't see much point in arguing over which, though; just note that
"int foo(short x);" is inconsistent with "int foo(x) short x; {...}"
and leave it at that.
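To spell that out (same made-up foo as above), here's a minimal
sketch of the pairing that actually is consistent:

    int foo(int x);             /* the prototype that really matches */

    /* K&R definition: for the purpose of matching a prototype, the
       default argument promotions apply to x, so the matching
       prototype says int, not short; that is why pairing this
       definition with "int foo(short x);" is the inconsistency. */
    int foo(x)
        short x;
    {
        return x;
    }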
> by thinking
> int foo(x) short x; { ... }
> meant anything.
It does mean something; it has reasonably well-defined semantics. They
might not quite obey the principle of least surprise, but they're
there.
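Spelled out: with no prototype in scope, the caller promotes a short
argument to int, and the definition converts it back on entry,
roughly as if it had been written like this (xarg is just an
illustrative name):

    int foo(xarg)
        int xarg;               /* the argument arrives promoted to int */
    {
        short x = xarg;         /* ...and is narrowed back on entry */

        return x;
    }

The surprise comes when a prototype taking a bare short is also in
scope and the caller stops doing the promotion.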
> [...] I don't recall the last time I wrote a function that took a
> sub-int as an argument.
Are you sure you'd realize it? If you wrote anything using comp_t,
dev_t, uid_t, or gid_t, well, those are sub-int on my NeXT at home. On
the Suns at work, those plus mode_t, nlink_t, and wchar_t. On
NetBSD/sun3, I didn't find any *_t types that were really shorts.
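If you want to check a given system, a throwaway test program along
these lines (made up on the spot, obviously) will tell you:

    #include <sys/types.h>
    #include <stdio.h>

    int main(void)
    {
        /* compare the widths of a few suspect typedefs with int */
        printf("uid_t %d  gid_t %d  dev_t %d  int %d\n",
               (int)sizeof(uid_t), (int)sizeof(gid_t),
               (int)sizeof(dev_t), (int)sizeof(int));
        return 0;
    }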
> Thus for me old-style definitions lose me nothing - my code always
> behaves correctly. And I get maximum portability of my code.
For maximum portability, you must be using the *_t types...and if you
ever pass them directly, you're using sub-int types without realizing
it, and worse, you're doing so on only some platforms.
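For instance (check_owner is made up, but the shape is what I mean):

    #include <sys/types.h>

    int check_owner(uid_t who); /* prototype, as a header would have it */

    /* K&R definition: 'who' gets the default argument promotions, so
       this pair is consistent where uid_t is int-sized or wider and
       quietly inconsistent where uid_t is a sub-int type. */
    int check_owner(who)
        uid_t who;
    {
        return who == 0;
    }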
> Not if you fill the code base with
> char foo(short x) {...}
> when you convert that sort of thing back to K&R you will have a _lot_
> of work to do to debug it.
Um, you will in any case, and gcc has warning options (-Wconversion) to
help find danger spots.
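The kind of spot I mean looks like this (foo as above; whether your
gcc flags this exact case may depend on the version, so take it as a
sketch):

    int foo(short x);           /* new-style prototype */

    int bar(void)
    {
        /* with the prototype in scope, 12345 is narrowed to short;
           without a prototype it would have been passed as an int,
           which is exactly the sort of difference -Wconversion is
           there to point out */
        return foo(12345);
    }

    int foo(short x)            /* matching new-style definition */
    {
        return x;
    }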
der Mouse
mouse@collatz.mcrcim.mcgill.edu