Subject: Re: Style guide
To: D'Arcy J.M. Cain <darcy@druid.net>
From: Ask Dr. Stupid <greywolf@starwolf.starwolf.com>
List: current-users
Date: 05/28/1997 18:56:41
D'Arcy J.M. Cain sez:
/*
* Oh, just to add more fuel to the fire, I also have a problem with the
* use of (void) in front of every function call that discards its return
* value. Does anyone have a good argument for keeping this? It really
* bloats code for no reason IMO.
*/
[NOTE: Before you continue, please keep in mind that I am not deliberately
attempting to be rude or to hurt feelings, step on toes or be a general
a*hole. What I *am* attempting to do is shed some light on some issues
which have hitherto been blown off. I have seen handwaving almost to the
complete exclusion of explanation, and much of what explanation there was
has been handwaving as well. The ratio of rational to irrational argument
has been substantially less than 1:1.]
To borrow a THNDism:
"Wall?"
Pardon me if I seem crass or rude here, but good god(desse)s,
_what_ *are* you _thinking_, man?
It's perfectly conforming ANSI C to cast a call to (void) when ignoring
its return value; in fact, gcc seems to demand it, and lint will certainly
complain if the cast is missing!
Let's take close(2), for example.
There are times when I want to use the return value from close(),
and there are times when I don't. It would be inconsistent to re-declare
close() as void close(), and it generates a warning to use
close(fd);
without assigning it to something. Now, tell me, what's more code bloat?
    A.  (void) close(fd);

    B.  int rv;
        ...
        rv = close(fd);
I'd say B., personally; I see no rhyme or reason in forcing the allocation
of a variable just to hold a return value I want to ignore anyway. What's
the point? The cast notation was introduced first and the void type came
later; it's only natural that you'd combine the two.
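To make that concrete, here's a minimal sketch of both styles (flushlog()
and dropfd() are made-up names, and the error handling is deliberately
thin):

#include <stdio.h>
#include <unistd.h>

int
flushlog(fd)
	int fd;
{
	/* Here the result matters: close() can report a deferred write
	 * error, so check it and pass the bad news along. */
	if (close(fd) == -1) {
		perror("close");
		return -1;
	}
	return 0;
}

void
dropfd(fd)
	int fd;
{
	/* Here it doesn't: the (void) cast tells lint (and the next
	 * reader) that the return value is being discarded on purpose,
	 * without dragging in a throwaway variable. */
	(void) close(fd);
}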
Regarding type promotion et al.: why doesn't ANSI just specify that an
old-style definition is correct if its parameters use the promoted form of
the types given in the prototype declaration? It would make life a whole
lot simpler.
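For reference, a rough sketch of what C89 actually requires (halve() is a
made-up name): as I read the standard, the compatibility rule puts the
promoted type in the prototype rather than in the old-style definition,
which is more or less the opposite of the convenience asked for above.

/* Because the old-style definition declares n as short, the default
 * argument promotions turn the argument into an int at any call site,
 * so the prototype has to say int; writing "int halve(short)" here
 * would make the two declarations incompatible. */
int halve(int);

int
halve(n)
	short n;
{
	return n / 2;
}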
K&R isn't going to go away; I certainly don't prototype as I write, because
it's a pain in the ass _for me_ (note opinionated stance). If I want to
release something, I'll go back and whack it up to be ANSI, and then
everyone's happy. But it's easier and faster for me to write it in K&R
first.
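For anyone who hasn't had to do that conversion, here's roughly what the
K&R-first, ANSIfy-later dance looks like for a single function
(skipblanks() is a made-up name), with the usual __STDC__ test deciding
which form the compiler sees:

#ifdef __STDC__
/* the ANSIfied form that goes out the door */
char *
skipblanks(char *s)
{
	while (*s == ' ' || *s == '\t')
		s++;
	return s;
}
#else
/* the K&R form it gets written in first */
char *
skipblanks(s)
	char *s;
{
	while (*s == ' ' || *s == '\t')
		s++;
	return s;
}
#endif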
__P()/__STDC__ isn't going to go away, either, for two very big reasons:
1. K&R/ANSI-compatible compilations are _everywhere_; because
2. _NOT_ _EVERYBODY_ _HAS_ (_ACCESS_ _TO_) _AN_ _ANSI_ _C_ _COMPILER_!
Quite a few folks have pointed this out already. What about this statement
do the ANSI-only extremists (pardon the term; I don't know how else to put
it) _not_ understand?
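For anyone who hasn't bumped into it, this is roughly how the __P() dance
in <sys/cdefs.h> works (paraphrased from memory; dumplog() is a made-up
name):

#if defined(__STDC__) || defined(__cplusplus)
#define	__P(protos)	protos		/* ANSI C: keep the prototype */
#else
#define	__P(protos)	()		/* traditional C: throw it away */
#endif

/* Declarations are written once, with doubled parentheses, and are
 * legal under either flavor of compiler: */
int	dumplog __P((char *, int));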
This has gone beyond the religious and overstepped the bounds of
practicality. You don't seriously expect every bit of useful code
to be ANSIfied overnight, now, do you?
--*greywolf;
SIDE TANGENT:
BTW, regarding Sun's compilers: I've not only seen cases where they refuse
to produce correct code, I've also seen a couple of cases where they refuse
to accept code which (supposedly) conforms to their own implementation,
usually at the preprocessor level.
--*greywolf;
--
Support Open Operating Systems -- subvert the Microsoft paradigm.