Subject: Re: bin/7592: programs' error handle broken by good intentions
To: Greg A. Woods <woods@most.weird.com>
From: Olaf Seibert <rhialto@polder.ubc.kun.nl>
List: netbsd-bugs
Date: 05/26/1999 23:02:56
Since we're discussing "traditional C", all references will be to
Kernighan & Ritchie, 1st edition.

On Wed, 26 May 1999, Greg A. Woods wrote:

> In traditional C the expression "(bufp == NULL)" was covered by "the
> usual arithmetic conversions"

Not true. The "usual arithmetic conversions" are described on p.41,
and pointers are not mentioned there. Which is just as well, since a
pointer is not an arithmetic type.
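
What makes "(bufp == NULL)" legal is the special rule for comparing a
pointer against a constant 0, not any arithmetic conversion. A minimal
sketch (taking bufp to be a char *, as in your example):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        char *bufp = malloc(16);

        /* The 0 behind NULL is a null pointer constant: the comparison
         * converts it to (char *)0 under the special rule for pointer
         * comparisons, not under the "usual arithmetic conversions",
         * which apply to arithmetic types only. */
        if (bufp == NULL) {
            fprintf(stderr, "malloc failed\n");
            return 1;
        }
        free(bufp);
        return 0;
    }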

> i.e. if "NULL" is defined as a plain '0'
> (as it always was -- NULL was just a more meaningful name for zero) and
> "bufp" is declared as something somewhat different and perhaps wider
> than an ordinary "int", such as a pointer to a "char", then the '0'
> would be promoted to be the same width as that of the type of 'bufp',
> and indeed pointers were merely integers (though perhaps with a unique
> width), and any "null pointer" had the integer (albeit usually
> wider than an "int") value of zero.  There was literally no difference
> between pointers and "long"s w.r.t. the type conversions to/from "int"s
> in assignment or equality operands, or function parameters.

Not true. See p.102, section 5.6, conveniently titled "Pointers are not
Integers". 

> Everything
> was very clean and elegant.

[...]
> So, it appears that (at least in c9x) you are right:  the integer value
> of 0 will indeed be treated specially when assigned to a pointer or
> passed as a pointer value, even though this was not necessary in
> traditional C.

This was already the case with "traditional C", see the top of p.98,
where NULL is explained:

    In general, integers cannot meaningfully be assigned to pointers;
    zero is a special case.

(Appendix A, page 192 mentions that this holds only for constant
expressions with value 0)
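
Both halves of that rule in a sketch (a conforming compiler must
accept the first assignment and diagnose the other two):

    void example(void)
    {
        char *p;
        int zero = 0;

        p = 0;      /* fine: a constant expression with value 0
                       converts to a null char * */
        p = 1;      /* constraint violation: needs an explicit cast */
        p = zero;   /* also a violation: the value is 0, but it is
                       not a constant expression (cf. p.192) */
    }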

> This is unfortunate because it's still possible to call a function
> without a prototype declaration in scope (i.e. within the definition of
> the language), and passing zero as a parameter to such a function
> expecting a pointer will potentially cause problems because this
> automatic cast which might in fact widen the representation of zero to
> something of greater rank than an ordinary 'int' will be missing (as it
> most certainly will on many architectures if the same code is compiled
> with an older non-Standard compiler, and as I say even c9x will still
> permit traditional function definitions).  Standard C has apparently
> sabotaged the ability of a standard compiler to warn of failed
> backwards compatibility.  It also means it is impossible to give any

Not true. A Standard C compiler can warn about anything it likes. (No
reference to K&R1 here, obviously.)
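
Still, the hazard in the quoted paragraph is real wherever no prototype
covers the argument, the classic case being the trailing arguments of a
variadic call. A sketch, assuming a machine where pointers are wider
than an int:

    #include <unistd.h>

    void run_shell(void)
    {
        /* Wrong: execl()'s "..." gives the compiler nothing to
         * convert to, so a bare 0 is passed as a plain int and may
         * be too narrow to be read back as a char *:
         *
         *     execl("/bin/sh", "sh", 0);
         */

        /* Right: the cast supplies a full-width null pointer. */
        execl("/bin/sh", "sh", (char *)0);
    }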

> non-zero integer values special meaning in relation to pointers (such as
> -1 or ~0) and expect them to behave the same as zero does in assignments
> and function parameters where pointer values are expected.

Huh? I don't think you mean what you write here - why would you want
p = -1 to behave the same as p = 0?
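
If the intent is sentinel values, existing practice already shows how
they live without the special treatment: sbrk(), for one, signals
failure with a cast -1 rather than with 0. A sketch:

    #include <unistd.h>

    void grow(void)
    {
        void *brkp = sbrk(4096);

        /* -1 gets no automatic conversion; the sentinel must be
         * written with an explicit cast, and that works fine. */
        if (brkp == (void *)-1)
            return;
    }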

[...]
> be made of "a null pointer constant" in the first place (in order to
> permit at least one misguided compiler vendor to get away with "#define
> NULL ((void *)0)", in case anyone doesn't remember any of the debates of
> the original ANSI C standardisation effort).  They may as well have just

I agree that #define NULL ((void *)0) is not a good idea, since that is
a null object pointer and cannot be converted into a null function
pointer.
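
A sketch of the divide (assuming a C89 compiler): object and function
pointers each accept a null from the constant 0, but a void * value
does not cross over to the function-pointer side:

    void example(void)
    {
        void *op = 0;           /* null object pointer   */
        void (*fp)(void) = 0;   /* null function pointer */

        /* fp = op; */          /* constraint violation: a void *
                                   value does not convert to a
                                   function pointer type */
        (void)op;
        (void)fp;
    }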

> made NULL a keyword and been done with any formal association of null
> pointers with the integer value of zero!

> C is dead -- long live C!

The C you want is very much alive in my V6 installation. I suggest you
download one too, and a PDP-11 emulator.

-Olaf.
--
___ Olaf 'Rhialto' Seibert - rhialto@polder.ubc. ---- Unauthorized duplication,
\X/ .kun.nl ---- while sometimes necessary, is never as good as the real thing.