Subject: Re: lib/35401
To: <>
From: David Laight <david@l8s.co.uk>
List: netbsd-bugs
Date: 01/11/2007 20:20:02
On Thu, Jan 11, 2007 at 08:34:03PM +0100, Christian Biere wrote:
> > 
> > I seriously doubt that we'll see saturation arithmetic in the C integral
> > types, but your point is valid.

I have seen it - on the C compiler for the StarCore DSP.
It is a right PITA!

> Even if not, a compiler could potentially just compile the check away due to
> optimization. I think there have been similar changes to GCC recently with
> respect to pointer arithmetic [1] i.e., checks relying on wrap around just
> disappear. There's also a related discussion regarding integer overflows on
> the GCC mailing list:

IMHO some of the recent changes in the way gcc interprets the C
standard add optimisations that break existing code (and some of the
assumptions that system software engineers ought to be able to rely
on).  The standard only permits those optimisations so that some
non-standard architectures can support C - even though only a limited
amount of specialised code would ever run on them.

In cases like the above, the test would only have been written if the
writer thought its result might vary - so optimising such tests away is
very unlikely to improve the performance of any code; it just breaks
code.
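
To make that concrete, here is a made-up example (not code from this
PR) of the kind of check that can silently disappear:

	#include <limits.h>

	/*
	 * Intended to detect overflow after the fact.  Signed overflow
	 * is undefined behaviour, so gcc may assume "a + b" cannot wrap
	 * and fold the test to "b < 0" - the check the author thought
	 * he had written is gone.
	 */
	int
	add_overflows_bad(int a, int b)
	{
		return a + b < a;
	}

	/*
	 * Comparing against the limits before adding relies on no
	 * undefined behaviour, so it cannot be optimised away.
	 */
	int
	add_overflows_ok(int a, int b)
	{
		return b > 0 ? a > INT_MAX - b : a < INT_MIN - b;
	}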

I had a problem many years ago with a compiler that assumed the
address of a code/data symbol could not be zero (aka NULL), and
optimised away any test for it.  Unfortunately the linker supported
weak symbols, which need not be fixed up at run time, so it became
rather difficult to determine whether a weak symbol had a non-zero
address....
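
In today's gcc syntax the situation looked roughly like this (the
symbol name is invented):

	/*
	 * A weak reference need not be resolved at link time, in which
	 * case its address is zero.
	 */
	extern int optional_feature __attribute__((weak));

	int
	feature_present(void)
	{
		/*
		 * The compiler in question assumed that no symbol could
		 * live at address zero and folded this test to "always
		 * true", so there was no way left to ask whether the
		 * weak symbol was actually there.
		 */
		return &optional_feature != 0;
	}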

	David

-- 
David Laight: david@l8s.co.uk