Source-Changes-D archive
Re: CVS commit: src/sys/sys
> Module Name: src
> Committed By: rillig
> Date: Fri Apr 4 20:52:32 UTC 2025
>
> Modified Files:
> src/sys/sys: cdefs.h
>
> Log Message:
> sys/cdefs.h: fix __predict_true and __predict_false for lint
>
> -#define __predict_true(exp) __builtin_expect((exp) ? 1 : 0, 1)
> -#define __predict_false(exp) __builtin_expect((exp) ? 1 : 0, 0)
> +#define __predict_true(exp) __builtin_expect(/*CONSTCOND*/(exp) ? 1 : 0, 1)
> +#define __predict_false(exp) __builtin_expect(/*CONSTCOND*/(exp) ? 1 : 0, 0)
This seems wrong to me. Why should we tell lint that _every_ input to
__predict_true/false is constant?
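For context: a /*CONSTCOND*/ comment in front of a condition tells lint
to skip its "constant in conditional context" check for that condition,
so placing the comment inside the macro body switches the check off for
every caller's expression.  A small sketch of the consequence (the *_new
macro and handle() are made up for illustration, not actual kernel code):

/* Old definition: lint still checks whatever 'exp' expands to. */
#define __predict_false(exp)	__builtin_expect((exp) ? 1 : 0, 0)

/* New definition: the annotation now covers every caller's condition. */
#define __predict_false_new(exp) \
	__builtin_expect(/*CONSTCOND*/(exp) ? 1 : 0, 0)

int
handle(int error)
{
	if (__predict_false(error != 0))	/* real run-time condition */
		return -1;

	/*
	 * A constant slipped in by mistake: with the old macro lint can
	 * flag it, with the new one the warning is silenced here too.
	 */
	if (__predict_false_new(sizeof(error) > 8))
		return -2;

	return 0;
}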
What lint objected to in this case is that ALIGNED_POINTER(...) expands
to the constant 1 on x86.  But it is not a constant on all architectures.
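Roughly, the two definitions involved look like this (paraphrased from
sys/param.h and x86's machine/param.h; the _MI/_X86 suffixes and
fetch_word() exist only for the sketch):

#include <stdint.h>

/* MI fallback: a genuine run-time alignment test. */
#define ALIGNED_POINTER_MI(p, t) \
	((((uintptr_t)(p)) & (sizeof(t) - 1)) == 0)

/*
 * x86 override: unaligned access is fine there, so the test collapses
 * to the constant 1, which is the constant condition lint flagged
 * inside __predict_true().
 */
#define ALIGNED_POINTER_X86(p, t)	1

#define __predict_true(exp)	__builtin_expect((exp) ? 1 : 0, 1)

int
fetch_word(const void *p, uint32_t *out)
{
	/*
	 * With the MI definition this is a real run-time check; with the
	 * x86 definition the condition is the constant 1.
	 */
	if (__predict_true(ALIGNED_POINTER_MI(p, uint32_t))) {
		*out = *(const uint32_t *)p;
		return 0;
	}
	return -1;
}

So the constant condition is a property of the x86 macro, not of
__predict_true() itself.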