tech-toolchain archive


Re: make -V default behavior change



On Tue, Jun 20, 2017 at 08:01:22AM +0700, Robert Elz wrote:
 > First, this whole discussion was much too quick - it started at 15:11 UTC
 > and seems to have been completed with a resolution at 20:00 UTC (all Jun 19)
 > that's less than 5 hours, beginning to end.
 > 
 > That's too fast - any kind of rational technical discussion needs more
 > time than that, time to think, to experiment, and to consider alternatives
 > (and probably more than 2 people involved.)
 > 
 > Normally, a minimum of several days (at least) - sometimes several weeks,
 > before any conclusion is drawn from the discussions should be allowed.
 > 5 hours is simply absurd.

This issue has come up before, so some people have probably already done
some of that thinking and experimenting in the past.

 > To me, the whole thing is woefully underspecified.  From make(1) -
 > about -V: [...]

I've committed some fixes to the man page. Among other things, it seems
the .MAKE.EXPAND_VARS hack was never properly documented: the man page
mentioned it but didn't say what it does.

 > Eg: I see in <bsd.own.mk> the following fragment 
 > ...
 > 
 > 	.for _t in CC CPP CXX FC OBJC
 > 	ACTIVE_${_t}=   ${AVAILABLE_COMPILER:@.c.@ ${
 > 		!defined(UNSUPPORTED_COMPILER.${.c.}) &&
 > 		 defined(TOOL_${_t}.${.c.}) :? ${.c.} : }@:[1]}
 > 	SUPPORTED_${_t}=${AVAILABLE_COMPILER:Nfalse:@.c.@
 > 		 ${ !defined(UNSUPPORTED_COMPILER.${.c.}) &&
 > 		 defined(TOOL_${_t}-.${.c.}) :? ${.c.} : }@}
 > 	.endfor

*gag*

The :@.c.@...@ loop-modifier syntax should really not be allowed... the
behavior of ordinary .for loops is confusing enough.
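
(For anyone who hasn't run into it: :@var@...@ expands its body once per
word of the variable, with var bound to each word in turn. A minimal
sketch, with made-up variable names:

	SRCS=	a.c b.c c.c
	# body evaluated once per word of SRCS; yields "a.o b.o c.o"
	OBJS=	${SRCS:@.f.@${.f.:R}.o@}

The fragment quoted above additionally nests a ${ cond :? value : }
conditional inside the loop body and then picks the first word with :[1],
which is where it really stops being readable.)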

(what it means, modulo some bugs in make, is:

	.for _t in CC CPP CXX FC OBJC

	ACTIVE_${_t}=
	.for .c. in ${AVAILABLE_COMPILER}
	.if !defined(UNSUPPORTED_COMPILER.${.c.}) && defined(TOOL_${_t}.${.c.})
	ACTIVE_${_t}+=${.c.}
	.endif
	.endfor
	ACTIVE_${_t}:=${ACTIVE_${_t}:[1]}

	SUPPORTED_${_t}=
	.for .c. in ${AVAILABLE_COMPILER:Nfalse}
	.if !defined(UNSUPPORTED_COMPILER.${.c.}) && defined(TOOL_${_t}-.${.c.})
	SUPPORTED_${_t}+=${.c.}
	.endif
	.endfor

	.endfor # _t

...and after unrolling it I have my doubts about the '-' after the
TOOL_${_t} bit in the last part.)

 > If given that (and that bsd.own.mk is included), I do
 > 
 > 	make -V _t
 > 		   (or make -v _t or make -dV -V _t or make -V \_t ...)
 > 
 > what do I expect to see?

Nothing, because, as the man page did already say, values are extracted
from the global context, and _t is a loop variable.

 > Does anything even make sense?   And if not for this variable, why for
 > any others?  And how do I know which ones I can look at, and which not?
 > If I ask for .ALLSRC or .TARGET what do I get?

Again, nothing, because those aren't in the global context.

 > And how do I work out
 > from the documentation that whatever happens is correct?

You didn't, but hopefully it's better now.

 > Then the man page goes on to say (still about -V):
 > 
 > 	Do not build any targets.
 > 
 > which is also perplexing - if we are not building anything, how can we
 > know that we have the correct value of a variable, particularly one that
 > is set from the results of running some external command
 > 
 > 	VAR!=...
 > 
 > where the value returned may differ depending upon what make has already
 > produced.

We don't. If you do that kind of thing, though, you're already in pretty
deep water regardless of what happens with -V.
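
To make that concrete, a minimal sketch with a made-up variable: '!='
runs its command at parse time, so even a plain "make -V OBJLIST" runs
the ls, and the answer depends on whatever happens to be lying around in
the directory at that moment.

	OBJLIST!=	ls *.o 2>/dev/null || true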

 > All of this suggests to me that the whole notion of getting "the value of
 > a variable" is hopelessly broken, and that part of the reason for the
 > disagreement, is that there is no common ground as to what is really
 > needed, or why.

Sure.

 > In cases where the objective is to extract some data to use (ie:
 > not debugging the Makefiles) that is, have make do something and
 > obtain a result, the mechanism looks simple to me: that's what make
 > targets are for - you build some target, and the result of that is
 > that something happens - if that "something" happens to be just
 > some text string being written to stdout, rather than new/modified
 > files appearing then that is fine.  Doing it this way means that
 > the value produced can be exactly the value that is meaningful, as
 > it is designed by whoever wants/needs the value, rather than the
 > coders of make, who cannot possibly have any idea what is useful or
 > needed for the purpose in mind.

Right; the purpose of -V is to inspect behavior of a makefile without
changing it. It's not necessarily for the purposes of debugging the
makefile; more often it's for the purposes of asking what the
makefile's going to do when invoked. In that sense it's like the -n
option but addressing a different set of questions.
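
To illustrate the difference (a sketch, with made-up names): -V asks make
what a variable currently expands to without building anything, while a
print target lets the makefile author decide exactly what gets written
and when.

	CFLAGS=		-O2 -Wall

	show-cflags: .PHONY
		@echo ${CFLAGS}

	# "make -V CFLAGS"    prints the expanded value, builds nothing
	# "make show-cflags"  runs the target's commands to print it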

 > For debugging, "the value" of a variable seems like a particularly
 > useless feature, and is only tolerated as it is all that currently
 > exists.  What is really wanted there is a way to trace the
 > successive values of the variable, and where they come from, so
 > every time it is changed, the developer can see it happening.  So,
 > an option that tagged a variable to be traced when set would be
 > useful (the output needs to include the new value assigned, and the
 > line # and filename of the Makefile (or fragment) that is making
 > that change happen, or the external source -- command line,
 > environment, MAKEFLAGS, ....)  Request that, in conjunction with
 > making some appropriate target (for cases where all that is wanted
 > is to see what happens during make initialisation, all that is
 > needed is a dummy target that depends upon nothing and makes
 > nothing) and it should be possible to easily observe what is
 > happening to the variable, and why, and hence, find any error in
 > the Makefile much more easily.

We have -dv, but it prints all updates to all variables and (because
sys.mk always assigns a lot of things during startup) produces a lot
of noise.

It would probably be useful to have, say, "make -dv -V FOO -V BAR" log
only the changes to FOO and BAR, but that will take some work to
implement.

 > Of course all this is complicated by make's somewhat strange concepts
 > of when the actual value of a variable is produced (hence the VAR= and
 > VAR:= difference, and that it is possible to have a variable set to an
 > "unexpanded" value which mysteriously simply gets expanded when used,
 > rather than when set.)

It's not that strange. The expansion of a macro (contents of a
variable) isn't evaluated until it's used. In fact, it's the same as
cpp:

#define FOO foo
#define BAR FOO
#undef FOO
#define FOO moo
BAR

prints "moo".

 > You can stop reading here, beyond is just a rant...  (but what is
 > below might explain why I will not be implementing any of what I
 > just suggested...)

Sure :-)

 > Aside from that, everything (and I think I mean *everything*, though I may
 > have forgotten some other reasonable change) that has been done to make over
 > the years has been a mistake.

I don't really agree. A lot of things that have been done to make over
the years have been mistakes, granted. In particular there's been a
repeating pattern of "oops I need X so let's hack something quick in
to do X". Now we have forty-odd years of accumulated quick hacks, all
of which have to be retained for compatibility with existing practice
and very few of which are principled, general, compositional, or
useful outside their original context.

That said, though, I think it's quite reasonable to consider make's
variable language a macro preprocessor. At that point it's a language,
and it's clearly desirable for it to be a good language, which means
simple, general, and orthogonal functionality. Or something along those
lines.

The problem is not that make has for loops and that they're
specifically a textual expansion facility; the problem is that they're
lacking a few things that ought to go with them, like loop-local
scratch variables.
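
For instance (a sketch with made-up variable names): any scratch
assignment inside a .for body is an ordinary global assignment, so it
leaks out of the loop, and because plain '=' defers expansion you end up
needing ':=' contortions to make it behave at all.

	RESULT=
	.for C in gcc clang pcc
	_tmp:=	${C:tu}			# "scratch", but really a global
	RESULT:=${RESULT} ${_tmp}	# must force expansion now, or every
					# entry would later expand to the
					# final value of _tmp
	.endfor
	# _tmp is still defined here (as "PCC"), which is exactly the problem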

However, some things just end up obfuscated; you can write bad code in
any language. The right way to write the mess you quoted would be

	.for C in ${AVAILABLE_COMPILERS}
	.if !defined(UNSUPPORTED_COMPILER.$C)
	SUPPORTED_COMPILERS+=$C
	.endif
	.endfor

	.for T in CC CPP CXX FC OBJC
	.for C in ${SUPPORTED_COMPILERS}
	SUPPORTED_$T+=${TOOL_$T.$C}
	.endfor # C
	ACTIVE_$T=${SUPPORTED_$T:[1]:S/^$/false/}
	.endfor # T

...and I was going to say there are a couple of things in there that
don't work, but there aren't; this form works fine.

Over the years people keep trying to reinvent make and failing, and
the primary common thread among all these attempts is the curious
notion that make is too powerful and their new thing should be less
expressive. Then of course they end up with a tool that can't do what
people need, so it never goes anywhere.

 > I am no fan of the auto* tools, and would never suggest any of them
 > as the right solution to anything, but one thing that they do which
 > is absolutely the right way, is to build a makefile from a
 > specification, and some programming to work out what should go in
 > it.  The auto* tools don't try to work out what is out of date, or
 > what order is the right order to build things, that is make's task.
 > And when done this way, make does not have to try to work out what
 > is the compiler to use, or what options it takes, ...  This is a
 > good division of labour, and is the correct model.  The details of
 > how they do it, and the specification used might all be brain dead
 > in this particular case, but the model is a good one.

This is how ninja works, FWIW.

Turning make's macro preprocessor into a separate program would have
various advantages (don't think I haven't thought of this) but it gets
stuck on per-target variables. Admittedly, most of these exist because
of target generation schemes that are *not* textual, and mostly
halfassed... but not quite all.
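
One concrete instance (a sketch): the built-in per-target locals like
.TARGET and .IMPSRC only get their values when the graph is walked, one
target at a time, so a purely textual preprocessing pass over the
makefile has nothing to substitute for them.

	.SUFFIXES: .c .o
	.c.o:
		${CC} ${CFLAGS} -c ${.IMPSRC} -o ${.TARGET}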

 > For example, before make existed, programs were built from shell scripts,
 > that simply compiled everything, every time.   Clearly wasteful.

Except when not :-)  I'm not old enough to have seen it myself, but I've
heard stories from the sun3 era of "cc *.c" taking a few minutes and
"make" taking overnight... because there were a lot of small source
files, and, I guess, the combination of poor graph algorithms in the make
of that era and the extra memory footprint of make being swapped in and
out.

-- 
David A. Holland
dholland@netbsd.org

