Subject: Re: road map for new immigrants?
To: Richard Earnshaw <email@example.com>
From: Jonathan Stone <jonathan@DSG.Stanford.EDU>
Date: 09/15/1998 10:27:17
> Also, this can
>rapidly lead to a situation where you can't build something until you have
>it already. A good example of this is the linker which needs libc, but
>rules in share/mk assume that the latest version of ld is installed (they
>use flags which aren't in the 1.3 release version).
This is largely an own goal by the release engineers. The problem
was foreseen, pullups to fix exactly this problem were requested,
and they were rejected out of hand because they weren't "bug fixes".
I agree with the general point, though.
>A proper bootstrap system should (in my opinion):
> Build a minimal set of tools that are needed to compile the rest of
> the tree
> Install them somewhere out of the way
> Use the minimal set to build and install everything else.
> Be able to do everything (except the final install) without needing
> to run as root.
>The build of the minimal set should be very portable (ideally it should
>generate a set of cross-compilation tools as well if necessary). It
>doesn't *HAVE* to use all the most fancy features of netbsd, for example,
>the linked binaries don't have to use shared libraries at this point.
Seconded. In principle.
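The quoted steps can be sketched in a few lines of shell. This is
only an illustration of "install the tools out of the way, then put
them first in PATH": the tool directory is a throwaway temp dir, and
`newtool` is a stand-in for the real make/gcc/etc., not an actual
NetBSD build target.

```shell
#!/bin/sh
set -e

# "Install them somewhere out of the way" -- an unprivileged prefix,
# so no step before the final install needs root.
TOOLDIR=$(mktemp -d)/tools
mkdir -p "$TOOLDIR/bin"

# Stand-in for "build a minimal set of tools": install a dummy tool
# into the out-of-the-way prefix.
cat > "$TOOLDIR/bin/newtool" <<'EOF'
#!/bin/sh
echo "new-tool-version"
EOF
chmod +x "$TOOLDIR/bin/newtool"

# "Use the minimal set to build everything else": the rest of the
# build runs with the sandboxed tools first in PATH.
PATH="$TOOLDIR/bin:$PATH" newtool
```

The point of the PATH prefix is that the tree builds with the new
tools even though the old ones are still installed system-wide.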
If we had a working cross-compilation system, then we should be able
to set up a ``cross-compiler'' built using the installed libc,
compiler, etc., to build the ``cross-compilation'' system targeted at
the current host. That's basically what you're asking for, but we'd
have to add a few extra program-source generators that produce MI
output -- lex, yacc, rpcgen, etc. -- to the "cross-compile" suite.
But the biggest problem in doing this as a "make system" is the
dependencies between the "minimal tools". If the source code of the
latest revision of one of those "minimal tools" depends on having
the most-current version of another of those minimal tools, what do
you do? How do you even encode that in a "build system"?
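One encoding that already exists in the tool list below is tsort(1)
itself: write the inter-tool dependencies as "A B" pairs (A must be
built before B) and let tsort emit a build order. The dependency
pairs here are made up for illustration, not the real graph.

```shell
#!/bin/sh
set -e

# Hypothetical dependency pairs: "make gcc" means make must be
# built before gcc can be built, and so on.
cat > deps.txt <<'EOF'
make gcc
gcc lex
gcc yacc
lex lint
yacc lint
EOF

# tsort prints the tools in an order consistent with the pairs:
# make first, then gcc, then lex/yacc, then lint.
tsort deps.txt
```

Of course this only covers ordering; it can't express "tool A needs
a *newer* B than the one installed", which is the actual hard case.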
That's why the current approach is "if something breaks, rebuild the
latest version of it against your old system, install it, and start
again."
Suppose we had a list of "rebuild tools" --
make, /usr/src/share/mk/*, gcc, as, ar, ranlib, nm, tsort,
lorder, sort, join, lex, yacc, rpcgen, lint,
plus others, and rules to build the latest source on your old system,
put them in a sandbox, and clean up the source tree, before doing
`make build'. And a way to use that sandbox as a "cross-compilation
system". And if building the "sandbox" contents fails (due to
dependencies between the sandbox contents) , you get to repair things
manually, as you do now. How far would that go to solving your problem?
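Roughly, the "rebuild tools" sandbox would look like the sketch
below. The tool list is the one above; the per-tool build step is a
placeholder (here it just installs a dummy script), since the real
step would build the latest source against the old installed system.

```shell
#!/bin/sh
set -e

SANDBOX=$(mktemp -d)
mkdir -p "$SANDBOX/bin"

# The "rebuild tools" list from the message.
REBUILD_TOOLS="make gcc as ar ranlib nm tsort lorder sort join lex yacc rpcgen lint"

for tool in $REBUILD_TOOLS; do
    # Placeholder for: build the latest $tool source with the OLD
    # installed system, then install it into the sandbox.
    printf '#!/bin/sh\necho %s-new\n' "$tool" > "$SANDBOX/bin/$tool"
    chmod +x "$SANDBOX/bin/$tool"
done

# `make build' then runs with the sandbox first in PATH, so the
# sandbox acts as a "cross-compilation system" targeting the host.
PATH="$SANDBOX/bin:$PATH" command -v make
```

If one of the loop iterations fails because it needed a sandbox tool
that hasn't been built yet, you're back to fixing it by hand -- which
is exactly the open question above.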