Subject: Dependencies, including "make update" issues.
To: None <tech-pkg@netbsd.org>
From: Richard Rauch <rkr@olib.org>
List: tech-pkg
Date: 06/30/2005 16:38:39
Hi, I have two ideas I thought I'd share.


The first concerns packages that get installed only as dependencies of
other packages.  I turned this over in my mind for the first time a
while back, and saw a post about it, but the response, I think,
misunderstood the key idea.

The idea is this: when you build a package, mark whether it was built
because it is required by some other package, or whether it was
built/installed *directly* by the user.  E.g., I build (or in any case,
install) the GIMP.
Let us suppose that aalib is already installed.  This is handled already
by the package installation.  Let us suppose that fontconfig is *not*
installed.

In this case, the package tools will attempt to install fontconfig
automatically.  After installing it, add a tag to the /var/db/pkg
database entry for fontconfig to indicate that "this package was
installed automatically to meet dependency requirements."
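
Just to make that concrete, something like this (a sketch in Python,
purely for illustration; the "+AUTOMATIC" marker file is a name I made
up, not anything pkg_install actually uses):

    import os

    PKGDB = "/var/db/pkg"

    def mark_automatic(pkgname):
        # Made-up marker recording that pkgname was pulled in as a
        # dependency rather than asked for by the user.
        with open(os.path.join(PKGDB, pkgname, "+AUTOMATIC"), "w") as f:
            f.write("installed automatically to satisfy a dependency\n")

    def is_automatic(pkgname):
        return os.path.exists(os.path.join(PKGDB, pkgname, "+AUTOMATIC"))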

Now, suppose that you delete the GIMP.

When pkg_delete runs into fontconfig while cleaning up the dependency
tree for the GIMP, it will see that fontconfig was only installed as a
dependency for some other package (pkg_delete does *not* need to know
that it was for the GIMP).  So, pkg_delete will try to delete fontconfig.
(Or rather, it will delete fontconfig, if fontconfig has nothing else
still depending on it.)
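
In rough terms, the delete side could look like this (Python again,
just a sketch; pkg_info's -n and -R options should list what a package
depends on and what still requires it, though I haven't checked the
exact output format, and is_automatic() is the made-up marker from the
sketch above):

    import os
    import subprocess

    PKGDB = "/var/db/pkg"

    def is_automatic(pkgname):
        # Same made-up "+AUTOMATIC" marker as in the earlier sketch.
        return os.path.exists(os.path.join(PKGDB, pkgname, "+AUTOMATIC"))

    def pkg_list(flag, pkgname):
        # With -q, pkg_info should print bare names, one per line.
        out = subprocess.run(["pkg_info", "-q", flag, pkgname],
                             capture_output=True, text=True).stdout
        return [line.strip() for line in out.splitlines() if line.strip()]

    def delete_with_orphans(pkgname):
        deps = pkg_list("-n", pkgname)       # what pkgname depends on
        subprocess.run(["pkg_delete", pkgname], check=True)
        for dep in deps:
            if is_automatic(dep) and not pkg_list("-R", dep):
                # Installed only to satisfy a dependency, and nothing
                # still requires it: clean it up too, recursively.
                delete_with_orphans(dep)

pkg_delete proper would of course work from its own database rather
than shelling out, but that's the shape of it.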

The upshot is that if you install, say, 10 packages manually, you may
end up with more than 10 packages added to your system.  But if you
then delete those 10 packages without adding anything else, every
package that wasn't installed before those 10 were added gets removed
as well.

I've never really sifted through the package install/delete support,
but conceptually this sounds simple.

The reason to want this is so that you don't get an accumulation of
cruft packages just because you briefly install/remove some other
packages.  Cruft packages can create security issues and can add
overhead to updating packages.



Updating packages is the second issue.

I use "make update" and usually it's okay.  But then I also avoid
some of the really *nasty* packages like KDE.

How practical would it be to automate a kind of recursive "make replace"?
This may not be for everyone, since it creates a window during which
arbitrary packages may be broken in mysterious ways.  But the idea
is to start at the "back" end of the dependency tree and "make replace"
on anything that needs to be replaced (and on anything that depends upon
it as you roll forward).  In most cases, the ABI will be compatible, or
compatible enough, that you won't notice problems.
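
As a sketch of the ordering (Python; where the dependency map comes
from is hand-waved here, but pkg_info or the pkgsrc Makefiles could
provide it):

    import subprocess

    PKGSRC = "/usr/pkgsrc"

    def toposort(deps):
        # deps maps a pkgsrc path (e.g. "graphics/gimp") to the paths
        # it depends on.  Emit dependencies before their dependents,
        # i.e. the "back" end of the tree first.
        order, seen = [], set()
        def visit(path):
            if path in seen:
                return
            seen.add(path)
            for dep in deps.get(path, []):
                visit(dep)
            order.append(path)
        for path in deps:
            visit(path)
        return order

    def recursive_replace(deps, needs_replacing):
        # needs_replacing: a set of out-of-date pkgsrc paths to start from.
        failed = []
        for path in toposort(deps):
            if path not in needs_replacing:
                continue
            r = subprocess.run(["make", "replace"],
                               cwd=PKGSRC + "/" + path)
            if r.returncode != 0:
                failed.append(path)   # note it and keep rolling forward
                continue
            # Anything that depends on what was just replaced gets
            # pulled into the work list as we roll forward.
            for other, other_deps in deps.items():
                if path in other_deps:
                    needs_replacing.add(other)
        return failed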

If a package fails to be replaced, then instead of stopping (as
"make update" does), record the failure somewhere so that the package
can be readily picked up by a fresh "recursive replace" after a pkgsrc
update (or after a manual attempt to fix the broken package).

This would seem to satisfy the complaints of those who believe that
"make update" is inferior to "make replace", while also providing a
mechanism for cleanly getting everything up to date.

If something fails to build, everything else stays in place; once it
does build (or when a fresh pkgsrc update permits a new build), the
packages that depend on it can be updated as well.

No, I'm not offering to do this.  I'm "happy enough" with make update.
But maybe those debating ways to change pkgsrc will find this idea
novel and useful.


Just a couple of ideas.  Use them or not.  (^&


-- 
  "I probably don't know what I'm talking about."  http://www.olib.org/~rkr/