pkgsrc-Users archive


Re: Setting up bulkbuild



Hi!

On Sun, 2023-09-03 18:49:32 -0400, Greg Troxel <gdt%lexort.com@localhost> wrote:
> Jan-Benedict Glaw <jbglaw%lug-owl.de@localhost> writes:
> > I'm interested in building stuff for VAX systems. The current NetBSD
> > install images (fresh builds with a number of patches---people are
> > working on getting them upstreamed) seem to build quite happily,
> > either with gcc-10 or gcc-12.
> >
> >   Based on that, I'm trying to setup bulkbuild. Got it compiled and
> > started it in a `screen`, all in a SIMH environment (ie. it's a
> > simulated VAX system.) The goal here is, in the end, to run this in
> > conjunction with distcc and/or a number of such SIMH instances in
> > parallel. My understanding is that bulkbuild would support both.
> 
> You are saying "bulkbuild", but do you mean "pbulk"?  There have been
> several bulk build mechanisms over the years.

That is ... a good question! I basically followed this Wiki description:

	http://wiki.netbsd.org/tutorials/pkgsrc/pbulk/

My current setup (as far as I've gotten) is to automatically build a
NetBSD-current install ISO, automatically install a SIMH instance from
it, then create a sandbox, run usr/pkgsrc/mk/pbulk/pbulk.sh (all done)
and finally start the actual package builds (inside a `screen`) with

	/usr/pbulk/bin/bulkbuild

This has now been running for a few days and has finished ~1564/19388
items (packages, I guess, counted per directory).
> >   It's now in the "Scanning..." phase and manages to "scan" about 250
> > packages per day, with a total of nearly 20k. So it'll be scanning for
> > about three months, give or take. Are there ways to speed this up? By
> > doing it in parallel on several hosts, or by creating the scan result
> > (I guess it's inter-package dependencies?) on a different (faster) host and
> > then let the VAXen do actual build work (possibly proxying the
> > compilation work through distcc)?
> 
> I am very far from an expert, but I would ask: do you want to try to
> build some subset first, and then go up, vs trying a full build of all
> 25K packages?  I would try limited if I were you, and AIUI pbulk has a
> facility to build only a list of packages.  I would start off with
> meta-packages/bulk-small, which has been curated for exactly this
> scenario.  Once that is built, I would move on to bulk-medium.
> 
> You ask a really good question about the scan phase being cross.  I
> don't know.
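For reference, pbulk's limited-build mode that Greg mentions is driven
from its config file. A minimal sketch, assuming the default /usr/pbulk
layout that pbulk.sh sets up (verify the exact variable name against the
pbulk.conf it generated):

```shell
# /usr/pbulk/etc/pbulk.conf (excerpt; paths are assumptions)
limited_list=/usr/pbulk/etc/pkglist    # only build the package dirs listed in this file

# /usr/pbulk/etc/pkglist would then contain, one directory per line, e.g.:
#   meta-pkgs/bulk-small
# (the curated bulk-small/bulk-medium sets live under meta-pkgs/ in the tree)
```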

I'm fine with chaotically building packages, as long as it actually
*builds* stuff. VAXen aren't known for their speed compared to today's
computers, and a simulated VAX (I do have a good number of real
machines, but I'd like to use those for verification) runs at about the
speed of The Real Thing.

  So it would be nice not to spend too much time on administrative
work when that time could instead be spent on actual package building.
(Next iteration would be to chain in distcc, as eg. described in
https://hackaday.io/project/218-speed-up-pkgsrc-on-retrocomputers/details .)
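For that distcc step, pkgsrc can wrap the compiler itself via the
PKGSRC_COMPILER setting documented in the pkgsrc guide. A minimal
mk.conf sketch, where the helper host names are placeholders:

```shell
# /etc/mk.conf (excerpt) -- assumes distccd is already running on the helper hosts
PKGSRC_COMPILER=        distcc gcc              # route compiles through distcc, gcc underneath
DISTCC_HOSTS=           fasthost1 fasthost2     # hypothetical faster machines running distccd
```

Note that distcc normally picks up DISTCC_HOSTS from the environment, so
exporting it before invoking bulkbuild is the safer route if setting it
in mk.conf alone doesn't propagate.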

  And, given that its start-up cost seems to be quite high, I need to
see how that lines up with a `cvs update` in the pkgsrc tree. For
sure, I wouldn't want to spend ~3 months after every `cvs update` only
to find out that almost nothing changed. Thus I hope to find ways to
keep the start-up and keeping-up costs as low as possible. (Or is
direct cross-compilation an option here? The `./build.sh tools`
toolchain seems quite usable, and pkgsrc seems to have some ability to
do cross-compilation.)
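On the keeping-up cost: pbulk can reuse a previous scan, so an
incremental `cvs update` should not force a full rescan. A sketch, with
the option name as found in typical pbulk.conf files (verify locally):

```shell
# /usr/pbulk/etc/pbulk.conf (excerpt)
reuse_scan_results=yes          # re-scan only packages whose metadata changed

# After updating the tree, kick off the next round:
#   cd /usr/pkgsrc && cvs -q update -dP
#   /usr/pbulk/bin/bulkbuild
```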

Thanks,
  Jan-Benedict

-- 



