tech-pkg archive


Re: OK to import wip/scalapack?

On 7/29/22 04:11, Dr. Thomas Orgis wrote:

On 28 July 2022 23:49:04 CEST, Jason Bacon <> wrote:
On 7/27/22 04:53, Dr. Thomas Orgis wrote:

-- Found MPI_C: /usr/lib/ (found version "3.1")
-- Checking for module 'mpi-fort'
--   Package 'mpi-fort', required by 'virtual:world', not found
-- Could NOT find MPI_Fortran (missing:

We've been discussing this off-list already. Issues:

- CMake ships a FindMPI.cmake that does rubbish with mpich vs. openmpi.
- Fortran MPI is a requirement, and the f90 option in openmpi is not enabled by default.
- The MPI libraries in pkgsrc are ancient and outdated. Ideally, we need someone who actually uses them to update and test them.

I provide MPI with the compiler toolchain on my HPC systems, so the neglected state of the MPI packages went past me.

The config that should work is

- MPI_TYPE=openmpi
- PKG_OPTIONS.openmpi=f90
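
As a sketch, those two settings would go into mk.conf before building scalapack; the variable names are the ones given above, and the whitespace layout is just conventional:

```mk
# mk.conf fragment (sketch): select Open MPI as the pkgsrc MPI
# implementation and enable its Fortran 90 bindings, which are
# not on by default.
MPI_TYPE=               openmpi
PKG_OPTIONS.openmpi=    f90
```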

There's fixup needed ... uncovered by my attempt with scalapack.

Alrighty then,


I used to maintain the openmpi packages, but no longer have time for it.

Based on my experience, though, MPI packages could work well in pkgsrc,
as long as each implementation is isolated, e.g. under
${PREFIX}/openmpi/version, ${PREFIX}/mpich/version, etc.

Dependent packages should probably be installed under the same prefix.
This will be necessary for dependents like fftw, which may need to be
installed in both mpich and openmpi builds.
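
To illustrate the isolated-prefix idea, here is a minimal sketch of how a user or job script might select one MPI stack at a time. The paths are hypothetical assumptions; pkgsrc does not currently install MPI under versioned prefixes like this:

```shell
# Sketch: select one MPI stack by prepending its isolated, versioned
# prefix to the search paths. /usr/pkg/openmpi/4.1.4 is a hypothetical
# layout, not an actual pkgsrc install path.
MPI_PREFIX=/usr/pkg/openmpi/4.1.4
PATH="$MPI_PREFIX/bin:$PATH"; export PATH
LD_LIBRARY_PATH="$MPI_PREFIX/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH
```

This mirrors what environment-module systems already do on HPC clusters: pointing MPI_PREFIX at, say, a mpich prefix instead would switch stacks without any package conflicts.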

I've heard arguments that conflicts between MPI packages and their
dependents are OK, but that won't fly on an HPC cluster where many
different users are running different software.  Maintaining entirely
separate pkgsrc trees for each MPI implementation would cost man-hours
in a field where there's an extreme talent shortage.

wip/openmpi is more or less functional and installs 4.0.0.  It shouldn't
be hard to update it to 4.1.4.
