Subject: Re: [Fwd: Re: compiling php5 out of memory problem]
To: Heitzso <heitzso@growthmodels.com>
From: Bruce O'Neel <edoneel@sdf.lonestar.org>
List: port-sparc
Date: 09/25/2006 15:26:03
Hi,

The other one, which is a nightmare to debug, is when your stack size
is too small.  On sparc (at least) you just get a bus error at some
function entry.
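
The workaround is the same idea as the data size bump discussed below,
just with the stack limit instead of the data limit; a rough sh/ksh
sketch (assuming your shell's ulimit handles -s the same way it
handles -d):

  ulimit -s                      # show current stack size limit
  ulimit -sH                     # show maximum stack size limit (hard limit)
  ulimit -s `ulimit -sH`; make   # run make with maximum stack size limit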

cheers

bruce

On Mon, Sep 25, 2006 at 08:22:36AM -0400, Heitzso wrote:
> -------- Forwarded Message --------
> From: Heitzso <heitzso@growthmodels.com>
> To: Julian Coleman <jdc@coris.org.uk>
> Subject: Re: compiling php5 out of memory problem
> Date: Sun, 24 Sep 2006 10:43:47 -0400
> 
> > > I'm having trouble with the lang/php5 make on a dual cpu sparc20 w/ 512M
> > > of RAM and 1G of swap.  The make crashes out with a memory allocation
> > > error while trying to allocate a single 64M block on top of earlier 2M
> > > worth of allocations.
> > 
> > > Turns out I was running current from last Feb or so and already had that
> > > NKMEMPAGES line in my kernel.  I'm upgrading /usr/src current and will
> > > recompile to see if a current kernel GENERIC.MP can handle the PHP5
> > > update without choking on the 64M memory allocation.
> > 
> > Reading your original message and the replies, it's not clear to me if the
> > problem is that the compile stops or if the machine crashes.  Assuming that
> > the make stops and the machine is fine, have you tried increasing the data
> > size limit passed to the make process?  I needed to do this to get some
> > pkgsrc programs to compile with gcc4.  If you are using sh or ksh:
> > 
> >   ulimit -d # show current data size limit
> >   ulimit -dH # show maximum data size limit (hard limit)
> >   ulimit -d `ulimit -dH`; make # run make with maximum data size limit
> > 
> > Hope this helps,
> 
> Ah ... the wonders of the simple ulimit!  That was all that was needed.
> The default data size limit was 64M, so gcc's request for a 64M
> allocation on top of the earlier 2M failed and gcc halted.  The hard
> limit was 512M or something on that order, so bumping the current limit
> up to the hard limit was all that was necessary.
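> 
> For anyone else chasing a similar failure, listing all of the limits at
> once makes it easier to spot which one is in the way; roughly, in sh or
> ksh (the exact flag spelling may vary by shell):
> 
>   ulimit -a      # show all current (soft) limits
>   ulimit -H -a   # show all hard limits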
> 
> I had read a paper off the Internet that discussed this, but nowhere
> did it say that a program's memory allocations come out of the data
> segment (versus the dozen or so other types of memory).  I was creeping
> up on experimenting with that, but thought I would first recompile a
> current kernel to ensure that I was working with the latest-greatest kernel.
> 
> I don't know if anyone wants to mess with this, but adding this
> explanation to the pkgsrc documentation would be useful.  Something
> along the lines of ...
> "
> If a package make stops with an error message saying that a memory
> allocation failed, try rerunning the make after raising the current
> virtual memory data size limit to the maximum data size limit (also
> known as the hard limit) using the command sequence:
> ulimit -d # show current data size limit
> ulimit -dH # show maximum data size limit (hard limit)
> ulimit -d `ulimit -dH`; make # run make with maximum data size limit
> "
> 
> 
> Thank you very much!
> 
> Heitzso
> 

-- 
edoneel@sdf.lonestar.org
SDF Public Access UNIX System - http://sdf.lonestar.org