tech-userlevel archive


Re: Import of XZ



On Wed, 04 Aug 2010, der Mouse wrote:
> > LZMA as an algorithm is essentially LZ77 with a huge window size
> > and better entropy encoding.
>
> How huge is "huge"?  I'm concerned about small-memory machines (I
> still have an hp300 with, well, "total memory = 6136 KB" / "avail
> memory = 2648 KB") finding themselves unable to even uncompress
> things, at all.

Here's an extract from what the xz man page has to say about memory
usage:

       The memory usage of xz varies from a few hundred kilobytes to
       several gigabytes depending on the compression settings.  The
       settings used when compressing a file also affect the memory
       usage of the decompressor.  Typically the decompressor needs only
       5 % to 20 % of the amount of RAM that the compressor needed when
       creating the file.  Still, the worst-case memory usage of the
       decompressor is several gigabytes.

       To prevent uncomfortable surprises caused by huge memory usage,
       xz has a built-in memory usage limiter.  The default limit is 40 %
       of total physical RAM.  [...]

       When compressing, if the selected compression settings exceed
       the memory usage limit, the settings are automatically adjusted
       downwards [...]

       If the source file cannot be decompressed without exceeding the
       memory usage limit, an error message is displayed and the file is
       skipped.  [...]
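For example, with a recent xz the limit can be set with -M (long form
--memlimit; older versions spelled it --memory), so something like the
following sketch should keep both sides small:

       # Compress with preset -9, but cap memory use at 64 MiB;
       # per the man page, xz scales the settings down if -9
       # would exceed the limit.
       xz -9 -M 64MiB bigfile

       # The resulting file then decompresses in a small fraction
       # of that, per the 5 % to 20 % rule of thumb quoted above.
       xz -d bigfile.xz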

The good part is that setting a suitable memory usage limit on
the compression side should avoid excessive memory usage on the
decompression side.  I am not sure how I feel about the decompressor
skipping files that would exceed its memory usage limit; but I suppose
that if I want it to continue regardless of memory usage, I can
provide enough swap space and use a command-line option to override
the limit.
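Something like this should do (again just a sketch; -M/--memlimit
takes an absolute value with a KiB/MiB/GiB suffix, or a percentage
of RAM):

       # Override the default 40 % limit with an absolute value
       # larger than physical RAM; with enough swap configured,
       # even a worst-case file should decompress, just slowly.
       xz -d -M 4GiB bigfile.xz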

--apb (Alan Barrett)

