Re: pkgsrc use curl or wget
On Fri, Jul 12, 2013 at 08:27:26PM +0530, Mayuresh wrote:
> Anyway, I have taken a note of this and I use the default downloader for
> pkgsrc. (Hardly 1 or 2 times till now I had used aget.)
If you can't get your TCP issues sorted (I would expect they likely
stem from bursty packet loss, which TCP traditionally does not
accommodate well), I don't think we mind (nor are we likely to
notice) if you open a small number of connections in parallel.
On the other hand, using 10 or 20 connections at once to fetch
the same file has effects we do not like:
1) It requires a server process per connection, and socket buffers
per connection, and these are limited resources. If you use
far more than your share, they are not available to other
people trying to download from our servers.
2) It bypasses our per-connection bandwidth limits, which are
intended to conserve bandwidth for other users.
3) Since you're fetching blocks from all over the target files
in parallel, it can negate the effect of read-ahead
caching and slow down the disk subsystem.
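To make point 3 concrete: a multi-connection downloader typically splits
the target file into one byte range per connection and issues HTTP Range
requests for each, so the server is asked to read from widely separated
offsets at the same time instead of one sequential stream. A minimal
sketch of that range-splitting (the function name is mine, not aget's
actual internals):

```python
# Sketch: how a multi-connection downloader might divide a file into
# byte ranges, one per connection. Each range becomes an HTTP
# "Range: bytes=start-end" request, so with many connections the
# server serves streams starting far apart -- effectively random I/O
# that defeats sequential read-ahead caching.

def split_ranges(file_size, connections):
    """Return inclusive (start, end) byte ranges, one per connection."""
    chunk = file_size // connections
    ranges = []
    for i in range(connections):
        start = i * chunk
        # The last range absorbs any remainder from the division.
        end = file_size - 1 if i == connections - 1 else start + chunk - 1
        ranges.append((start, end))
    return ranges

# With 10 connections on a 100-byte file, the server must begin
# 10 reads at offsets 0, 10, 20, ... all at once.
print(split_ranges(100, 10))
```

With 10 or 20 such streams per client, the disk spends its time seeking
between offsets rather than streaming one file front to back.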
Because of this, when we notice users making large numbers of
connections in parallel to fetch the same file, we get annoyed,
and if it's really severe, we blacklist the addresses that do it.
I know other server admins do the same.
So if you're going to use a tool like this, and by default it
makes a large number of connections (such as 10) in parallel,
I would strongly suggest you reduce that default to something
like 2 or 4.
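If the tool itself won't let you lower its default, the same politeness
can be enforced on the client side by capping how many fetches run at
once. A sketch using a semaphore (the fetch function here is a
stand-in for a ranged HTTP request, not any real downloader's API):

```python
import threading

MAX_CONNECTIONS = 2          # the polite cap suggested above
_slots = threading.Semaphore(MAX_CONNECTIONS)
_lock = threading.Lock()
_active = 0
peak = 0                     # highest number of simultaneous "connections"

def fetch_range(start, end):
    """Stand-in for one ranged HTTP request; real I/O would go here."""
    global _active, peak
    with _slots:             # block until one of MAX_CONNECTIONS slots frees
        with _lock:
            _active += 1
            peak = max(peak, _active)
        # ... perform the actual request for bytes start..end here ...
        with _lock:
            _active -= 1

# Ten chunks to fetch, but never more than two in flight at a time.
threads = [threading.Thread(target=fetch_range, args=(i * 10, i * 10 + 9))
           for i in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("peak concurrent fetches:", peak)
```

The server then never sees more than two connections from you for the
file, regardless of how many chunks the download is split into.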