tech-userlevel archive


Re: fetch_pkg_vulnerabilities enabled by default (was: CVS commit: src/etc)

On Wed, Jan 20, 2010 at 7:23 AM, Bernd Ernesti <> wrote:
> On Tue, Jan 19, 2010 at 10:08:11PM +0000, Julio M. Merino Vidal wrote:
>> Module Name:  src
>> Committed By: jmmv
>> Date:         Tue Jan 19 22:08:11 UTC 2010
>> Modified Files:
>>       src/etc: daily security
>>       src/etc/defaults: daily.conf security.conf
>> Log Message:
>> Add the fetch_pkg_vulnerabilities option to the daily script to keep the
>> packages vulnerability database up to date.  This will only fetch the
>> file from the server if it has changed since the last run.
>> Add the check_pkg_vulnerabilities and check_pkg_signatures options to the
>> security script to check that the installed packages are sane.
>> All of these options are enabled by default but they will only run if
>> there is, at least, one installed package.
> I object to enabling that by default, and you haven't answered the concerns
> I raised when you brought this up.
> This is like calling home to get something; yes, I know that it can be useful,
> but we never enabled such a thing in the past to fetch something from a remote
> server and automatically overwrite a file on the local system with a newer copy.

The fact that we didn't do such a thing in the past is not an excuse
not to do it now.

I've dug up your concerns from the previous email, as I had forgotten about them:

> * not every system has a working connection to fetch it

If that's the case, you'll get an error in the report.  It is hard to
imagine such a system, especially one using packages, but for those rare
cases you can always disable the feature.  We can add a hint to the
report if we detect an error.

> * not every system has packages installed

Then the fetching does not take place.
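The gating described here (run the fetch only when at least one package is installed) can be sketched as a small shell snippet.  This is not the actual /etc/daily code; the `PKG_DBDIR` layout and the use of a throwaway directory are assumptions made so the sketch is self-contained:

```shell
#!/bin/sh
# Sketch of the "only run if at least one package is installed" check.
# PKG_DBDIR normally points at /var/db/pkg; a temporary directory is
# used here so the example does not touch the real package database.
PKG_DBDIR=$(mktemp -d)

count_pkgs() {
    # Count installed packages by their database subdirectories.
    set -- "$PKG_DBDIR"/*/
    if [ -d "$1" ]; then echo $#; else echo 0; fi
}

before=$(count_pkgs)        # empty database: the fetch would be skipped
mkdir "$PKG_DBDIR/foo-1.0"  # simulate one installed package
after=$(count_pkgs)         # now the fetch would run

echo "before=$before after=$after"
rm -rf "$PKG_DBDIR"
```

With an empty database the count is 0 and the fetch is skipped; as soon as one package directory exists, the count is positive and the fetch proceeds.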

> * it modifies files without the knowledge of the one who installed
>  this system

Which is exactly what's supposed to happen with this feature.  What
would you do about it?

> * it can cause more workload on the server and exceed its connection
>  limit if a lot of systems use it

The file will only be downloaded if it has been modified.  And, in
those cases when it is actually downloaded, I seriously doubt we'll
hit any connection limits.  Yes, the cron jobs run at the same time,
but: first, they run according to the local timezone, which spreads the
load across the day.  Second, the daily script does tons of disk-intensive
work before fetching the file, which introduces variability across
machines.  Third, the file is tiny (around 60K), which means it
will download in a few seconds, so the skew introduced by the previous
jobs in the same script will be enough to spread out the connections.
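A back-of-envelope calculation supports the claim that the load is modest.  The ~60K file size comes from the mail itself; the client count below is a made-up assumption for illustration:

```shell
#!/bin/sh
# Rough load estimate: total daily transfer and average connection rate
# if every client fetched a fresh copy once a day, spread over 24 hours.
FILE_KB=60        # size of pkg-vulnerabilities, per the mail
CLIENTS=100000    # hypothetical number of machines (an assumption)

awk -v kb="$FILE_KB" -v n="$CLIENTS" 'BEGIN {
    printf "%.1f MB/day total transfer\n", kb * n / 1024
    printf "%.2f connections/sec if spread evenly over 24h\n", n / 86400
}'
```

Even under the pessimistic assumption that no client benefits from the if-modified check, a hundred thousand machines amount to only a few gigabytes per day and roughly one connection per second on average.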

Yes, it will cause more load on the server.  No, I don't think it is a
problem.  And if it turns out to be a problem, we should fix it on our
side, not by disabling a feature we've advocated as important for a
long time but never automated because, until recently, it was not part
of the base system.  (As a last resort, we could revert the default.)

It's not like other OSes never call home to get updates or other
stuff.  They do so frequently, with many more users than we have, and
they cope with it.

Julio Merino
