Re: Shipping SSL certificates in the base system
On 04/07/2017 23:02, Jan Danielsson wrote:
On 07/04/17 21:15, Benny Siegert wrote:
There are other stories as well, but that's a good illustration of
why it's a bad idea to just hand over a bunch of CA's to users without
any mechanism for keeping the CA database, and CRL's, up to date.
I expected this argument, but it is ultimately irrelevant. This is because most users do one of three things:
(a) do nothing and effectively trust all certificates, because none are installed;
(b) install the mozilla-rootcerts package and trust the Mozilla set;
(c) consciously select a subset of those certificates (probably a tiny minority).

Compare with root certificates in the base system: users in group (a) gain certificate verification, users in group (b) are spared a manual step, and users in group (c) lose nothing, because they can still futz with the root certificates manually.
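For reference, the manual step that group (b) performs today is small but real; a sketch of the usual pkgsrc workflow (the package and its helper script are what security/mozilla-rootcerts provides, paths as on a stock NetBSD install):

```shell
# Install the package (binary package shown; building from
# /usr/pkgsrc/security/mozilla-rootcerts works the same way).
pkg_add mozilla-rootcerts

# Extract the Mozilla CA bundle into /etc/openssl/certs and
# create the hashed symlinks that OpenSSL expects.
mozilla-rootcerts install
```

Shipping the certificates in base would fold exactly these two commands into the install media.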
I assert that having a somewhat outdated set of Mozilla's root certificates is better than having none at all and implicitly trusting everyone, or, worse, trusting no one and having, say, Mercurial refuse to clone repositories over HTTPS by default.
Perhaps, but I think you're mixing two different issues together.
If users choose to disable certificate verification, that's on them.
If TNF takes on the role of a trusted CA source, then that implies a
lot of responsibility that they don't currently have. They can't say
"here, have a bundle of outdated root certificates; we ship them only so
that some programs will shut up" -- that's irresponsible, and it's
certain to cause unflattering comments.
Don't get me wrong, I want a solution which would make the X.509
experience in NetBSD smoother. But being a trusted CA source means
being in the spotlight, and being willing to answer questions if something goes
wrong. I wouldn't be willing to take on that responsibility myself, so I'm not
going to ask TNF to do it. (Though I would obviously be delighted if
they assigned a Chief PKI Officer role and offered a proper CA.)
With all that being said, you're not wrong that the complexities of
X.509 actually lower security in many instances, but it's still the
user's choice to accept that trade-off.
Here's the thing: most users do not have the tiniest clue that there is
such a thing as SSL, let alone X.509, certificates, or the authorities
behind them.
However, we can go from a default where SSL-based connections never work
(i.e., can never validate the remote certificate) to a default where
connections to servers with a trusted CA will be accepted. Even if the list of
trusted CAs is imperfect, it is better than having users decide to never
validate connections, or to use a different OS altogether.
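To make the two defaults concrete, here is a quick illustration with a throwaway self-signed certificate (plain openssl(1) commands; the file names are mine). Without a trust anchor, validation fails; once the certificate is explicitly trusted, the same check passes:

```shell
# Create a throwaway self-signed certificate (illustration only).
openssl req -x509 -newkey rsa:2048 -nodes -keyout key.pem -out cert.pem \
    -days 1 -subj "/CN=example.test"

# No trust store knows this issuer, so validation fails:
openssl verify cert.pem

# Explicitly trusting it makes the same check succeed:
openssl verify -CAfile cert.pem cert.pem    # prints: cert.pem: OK
```

A base system with no certificates at all behaves like the middle command for every HTTPS server on the Internet.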
So even though my first reaction to Benny's suggestion would be rather
against it (being a security guy with the associated bit of paranoia), I
think it would be better to ship with certificate authorities, and then
let savvy users choose not to trust them all by default. Even
then, I usually prefer the slick experience of a system that is easy
to update in case of a security issue over having to toggle the
trust status manually.
Then comes the problem of maintaining and dealing with security issues
regarding these authorities. I can picture two options here:
1. Releasing a security advisory each time, and issuing a new release as needed.
2. Providing a signed list of trusted CAs that can be updated daily,
much like we do for vulnerable packages.
While the first solution can be very difficult to manage (especially
without a rolling release), it should be possible to fully automate the
second one, e.g. with a script running on a TNF machine, kept up to
date from pkgsrc's security/mozilla-rootcerts package.
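The mechanics of option 2 could look roughly like the following. Every file name here is hypothetical; the point is only that a detached signature over the bundle can be produced on the publisher's side and checked on the user's side before installation, the same trust model as pkg_admin's fetch-pkg-vulnerabilities:

```shell
# Publisher side (e.g. a TNF machine): generate a signing key once,
# then sign each day's bundle.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out update-key.pem
openssl pkey -in update-key.pem -pubout -out update-key.pub
printf 'PRETEND CA BUNDLE\n' > ca-bundle.pem   # stand-in for the real bundle
openssl dgst -sha256 -sign update-key.pem -out ca-bundle.pem.sig ca-bundle.pem

# User side: verify against the public key shipped in base
# before replacing the installed bundle.
openssl dgst -sha256 -verify update-key.pub -signature ca-bundle.pem.sig \
    ca-bundle.pem    # prints: Verified OK
```

Only the public key needs to ship in the base system, so updating the CA list would not require a new release, only a verified download.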