What exactly does it mean when Chrome reports 'no certificate transparency information was supplied by the server'?

  • When visiting Gmail in Chrome, if I click on the lock icon in the address bar and go to the connection tab, I receive a message 'no certificate transparency information was supplied by the server' (before Chrome 45, the message was displayed as 'the identity of this website has been verified by Google Internet Authority G2 but does not have public audit records').

    What exactly does it mean that the certificate does not have public audit records? Are there certain threats that a site using a certificate without public audit records faces, but that a site using a certificate with public audit records does not?

    Examples of the message as of Chrome 45 and before Chrome 45

    I see you had to take a screenshot to export the message. If you think it should be easier to copy the connection information out of Chrome, vote for https://code.google.com/p/chromium/issues/detail?id=254249

    Has anyone found an example *with* public audit records?

    For an example *with* public audit records, see https://embed.ct.digicert.com/ . It's a purpose-built test site. Perhaps we'll see transparency proofs on real sites (most likely Google and Twitter) next year.

    Update February 2015: twitter.com is now "publicly auditable" in Chrome. You'll see a "transparency information" link that opens a "signed certificate timestamps viewer" dialog.

  • tylerl (Correct answer)

    7 years ago

    Note: If you're here because your certificate isn't trusted by Chrome, this is not the reason. Chrome will still trust certificates without CT information. If your certificate isn't trusted, there is an additional factor that you may have missed.

    Comparison of auditable versus no audit record

    This has to do with the concept of Certificate Transparency.

    The Problem

    Browsers currently trust certificates if four conditions are met: (a) the certificate is signed by a trusted CA, (b) the current time is within the valid period of the certificate and signing certs (between the notBefore and notAfter times), (c) neither the certificate nor any signing certificate has been revoked, and finally, (d) the certificate matches the domain name of the desired URL.
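
    To make those four checks concrete, here is a rough Python sketch of the same logic (not real browser code). It uses the third-party cryptography package; the chain_is_trusted() and is_revoked() helpers are hypothetical placeholders for root-store path building and CRL/OCSP lookups, and the hostname match ignores wildcard rules.

```python
# Rough sketch (not real browser code) of the four classic trust checks,
# using the third-party "cryptography" package (>= 42 for the *_utc
# validity accessors). chain_is_trusted() and is_revoked() are hypothetical
# placeholders for root-store path building and CRL/OCSP lookups.
from datetime import datetime, timezone

from cryptography import x509


def chain_is_trusted(cert, intermediates):
    """Placeholder: a real client builds a path from the leaf to a trusted root."""
    raise NotImplementedError


def is_revoked(cert, intermediates):
    """Placeholder: a real client consults CRLs and/or OCSP responses here."""
    raise NotImplementedError


def classic_trust_check(cert: x509.Certificate, intermediates, hostname: str) -> bool:
    # (a) the certificate is signed by a trusted CA
    if not chain_is_trusted(cert, intermediates):
        return False
    # (b) the current time is inside the validity window (a real client also
    #     checks the notBefore/notAfter of every signing certificate)
    now = datetime.now(timezone.utc)
    if not (cert.not_valid_before_utc <= now <= cert.not_valid_after_utc):
        return False
    # (c) neither the leaf nor any issuer has been revoked
    if is_revoked(cert, intermediates):
        return False
    # (d) the certificate matches the requested hostname
    #     (simplified: ignores wildcard matching rules)
    san = cert.extensions.get_extension_for_class(x509.SubjectAlternativeName)
    return hostname in san.value.get_values_for_type(x509.DNSName)
```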

    But these rules leave the door open to abuse. A trusted CA can still issue certificates to people who shouldn't have them. This includes compromised CAs (like DigiNotar) and also CAs like Trustwave, which issued at least one intermediate signing certificate for use in man-in-the-middle interception of SSL traffic. A curated history of CA failures can be found in CAcert's History of Risks & Threat Events to CAs and PKI.

    A key problem here is that CAs issue these certificates in secret. You won't know that Trustwave or DigiNotar has issued a fraudulent certificate until you actually see the certificate, at which point you're probably the perpetrator's target, not someone in a position to do any real auditing. In order to prevent abuse or mistakes, we need CAs to make the history of certificates they sign public.

    The Solution

    The way we deal with this is to create a log of issued certificates. This can be maintained by the issuer or by someone else. But the important points are that (a) the log can't be edited, only appended to, and (b) the time at which a certificate is added to the log is verified through proper timestamping. Everything is, of course, cryptographically assured to prevent tampering, and the public can watch the contents of the log for certificates issued for domains that shouldn't have them.
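
    To make the append-only, tamper-evident idea concrete, here is a toy hash-chained log in Python. It is only a sketch of the tamper-evidence property; real CT logs use a Merkle tree (RFC 6962) so that inclusion and consistency can be proven efficiently, and they sign their tree heads and timestamps.

```python
# Toy append-only log: each entry hash covers the previous entry's hash, so
# rewriting or reordering history changes every later hash. Real CT logs use
# a Merkle tree (RFC 6962) and sign their tree heads and timestamps; this
# only illustrates the tamper-evidence idea.
import hashlib
import time


class ToyLog:
    def __init__(self):
        self.entries = []  # list of (timestamp_ms, cert_der, entry_hash)

    def append(self, cert_der: bytes) -> str:
        prev_hash = self.entries[-1][2] if self.entries else "0" * 64
        timestamp_ms = int(time.time() * 1000)
        entry_hash = hashlib.sha256(
            prev_hash.encode() + str(timestamp_ms).encode() + cert_der
        ).hexdigest()
        self.entries.append((timestamp_ms, cert_der, entry_hash))
        return entry_hash  # a crude stand-in for a signed inclusion proof

    def verify(self) -> bool:
        prev_hash = "0" * 64
        for timestamp_ms, cert_der, entry_hash in self.entries:
            expected = hashlib.sha256(
                prev_hash.encode() + str(timestamp_ms).encode() + cert_der
            ).hexdigest()
            if expected != entry_hash:
                return False  # a past entry was edited, removed, or reordered
            prev_hash = entry_hash
        return True
```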

    If your browser then sees a certificate that should be in the log but isn't, or that is in the log but something doesn't match (e.g. the wrong timestamp, etc), then the browser can take appropriate action.

    What you're looking at in Chrome, then, is an indication of whether a publicly auditable log entry exists for the certificate you're looking at. If it does, Chrome can also check whether the appropriate log entry has been made, and when.
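
    If you want to look at the same data outside of Chrome, the SCTs are often embedded in the certificate itself as an X.509 extension. Here is a hedged Python sketch using the cryptography package; it assumes the server embeds SCTs in the certificate, whereas CT also allows delivery via a TLS extension or a stapled OCSP response, which this does not cover.

```python
# Sketch: read embedded Signed Certificate Timestamps (SCTs) from a live
# server's certificate, using the third-party "cryptography" package.
# Assumes the SCTs are embedded in the certificate itself; CT also allows
# delivery via a TLS extension or a stapled OCSP response, not covered here.
import socket
import ssl

from cryptography import x509
from cryptography.x509.oid import ExtensionOID

hostname = "www.google.com"
ctx = ssl.create_default_context()
with socket.create_connection((hostname, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
try:
    scts = cert.extensions.get_extension_for_oid(
        ExtensionOID.PRECERT_SIGNED_CERTIFICATE_TIMESTAMPS
    ).value
    for sct in scts:
        # log_id identifies which public log holds the entry; timestamp is
        # when that log promised to include the (pre)certificate.
        print(sct.log_id.hex(), sct.timestamp)
except x509.ExtensionNotFound:
    print("no embedded SCTs in this certificate")
```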

    How widely is it used?

    Google maintains a list of "known logs" on their site. As of this writing, there are logs maintained by Google, Digicert, Izenpe, and Certly, each of which can maintain the audit trail for any number of CAs.

    The Chrome team has indicated that EV certificates issued after 1 Jan 2015 must all have a public audit trail to be considered EV. Once the experience gained from handling EV certificate audit logs has been applied, they plan to continue the rollout to all certificate issuers.

    How to check the logs

    Google added a Certificate Transparency lookup form to their standard Transparency Report, which means you can now query for the domains you care about to see which certificates for those domains show up in the transparency logs. This allows you to see, for example, which certificates out there are currently valid for your domain, assuming the CAs cooperate.

    Look for it here: https://www.google.com/transparencyreport/https/ct/

    Remember that if you want to track a given domain name and be alerted when a new certificate is issued, you should follow the logs directly. This form is useful for point-in-time queries, not for generating alerts.
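
    Following the logs directly is practical because every log must expose the small HTTP API defined in RFC 6962. Below is a minimal sketch of one poll cycle; the log URL is Google's "Pilot" log, used here only as an example from this era, so check the current known-logs list before relying on it.

```python
# Minimal sketch of polling a CT log via the RFC 6962 HTTP API.
# The log URL below (Google's "Pilot" log) is only an example from the era
# of this answer; consult the current known-logs list for live log URLs.
import base64
import json
import urllib.request

LOG = "https://ct.googleapis.com/pilot"


def get_json(url: str) -> dict:
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


# Signed Tree Head: current size of the log plus a signed Merkle root hash.
sth = get_json(LOG + "/ct/v1/get-sth")
print("tree size:", sth["tree_size"], "timestamp:", sth["timestamp"])

# Fetch the newest few entries. A real monitor would remember the last tree
# size it saw, fetch only the new range, and parse each MerkleTreeLeaf to see
# which domains the newly logged (pre)certificates cover.
end = sth["tree_size"] - 1
start = max(0, end - 4)
entries = get_json(f"{LOG}/ct/v1/get-entries?start={start}&end={end}")
for entry in entries["entries"]:
    leaf = base64.b64decode(entry["leaf_input"])
    print("leaf of", len(leaf), "bytes")  # MerkleTreeLeaf parsing omitted
```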

    This is a great answer, but it leaves open the question of why on earth Google doesn't use a cert that is auditable. Is this just not available at all yet, and is Google putting the indicator in Chrome to try to get CAs to change?

    @Fraggle Google does something even better. They pin their certs using HSTS, and more impressively, ship their browser (Firefox too) with the correct HSTS entries preloaded. So Firefox and Chrome won't even ALLOW you to use a fraudulent Google certificate, no matter who signed it. The same protection is extended to anyone else who asks.

    @tylerl, Nope. The protection only goes to privileged big boys like Google etc. It's not a solution that would scale. If you had a site `my_site.com` and you told Google to add `my_site.com` to their preloaded STS list in Chrome, they would simply ignore you. And **even if** the STS list were implemented, it still wouldn't stop MITM attacks.

    @Pacerier it really is open to anyone. Here is where you add your site: https://hstspreload.appspot.com

    @Pacerier STS doesn't stop MITM, but pinning does. And Chrome supports pinning via your STS entry.

    @tylerl, Pinning doesn't stop MITM. Pinning only ensures that the browser makes HTTPS requests instead of HTTP requests. It's still possible that those HTTPS requests are MITMed. Why do you say that pinning stops MITM?

    @Pacerier STS ensures that requests are made over TLS and pinning ensures that only the pinned certificate can be used. Together you're guaranteed that all requests use your designated certificate which makes MITM impossible unless your private key is compromised.

    @tylerl, You state "chrome **supports** pinning via your STS entry". Please elaborate. As far as I know, STS preloaded lists don't allow you to specify a particular cert.

    @Pacerier Chrome overloads the HSTS functionality to support not only indicating your STS preference but also indicating which public keys are allowed.

    @tylerl, https://hstspreload.appspot.com/ only has one input box. How would you specify the cert?

    *"... a key factor is that the signing behavior of the CA is not auditable"* - audits are *ex post facto* or *reactive*. They are only consulted when there's a known problem. We need a system that's proactive so we can stop issuance before it becomes a problem. Oh wait, we have that but the browsers abandoned it (and the third party auditor that provided the checks/balances).

    @jww Unfortunately the *Name Constraints* mechanism doesn't really work with present realities. Any organization that needs to be able to sign xyz.com also typically needs to sign xyz.net and xyz.co.uk. Plus, every existing CA would insist on signing authority over .com, which defeats the purpose of name constraints. The biggest worry is a CA issuing a cert it shouldn't; either for a domain it shouldn't or for a subordinate CA that shouldn't be trusted. Certificate transparency allows that behavior to be monitored in real time by automated systems.

    @tylerl - I don't disagree with you. But those are not technical problems, so they are not our concern. *"Any organization that needs to be able to sign xyz.com also typically needs to sign xyz.net and xyz.co.uk"* - yep, that's the constraint. When I asked why Mozilla did not require it of the subordinate CAs issued to organizations, they told me the CAs said it was effectively "too much work". Then I asked "how much work is it" and no one could answer. Browser security is such a joke... Worse, it's polluting non-browser user agents and other software.

    @tylerl - *"Plus, every existing CA would insist on signing authority over .com"* - this is a slightly different use case. In this use case, the RA is present *and* independent. Hence, there's reduced risk in allowing the trust. But it *might* make sense on ccTLDs since there's a natural administrative boundary present.

    I use Chrome in a corporate environment where HTTPS traffic is put through a proxy that uses its own certificate. The official browser is IE, which doesn't complain. As of recently, Chrome claims the certificate doesn't have public audit information. Can I somehow manually tell Chrome to trust that specific certificate?

    STS and pinning are two different things. With full pinning, the client has a baked-in list of domains and matching fingerprints, so the browser knows with certainty that something is wrong if the fingerprint that comes back doesn't match what it already has. Of course, this only works if you control the domains and the client doing the checking (in this case Google Chrome) and can update it regularly enough to keep those fingerprints correct. STS just forces you to HTTPS for all future loads to that domain, without going through the redirect; this stops some HTTP-downgrade attacks by a MITM.
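
    To make the distinction concrete, here is a hedged Python sketch of the fingerprint-style pin check described in this comment. The pinned value is a made-up placeholder, and real clients (HPKP, Chrome's static pin list) pin a hash of the SubjectPublicKeyInfo rather than the whole-certificate fingerprint used here for simplicity.

```python
# Sketch of the fingerprint-pinning idea described above: the client ships
# with a known-good hash per domain and refuses the connection if the
# presented certificate doesn't match, regardless of who signed it.
# The pinned value is a made-up placeholder; real clients (HPKP, Chrome's
# static pin list) pin a hash of the SubjectPublicKeyInfo rather than the
# whole-certificate fingerprint used here for simplicity.
import hashlib
import socket
import ssl

PINS = {
    # placeholder fingerprint, not a real one
    "example.com": "0" * 64,
}


def connect_with_pin(hostname: str, port: int = 443) -> ssl.SSLSocket:
    ctx = ssl.create_default_context()  # normal CA validation still happens
    sock = socket.create_connection((hostname, port))
    tls = ctx.wrap_socket(sock, server_hostname=hostname)
    fingerprint = hashlib.sha256(tls.getpeercert(binary_form=True)).hexdigest()
    if fingerprint != PINS.get(hostname):
        tls.close()
        raise ssl.SSLError(f"pin mismatch for {hostname}: {fingerprint}")
    return tls
```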

Licensed under CC BY-SA with attribution

