Hamster Plus Hotspot Equals Web 2.0 Meltdown NOT

Wed, 15 Aug 2007 09:28:40 GMT
by pdp

Robert Graham (CEO, Errata Security) gave his Web 2.0 hijacking presentation to a packed audience at Black Hat 2007 today. The audience erupted with applause and laughter when Graham used his tools to hijack someone's Gmail account during an unscripted demo. The victim in this case was using a typical unprotected Wi-Fi hotspot, and his Gmail account just popped up on the large projection screen for 500 or so audience members to see. Of course, had the poor chap read my blog about email security last week, he might have avoided this embarrassment. But for the vast majority of people using Gmail or any other browser or "Web 2.0" application, they're all just a bunch of sheep waiting to be jacked by Graham's latest exploit. Hamster plus Hotspot equals Web 2.0 meltdown!

I have nothing against Robert, he is a good guy, but I have to say that his research has nothing to do with Web2.0. Man-in-the-middle attacks have been known for ages, and being able to sniff the session identifier from an HTTP connection over an unprotected/unencrypted channel is not new. Of course it works. I mean, of course it works. And yes, do not use Telnet, because someone will be able to capture your credentials. Of course it works! It is an unencrypted channel, which means that everyone on it can see the traffic.

Cookies are a standard mechanism to imitate statefulness for otherwise stateless HTTP connections. If someone sniffs them off the air, they will be able to impersonate the session they support. That is it! Finito! And, by the way, you don't need any special tools to do all that. All you need is bash and some very basic utilities you can find on any standard Unix/Linux distribution. Here is an example:

  1. Start Kismet
  2. Read the Kismet dump file
  3. Extract Strings
  4. Match and extract cookies


> kismet &
> tail -f kismet.dump | strings | grep -iE 'Set-Cookie:|Cookie:'

Here you go! So, I don't really understand what all the fuss is about. Again, I repeat: this is not a Web2.0 problem. And I repeat: this is not a Web2.0 problem.
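If you don't have a wireless capture to hand, the extraction step above can be sketched against any clear-text HTTP traffic. A minimal offline sketch — the host and the SID value below are made up for illustration:

```shell
# Simulated clear-text HTTP request, as it would appear in a sniffer dump.
# The Host and the SID value are hypothetical.
printf 'GET /mail/ HTTP/1.1\r\nHost: mail.example.com\r\nCookie: SID=ab12cd34ef56\r\n\r\n' \
  | grep -ioE '(set-)?cookie: [^[:cntrl:]]+'
# Prints: Cookie: SID=ab12cd34ef56
```

Replaying the stolen cookie is equally trivial — something like `curl -b 'SID=ab12cd34ef56' http://mail.example.com/mail/` — and the server has no way to tell the attacker's request from the victim's.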

Archived Comments

If memory serves me correctly, Graham *sort of* mentioned that this wasn't entirely Web 2.0-centric, but he just highlighted his PoC's applicability to Web 2.0 style applications. In any case, you (and just about anyone else who's blogged about Graham's presentation) are completely correct: it's not a Web 2.0 problem. People should knock off the "OMG Web 2.0 is broken...again!" nonsense.
I understand that Web2.0 is a buzzword and that using it pretty much guarantees good media coverage of your work, but let's not abuse it. There are problems with Web2.0, but they have nothing to do with AJAX (not directly), nor with WiFi sniffing as presented by Graham.
Gareth Heyes
All the more reason to use a VPN on public wi-fi.
or at least tunnel via SSH...
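The SSH route is a one-liner. A sketch, assuming you have a shell account somewhere — the host and user below are hypothetical:

```shell
# Open a local SOCKS proxy on port 1080, tunnelled through SSH
# (-N: no remote command, -D: dynamic/SOCKS port forwarding).
# Then point the browser's SOCKS proxy settings at localhost:1080.
ssh -N -D 1080 user@some-host.example.com
```

Everything the browser sends then leaves the hotspot encrypted, which defeats passive sniffing of the session cookie.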
I'm glad you posted this as I was also thinking wtf all the fuss was about. It just goes to show that this industry is like any other, give it 10 years and the topic will come back into fashion, like flared pants :)
You continue to abuse terms yourself: Web 8.0 Mashup Hacking with Yahoo Tubes. WTF? Can I laugh? Where is the technical stuff you published at the beginning of this blog?
First of all, nobody is laughing. Come on, we are not kids.
Web 8.0 Mashup Hacking with Yahoo Tubes. WTF?
well, yes. Yahoo Pipes is a Web2.0 technology, so I don't see any problem with using Web2.0 terminology. Moreover, the Pipes interface proves one thing: I can spider Web applications in search of vulnerabilities, circumventing to an extent the same-origin policy. That wasn't possible before. There is more to it, but you will hear about it soon. So yes, it is new, and yes, it is Web2.0. So, what exactly is your point, Galeazzi? The technical stuff is still on the blog, but I have to agree with you that there was a sort of dry period lately. The reason for this is mainly that I was involved in two huge projects, the XSS Book and the Google Hacking for Pentesters vol 2 book. However, there is a lot going on in the background that you cannot see. :) So, stay tuned.
Daniel, you are completely right: everything new is a well-forgotten old thing.
Hehe, pretty funny, although everyone has known that wireless connections are insecure since day one. So, yeah, you are right.
Isn't this why SSL was invented? That some webmail providers don't support it isn't worthy of a BlackHat talk. If anything, all this shows is that BlackHat needs some better people to review submissions. However, I do agree with the above poster that you are also guilty of publishing the same old thing again and again, pdp. Your JavaScript Spider is just another take on Jikto which was just another take on your own earlier research. Any service which will "launder" http requests for you enables partial violation of same origin. Big deal. The fact that new services (pipes, etc) are now out there that make it easier isn't anything new - much like sniffing cookies over wifi isn't new. So while I agree that perhaps "Web2.0" is a misnomer for Graham's work, you have also been rehashing the same old thing for quite some time. Yahoo Pipes will now send a post for you! So, who really cares? Anyone can create a similar service with a very cheap webhost account. You could probably do it with netcat and bash. It does not equate to "MAGE POWERFUL".
rezn, first of all, "MAGE POWERFUL" was a play on words which obviously didn't succeed in getting the message out the way I had pictured it in my head. Anyway, I see what you are saying. I completely agree with you that the JavaScript spider was a rehashed version of Jikto, and yes, Jikto is pretty much the proxy POC I published last year, but this is not what my talk and the work on GNUCITIZEN are all about. It is about agents. It is about autonomous robots that live on the surface of the web. Let's face it, what's the point of discovering vulnerabilities on the fly? I mean, what's the point of having an XSS scanner written in JavaScript? It makes no difference at all. It is slow, sloppy and highly ineffective. So why? Let's forget about it. If you combine several key components of the so-called Web2.0, you can really come up with something nasty and probably worth our attention. This thing will be based primarily on services which we cannot easily shut down, and it will have recovery processes to ensure preservation. The POCs (the JavaScript spider and the XSS scanner) are just demonstrations of certain features that will possibly be included to one degree or another. What if I tell you that JavaScript can receive as well as send emails? It starts to get interesting, right? What if the attacker has several agents spread across the web which he/she can control via distributed broadcast messages? Now we completely change the game. Where is the head of the worm? There isn't one. How do I stop this? You can't! I hope that with my next presentation it will get a lot clearer what I mean. I am sure that I could convince you of what I believe and show you the value of the research if we talked in person. However, the truth is that we have to deal with virtual boundaries, and it is completely my fault for not getting the message out as clearly as possible today.
Nothing to do with WEB 2.0. This is just a design problem with most web apps that use login-form authentication (99% of web apps). Even if SSL is supported during authentication, the connection downgrades to clear-text HTTP for overhead reasons right after you submit your username and password. However, this attack remains one of my favorites against hotspots. It's easy, passive and works like a charm. As long as Robert doesn't claim it is new or WEB 2.0 related, I don't see a problem with this research. My wife was amazed when I tested this against her Gmail account. This means that although most people in the security community know about this attack, most average users don't. And remember, there is NO idle session timeout on Gmail. And who clicks on logout? Only geeks do :D
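The downgrade problem is visible in the cookies themselves: a session cookie that lacks the Secure attribute travels over plain HTTP too, and is therefore sniffable even when the login itself happened over SSL. A small offline sketch — the header values below are made up:

```shell
# Two hypothetical Set-Cookie headers from a webmail login response.
# Only the second one is restricted to HTTPS by the Secure attribute;
# the first accompanies every clear-text request and can be sniffed.
headers='Set-Cookie: SID=ab12cd34; Domain=.example.com; Path=/
Set-Cookie: SSID=ef56ab78; Domain=.example.com; Path=/; Secure'

# Show the cookies a passive hotspot sniffer could realistically capture
printf '%s\n' "$headers" | grep -iv 'secure'
# Prints: Set-Cookie: SID=ab12cd34; Domain=.example.com; Path=/
```

This is exactly why a post-login downgrade to HTTP undoes the SSL login: the session identifier itself keeps riding the unencrypted channel.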