Christian Heilmann

Posts Tagged ‘xss’

Liberté, Accessibilité and Securité – that was Paris Web 2009

Tuesday, October 13th, 2009

Last week I went to Paris, France to speak at a Yahoo Developer Network event and at Paris Web. Paris Web is a web development, design and accessibility conference that has been running for four years now (I think), and this was the third time I have spoken there.

My presentation – basic housekeeping

Originally I planned to speak about my favourite topic – the web of data and how to use it – but the organisers had other plans for me. Normally I hate changing my topic and being asked to do specials, but I have a soft spot for Paris Web, so I talked about web security instead. The slides of my “basic housekeeping” talk are available on SlideShare:

In the talk I covered some very basic measures you can take to protect your web site from becoming a spam hub or part of a botnet, or from simply getting spammed. I pointed out the following mistakes people make:

  • Underestimating the severity of web application security holes – it is not only about your server but also about your users, who tend to use the same passwords all over the place.
  • Keeping folders listable and thereby allowing people to find vulnerable scripts and dig into data they shouldn’t be able to see (the example I showed was eat.co.uk failing to protect their /cgi folder and thus allowing full access to an admin section and listing their DBs)
  • Allowing search engines to index admin sections of web applications (I proposed using robots.txt but, as one attendee pointed out in the Q&A, protecting them with .htaccess makes a lot more sense)
  • Keeping error messaging on and thus allowing people to gain insight into your server setup
  • Having an insecure PHP setup with register_globals enabled, which allows for overriding security checks and for remote code injection (running phpsecinfo can help you find these issues)
  • Blindly relying on software and not testing installs; also, not changing preset user names and passwords (as an example, try the user and password “builtin” on any Ektron-powered web site)
  • Not keeping installs and plugins up-to-date
  • Relying on HTML as a source of data for JavaScript/Ajax (I can easily manipulate this in Firebug)
  • Relying on JavaScript – you can’t; if you use it as the only means of validation, turning off JavaScript allows attackers to inject any kind of data.
  • Leaving information inside the HTML by commenting it out in HTML – always comment on the server side instead.
  • Not filtering inputs.
  • Trying to filter out bad input instead of whitelisting known good input.
  • Allowing your pages to be included in iframes and thus allowing for clickjacking (a sketch covering both of these points follows this list).
  • Failing to provide easy to use and stress-free interfaces and thus allowing for social engineering (“This is too hard for you, give me your password and I will fill this out for you”).
  • Staying authenticated and logged in over long periods and thus allowing attackers to make you click on web sites that contain CSRF traps (the example was demo code that could get protected Twitter updates).
  • Giving users the impression that you are the one responsible for security instead of it being the job of both the user and the site provider.
  • Relying on Captchas as a sole measure against bot attacks (check PWNtcha for a captcha cracking tool).
  • Not periodically checking your logs for hacking attempts.

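To make the whitelisting and clickjacking points from the list concrete, here is a minimal PHP sketch – the “language” parameter and its allowed values are made up for illustration, not taken from any real application:

<?php
// Whitelisting: only accept values we explicitly know to be good.
// The "language" parameter and its allowed values are hypothetical.
$allowed = array('en', 'fr', 'de');
$language = isset($_GET['language']) ? $_GET['language'] : 'en';
if (!in_array($language, $allowed, true)) {
    // Anything not on the whitelist falls back to a safe default
    // instead of being "cleaned up" and then trusted.
    $language = 'en';
}

// Anti-clickjacking: tell browsers not to render this page inside
// frames on other domains (headers must be sent before any output).
header('X-Frame-Options: SAMEORIGIN');

// Escape on output regardless, as a last line of defence.
echo 'Language: ' . htmlspecialchars($language, ENT_QUOTES, 'UTF-8');
?>
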
I then quickly went over some of the ideas we now have in place to make the web easier to use and at the same time safer: Guest Passes, one-off logins, OAuth, OpenID and Caja.

I explained the security threats and trends in phishing, social networks, the mobile web, camera access, geolocation access and biometric recognition.

I had good feedback and I love speaking in France. You can make jokes and people are happy to laugh out loud when you bring up things that are just not expected.

The rest of the conference

Good, unconventional presentations are a general thing at Paris Web. The speakers not only really know what they are talking about but are also happy to be unconventional when it comes to presenting. While the slides of Paris Web only hint at that, you can find some very cool photos of what is going on there.

Paris Web 2009 : Day 2 by ~Thanh.

Also check out the video:

The location is very luxurious (IBM’s HQ in France) and has all the latest systems you need for presenting – microphones, a great projection system, on-stage monitors, live translation and so on. The catering was very impressive and the food was – well, it is France, we don’t need to say more.

Double budget approach

The other great thing that Paris Web does – and that other conferences should copy – is that on the day after the conference there are workshops with the speakers, who are happy to give them for a very low price (last year it was 10 Euros; I am not sure what it was this year). This allows students who cannot afford the main conference to come only on the Saturday and still take advantage of the experts coming to Paris.

All in all I am always very proud to be part of the conference and to see the enthusiasm and the great things that happen in France when it comes to advocating web standards, future technologies and professional ways of working as web designers. The strong streak of accessibility and usability that complements the high-tech talks makes it a useful conference for anybody who creates work for the web.

Alas, there is one issue.

The language barrier

As the conference is held predominantly in French, a lot of the great insights, information and practices are lost on non-francophones. This is a shame, as I am very impressed with the pragmatic approach of the talks. There is not much “blue sky” thinking but very down-to-earth information on how to build better products, how to talk to your boss in the right way, how to make web development an important part of your company’s portfolio, and a lot of talks about the quality of our work and pragmatic accessibility. All the talks are filmed and transcribed, and it would be a great step for Paris Web to translate the transcripts – maybe that is something that could be done with crowdsourcing?

Is it getting harder and harder to show very easy examples?

Tuesday, April 7th, 2009

I am right now teaching a four-day class on DOM and Ajax in Sunnyvale, California, and also doing some tech editing on Scriptin’ with JavaScript and Ajax by Charles Wyke-Smith, and I am finding one thing pretty worrying: easy examples of web development practices are dangerous to show these days.

I’m talking about practices that make it easy to get quick results and give readers and attendees that warm, fuzzy “I am getting this – this is easy” feeling.

One very obvious example is form validation and re-rendering of a form using PHP_SELF and displaying user data using $_GET or $_POST. Unfiltered, these are a free invitation for any XSS attack and will turn your server into a spam hub or botnet drone. Explaining XSS countermeasures is normally out of scope for an example that only shows how a progressively enhanced form works.
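
As a minimal sketch of the problem and the fix (the form itself is made up for illustration): the naive version echoes PHP_SELF and the submitted value straight back into the page, while escaping everything on output closes the hole.

<?php
// Naive self-submitting forms echo $_SERVER['PHP_SELF'] and user input
// straight into the markup, so a request like
// /form.php/"><script>alert(1)</script> executes in every visitor's
// browser. Escaping on output closes that hole.
$action = htmlspecialchars($_SERVER['PHP_SELF'], ENT_QUOTES, 'UTF-8');
$name = isset($_POST['name'])
    ? htmlspecialchars($_POST['name'], ENT_QUOTES, 'UTF-8')
    : '';
?>
<form action="<?php echo $action; ?>" method="post">
  <label>Name: <input type="text" name="name" value="<?php echo $name; ?>"></label>
  <input type="submit" value="Send">
</form>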

The same applies to simply outdated ideas like inline event handlers. It is easy to show an example that uses a few onclick attributes, but explaining event handling really well takes a bit of time. Again, this is something that does not really fit into the scope of a DOM course.

I do however think that it is important to get this in there, as there is no such thing as knowing one technology in the web development stack in isolation and being able to use it. There is a lot of overlap with other areas, and in order to be a good developer and play well with others you need to be aware of the effects of your work and of the areas where it overlaps with your colleagues’ skill-sets.

The other extreme I find myself drifting towards is being over-cautious. I went through the tough times of the first browser wars and came away with a deep-rooted mistrust of anything a browser tells me is OK to do and use. However, I get the feeling that it doesn’t really matter any more if Internet Explorer has a problem with name vs. ID or whatever other shenanigans we have to be aware of when we build things from scratch.

I do get the distinct feeling that not building on top of a good client-side library is simply a waste of time these days. Libraries allow us to write code instead of working around bugs and wondering what other safety measures we have to put in.

That’s why I started asking people in my courses to use Firefox with Firebug and a good text editor to code along. Today I managed to breeze from writing HTML that is ready for internationalisation and works with assistive technology, through simple DOM access to the document, to writing a validation script for a form using generated DOM content. By concentrating on how things are meant to work instead of debugging random issues I got the students to reach far into the matter in a day – even those who had never touched JavaScript before.

Maybe it is time to get beginners accustomed to a market that builds on working solutions and benefits from browser abstraction via libraries, rather than teaching development from total scratch – bad browsers, and bad people taking advantage of any technology to gain access or spam us, seem to have made that way of working redundant.

Let’s make 2008 the year of embracing the server side with Ajax

Sunday, December 30th, 2007

I am always fascinated by the amount of Ajax tutorials and examples out there that totally ignore the backend part of an Ajax app. A lot of the time you’ll find page-long ravings about the 6-7 lines of JavaScript that allow the client to make an HTTP request, but when it comes to talking about the proxy script needed to allow for cross-domain requests, a lot is glossed over with “you don’t need to know this, just use this script”.

That would not really be an issue if the scripts on offer weren’t that bad. Unsanitized URLs are the main attack vector for cross-site scripting attacks. If you use an unfiltered PHP_SELF as the action of your forms, you shouldn’t be too confused about a lot of mail traffic from your server or about text links on your site that you didn’t sign off and don’t get money for.
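
Here is a sketch of what a safer proxy could look like – the “feed” parameter and the service URLs are hypothetical; the point is that the script maps a known name to a hard-coded list of services instead of fetching whatever URL it is handed:

<?php
// Cross-domain Ajax proxy that refuses arbitrary URLs. The "feed"
// parameter and the service URLs are made-up examples; the script
// only ever requests hosts from this hard-coded list.
$services = array(
    'news'    => 'http://example.com/api/news.json',
    'weather' => 'http://example.com/api/weather.json',
);
$feed = isset($_GET['feed']) ? $_GET['feed'] : '';
if (!isset($services[$feed])) {
    header('HTTP/1.1 400 Bad Request');
    exit('Unknown feed');
}
header('Content-Type: application/json');
// Requires allow_url_fopen; swap in cURL if that is disabled.
echo file_get_contents($services[$feed]);
?>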

The other thing about Ajax information on the web that amazes me is that people keep complaining about the slowness and problems with converting data from one format to another on the client side. Let us not kid ourselves: even after all the articles, books and podcasts about Ajax we still have no clue whatsoever what a visitor uses to look at our products. We cannot tell for sure what browser is used, if there is assistive technology involved or anything about the specs of the computer the browser runs on. This to me makes the client side the least preferable place to do heavy calculation and conversion.

The server side, on the other hand, is in your control and you know what it can do. Complex regular expressions, XSLT conversion – all of this is much easier to do on the backend, and you know that the text encoding will work to boot. A lot of the complexity of Ajax apps stems from bad architecture and design decisions, and from relying on the client side to provide necessary functionality.

So if you ask me what the client-to-server code ratio of a good Ajax app is, I’d say 30% client and 70% server. The 70% on the server should be used to provide security, non-JavaScript fallback functionality (yay accessibility) and conversion of data into small, easy-to-digest chunks for the client (think HTML and JSON). The 30% client-side code should mainly be used to enhance the usability of the product and make it easier for your visitors to reach their goals.
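
As a sketch of what those easy-to-digest chunks could look like on the server side – getArticles() stands in for whatever data access layer you have; it is not a real API:

<?php
// The server does the heavy lifting (querying, filtering, trimming)
// and hands the client a small JSON payload that is ready to render.
// getArticles() is a hypothetical data-access function.
$articles = getArticles();
$payload = array();
foreach (array_slice($articles, 0, 10) as $article) {
    $payload[] = array(
        'title' => $article['title'],
        'url'   => $article['url'],
    );
}
header('Content-Type: application/json');
echo json_encode($payload);
?>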

So here’s my plan for 2008: whenever I talk Ajax I will try to cover as much backend as frontend. I’ll do this by partnering with other experts as I myself created some terrible PHP in the past. I hope that others will follow that example as Ajax is a wonderful opportunity to bridge the gap between frontend and backend engineering – and we have to talk to each other to create a good app.

wp-super-cache cached too far for me (and others)

Thursday, November 8th, 2007

Having just upgraded this WordPress install to the new version I wanted to have the whole goodness and installed wp-cache to get static pages of my posts. However, it seems that the newly released wp-super-cache plugin for WordPress had some nasty vulnerabilities.

The first to report this to me was Chris Messina on Twitter, followed by Stefanie Sullivan reporting that Tiffany Brown was having the same issues. Checking the folders the plugin created, I found the same two injection attempts Tiffany mentioned. The caching allowed code injected as .txt URLs via the “i” or “s” parameters to be executed.

In my case I found that half my server was mirrored into the supercache folder in the plugin’s cache folder. Not good.

I was happy to see that my etc folder and other more interesting bits had not been reached before I deactivated the plugin. Right now I am playing grepmaster to see if there are any injections left. My action: deactivated and deleted all caching plugins and their cache folders (best done via SSH, as FTP is a PITA with that many files).