Christian Heilmann


Archive for March, 2012

TTMMHTM: Batman helps kids with cancer, NIN nostalgia, future friendly things and love hotels

Thursday, March 29th, 2012

Things that made me happy this morning:

Sneak peek: Mozilla Evangelism Reps program – how to create a screencast

Thursday, March 29th, 2012

This is a small preview of a new thing we are working on in Mozilla. The Evangelism Reps program involves Mozillians getting help, mentoring and training to become public speakers, start blogging and running local events. All of this will be open and available on the Wiki. So here is a sneak peek.

How to create screencasts

Screencasts are amazingly powerful things. There is nothing better than showing how to use a tool or write a certain piece of code. There are several different types of screencasts:

  • Soundless screencasts – these just show how to do something. They can be one-off things for Twitter to show a certain effect. For example showing how something looks in browsers that might not support it: demo canvas cropper. These are also great for presentations. Instead of using a product live and wasting time typing in data, you can run the screencast and give the audience a blow-by-blow explanation of what is happening
  • Spoken screencasts – instead of just showing you explain what is happening. A demo would be the introduction to the 3D tester
  • Personal screencasts with overlaid video or cutting between video and demonstration – these are the most complex to do as you also need to look good and exciting. This is the most personal screencast type.

Regardless of which you go for, you should prepare and know your tools so here are some tips on how to do screencasts.


  • Get yourself a headset – the main reason is that your voice will be much clearer and you don’t have the issue of the mic recording feedback or outside noises like keyboard clicks. I use the Plantronics 655, which is affordable and comfy to wear
  • Turn off any social media channels and email clients on your computer – you don’t want any instant notifications popping up on the screen. You record now, this is all that should go on.
  • Have a script ready that you want to follow – this includes what you want to talk about and having all the things open that you want to show. Loading times of apps and sites you want to show in your screencast are wasted time.
  • Be prepared to record a few times as you will get stuck from time to time. You can stitch together one screencast from various steps.
  • Take breaks – don’t try to record everything at once. When you are ready with one section, pause recording, have a sip of coffee or a walk and then come back – you’ll sound much fresher.
  • Speak clearly and at a moderate pace. There is nothing more frustrating than a screencast where the presenter mumbles or is too fast to follow
  • You can record the screencast and then record your audio – in a lot of cases this will have better results
  • Plan your screencast and show only what is needed. Screencasts should follow a few rules
    • Be short – nobody wants to listen to hours of talk. If you can keep it under 3 minutes – win. You can also cut up longer topics into several screencasts
    • Be indexed – you should offer time stamps for people to jump to when covering a few topics so that more advanced viewers can, for example, skip basics
    • Be easy to watch – remember that not everybody will see the screencast fullscreen; many will watch it embedded. So use a larger font size in your editor and avoid examples that rely on low contrast or pixel precision. Video conversion in particular will blur things a lot
    • Be relevant – show what can be done, not what people need to set up to get there – the setup can go into accompanying text instead.

Recording tools

There are a few tools to do screencasts. Many are free, but it makes sense to spend some money as you avoid hosting issues, watermarks and limited features.


Screenr

If you don’t want to install anything, you can use Screenr to record a screencast on any computer. Screenr is a Java applet that allows you to define a part of the screen to record and gives you five minutes of screencast time. It can record the audio from your microphone, too. You sign up with Twitter and the videos are hosted on Screenr for embedding. You can also download the MP4 and send it directly to YouTube.

The downside of Screenr is the five minute limit, that you cannot edit the final screencast and that you need to crop a certain part of the screen rather than make a full screen recording that can be cropped and shifted afterwards. You can stop and start the recording, though.


iShowU

iShowU is a very minimalistic screen recorder for Mac that allows you to define a section of the screen and follow the mouse cursor. At $20 it is pretty cheap and does the trick.

Screenflow (recommended)

Screenflow is very much worth the $99 it costs. As the demos on how to use it show, it records the whole screen and you can then crop to what you need. You have several tracks to edit and shift, you can annotate your screencast, and there are effects to transition between its sections. Screenflow exports to YouTube or various local formats. I really got to like Screenflow as it also allows you to easily edit other video and images into your screencasts.


Once you are done recording, it is time to get your video out there. The simplest way is to upload it to YouTube or Vimeo – both are supported as direct uploads from the apps mentioned here. If you have the chance and the bandwidth, export and upload high quality video – you can always make it smaller later, but you can’t make a bad quality video better.

Seeing that we are an open company it seems prudent to avoid closed formats. Nobody wants to download a WMV and then have issues playing it. What I normally do is upload the original video to Amazon’s S3 for safekeeping (or use Dropbox) and then use a conversion service. The one I use converts any video into 20 formats and redirects the system you watch the video on to the correct one. Notice that free accounts are rate limited, so it might be a good plan to use them for conversion but host the videos yourself – or get a full account.

Are free apps evil?

Monday, March 26th, 2012

Lately there has been quite some debate about “free services” and what they mean to their users. A lot of it started with the excellent article “Don’t be a free user” by Maciej Ceglowski which described a few unpleasant truths:

  • When you use a free service, you are the product that gets sold
  • This means in the worst case that your identity information, likes and dislikes and connections with your friends are sold to third parties for data mining
  • If the service doesn’t require you to enter this information the content you add to the service becomes the product sold to third parties
  • Using a free service can also mean that you have no right to the content you add to it
  • You shouldn’t be surprised that you lose everything when the service goes belly-up

The article however also points out that the solution to a lot of these issues – paying for the service – is not that common. We’ve been conditioned to expect everything on the web to be free. That is why the article was so important.

Lately another issue came up with free services – this time related to the mobile market: free apps are a big culprit of battery drain on mobile devices because of the constant loading of ads and reporting of data back to the ad providers and the app.

Strange reactions

What bugs me about all this is not that free services have issues. What bugs me is how a lot of people on the web react to the issue. There is a sense of gloating and arrogant repetition of the points in the Pinboard article as “duh, you should know and expect this” and there is a general consensus of “you can’t expect quality when things are free” and that “free users are doing it wrong”. The latter is especially ironic when a comment like that comes from a Yahoo, Gmail or Hotmail email account.

I got a lot of feedback when I complained about the fact that Twitter sold old tweets in bulk to data mining companies. Whilst my “not cool, Twitter” tweet was easily one of my most re-tweeted of the last months, the feedback was very polarised. A lot of people were outraged (in some cases overly so), and an incredible number called me out as naive for thinking that would not happen. Very few people spotted the main problem with this move: not that data is sold, but that third parties get access to my data while I am denied it.

Basic denial of service

My beef with this was not that Twitter sells the data – I don’t care, as tweets are openly available on the web and I willingly publish them. My beef is that companies who pay for my data get better access to it than I do. The Twitter API changed drastically lately and it is not easily possible for me to get a whole archive of my tweets. The API defines that I can get “up to 3,200 of a user’s most recent statuses” and that’s it. Even for searching Twitter and getting access to more than your last 20 photos you need third party services that pull your information and store it on yet another server (which might be free and cause the same issue).
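
That 3,200 cap means any archiving tool has to page backwards through the timeline with `max_id` until nothing more comes back. Here is a sketch of that paging pattern – the endpoint name `statuses/user_timeline` and the `max_id` parameter are from the API docs, but `fetchPage` below is a local stand-in serving mock data, not a real network call:

```javascript
// Local simulation of paging through a user timeline with max_id.
// makeFetchPage stands in for GET statuses/user_timeline; it serves
// pages from an array of tweets sorted by descending id.
function makeFetchPage(allTweets) {
  return function fetchPage(maxId, count) {
    return allTweets
      .filter(function (t) { return maxId === null || t.id <= maxId; })
      .slice(0, count);
  };
}

// Keep requesting older pages until the hard limit (3,200 for the
// real API) is reached or no more tweets come back.
function archiveTweets(fetchPage, pageSize, hardLimit) {
  var archive = [];
  var maxId = null;
  while (archive.length < hardLimit) {
    var page = fetchPage(maxId, pageSize);
    if (page.length === 0) { break; }
    archive = archive.concat(page);
    maxId = page[page.length - 1].id - 1; // only older tweets next time
  }
  return archive.slice(0, hardLimit);
}
```

Against the real API the loop would stall at the cap no matter what: once 3,200 statuses have been returned, requests for older pages simply come back empty.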

I use Twitter a lot and a lot of people thank me for my contributions. In order to retain information I have Twitter linked up with Pinboard – which I pay for – to keep an archive of my links. I find it sad that I need to hack a service with another service just to keep access to my own data in the future.

Now, Twitter makes money with my tweets, but doesn’t give me an option to get all my data that went through it – and that seems to be totally fine for a lot of people out there. Notice that I am not asking for any user’s tweets – just mine. The argument of the “that is fine” people is that tweets are openly available on the web and thus fair game for mining anyways.

Misdirected blame

That data is freely available on the web for mining should not be a carte blanche for companies not to give you access to your archive. Why offer an API at all when it is “so easy” to mine that information? Naming the best quality of the web – simple publication – as a reason to deliver sub-par user access seems lazy to me. Even the ill-fated Oink now lets you get an archive of all the info you put in before it goes to the app farm in the sky where many a product bought by a larger corporation goes.

The same weird act of blaming the wrong thing happens with free mobile apps draining your battery. A lot of the comments and feedback blame the users: they are idiots for using free apps and don’t deserve any better.

Let’s think about this: the poster child of a successful app – the success every app developer strives to repeat – is Angry Birds. Now, try not to “be an idiot” and pay for Angry Birds in order to preserve your battery life. Not possible.

The flagship app and the Cinderella story of startup success is a free app.

And as much as playing it is fun, the experience with ads is absolutely awful. I much prefer playing it offline – there are no banners covering part of the level and the performance seems much better. Even the HTML5 version is plastered with ads – funnily enough for Angry Birds itself. And this is where the real issue comes into play.

A need for new advertising models and metrics

The concept of a mobile app that requires me to be online is wrong. Our mobile connections are too flaky for that. The great thing about apps is that they do one thing and have offline and local storage capabilities and don’t expect me to be online. That means I can use them on flights and in tunnels.

So what is broken here is really the ad display model. You normally put ads in your products by hosting a third party script and pulling the information from their servers. This worked on the web, and to make gaming the system a tad harder the ad providers want to host the code and the banners and do their own click tracking (to stop you from, for example, putting the banner in an iframe, moving it off-screen and reloading it automatically). In a mobile world, however, this needs re-thinking. I’d really like to see the numbers of banner clicks in mobile games – I can safely say the only time I clicked one was when I actually wanted to explode a bird or slice a piece of fruit, which means I got frustrated and much less keen on purchasing things.

One simple idea would be to allow app developers to download banners in bulk and rotate them locally rather than pulling them one by one. That way only clicking them would mean data transfers to the web. Display tracking could be handled locally. I found a few web banner services that do local caching like that. It means there has to be more trust between the app developer and the ad providers, but it would also mean you’d have offline ads.
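
A minimal sketch of that idea – all the names here are made up for illustration, and real click reporting would of course need the ad provider’s buy-in:

```javascript
// Sketch of a locally rotated banner pool: banners are fetched in one
// bulk download while online; rotation and impression counting then
// happen locally, and only clicks are queued for later reporting.
function BannerPool(banners) {
  this.banners = banners;   // downloaded in bulk while online
  this.index = 0;
  this.impressions = {};    // local display tracking, no network needed
  this.pendingClicks = [];  // sent to the ad provider when back online
}

// Rotate to the next banner and count the impression locally.
BannerPool.prototype.next = function () {
  var banner = this.banners[this.index];
  this.index = (this.index + 1) % this.banners.length;
  this.impressions[banner.id] = (this.impressions[banner.id] || 0) + 1;
  return banner;
};

// Clicks are the only events that really need the network,
// and even those can wait in a queue.
BannerPool.prototype.click = function (bannerId) {
  this.pendingClicks.push({ id: bannerId, at: Date.now() });
};

// Call with a network function once a connection is available.
BannerPool.prototype.flush = function (send) {
  this.pendingClicks.forEach(send);
  this.pendingClicks = [];
};
```

The point of the sketch is that the battery-hungry part – a network round trip per banner display – disappears entirely; the device only talks to the ad server during the bulk download and the occasional flush.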

The second big problem is metrics. Right now all we measure as the success of apps is the number of users and downloads and, when it comes to banners, of course the number of clicks. This means that free apps will always be deemed much more successful than the ones that ask for payment upfront. So, seeing the issues with free apps, maybe it is time to reconsider how we measure the success of apps and get less excited about playing the numbers game.

Discussion on Google+

The web is the platform – presentation at MDN hackday in NYC

Sunday, March 25th, 2012

Yesterday we went to New Work City for the MDN hack day and I kicked off the day with a talk about HTML5, the opportunities it brings for developers and what people can play with during the day.

The slides are available and I recorded a screencast of me presenting them. The screencast is available on YouTube or in various HTML5-compatible formats (embedded below):

If you want an audio only version, you can find that one here:

Here are the things I am covering in the talk:

Time to board my plane, more details on the MDN work week and the hackday on the Mozilla blog soon.

[watching] Creating responsive HTML5 touch interfaces

Friday, March 16th, 2012

In another episode of “things Chris watched in the gym and you should, too” here is a great video from the BayJax event series normally held in the Yahoo offices in Sunnyvale, California.

In this presentation Stephen Woods (@ysaw) of Flickr talks about touch interactions and how they used them in the tablet version of Flickr. You can watch it on YouTube or embedded here:

The slides are available on Slideshare and there is even an updated version Stephen presented at SXSW that is a bit less Webkit centric.

Here is what I liked about the video:

  • It is 24 minutes long – enough to burn 273 calories
  • Stephen knows his stuff without showing off. There is no ego, just a very relaxed presentation and great info. No need for pep rallies or showmanship, get the info, use it as you think fit
  • There are no “this works magical and is awesome” moments, you learn the good with the bad – it is a very realistic talk
  • It is a great mix of UX concerns, performance explanations and technical info. As in – you see how things were implemented, not how they theoretically should work backed up by random benchmarks
  • I found the first utterly sensible and necessary use case for CSS matrix transforms (simulating pinch to zoom)
  • It has cats
  • It never tries to show off with awesome knowledge or clever large words for simple solutions. Stephen openly admits to having come to the conclusions shown by trial and error
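
As a footnote on the matrix transform point: the use case is scaling an element around the pinch centre rather than its top-left origin. This is my own illustration of the maths, not code from the talk – translate to the centre, scale, translate back, which collapses into a single `matrix()`:

```javascript
// Build a CSS matrix() transform that scales around a given point.
// matrix(s, 0, 0, s, tx, ty) with tx = cx * (1 - s), ty = cy * (1 - s)
// is the composition of translate(cx, cy) scale(s) translate(-cx, -cy).
function pinchMatrix(scale, centreX, centreY) {
  var tx = centreX * (1 - scale);
  var ty = centreY * (1 - scale);
  return 'matrix(' + scale + ', 0, 0, ' + scale + ', ' + tx + ', ' + ty + ')';
}
```

A pinch handler would recompute this on every touchmove and assign the result to the element’s transform style, keeping the point between the user’s fingers fixed on screen.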

This is a great example that, whilst everybody and their aunt will happily tell you that Yahoo is up a small polluted waterway without proper means of propulsion, there is great talent and great work being done there. It also shows that when web technology information comes from people who have to deliver to users rather than push a certain product, you can get wonderful insights without being hit over the head with them. I liked it. Great job, Stephen.