Christian Heilmann


Archive for May, 2021

The (new) Developer Advocacy Handbook is live!

Monday, May 31st, 2021

I just finished the writing part of the new edition of the Developer Advocacy Handbook. I wrote the original almost 15 years ago, and this edition is a heavy rewrite. A lot has changed in the world of Developer Advocacy, and I tried to add my findings of the last few years to the book. It gives you all the basics of becoming a Developer Advocate and explains which channels to cover and how. I deliberately moved away from favouring certain products or social networks, as these things change at a fast pace.

You can find the book at Developer-advocacy.com and here is the high-level table of contents:

Book chapter shown in the browser

Hosted on GitHub – contributions welcome

The book is fully written in Markdown and hosted on GitHub. I am also now opening the repo for contributions, so if you find something you’d like to change, please file an issue or create a fork. The book is licensed under a Creative Commons non-commercial licence, so all contributions are welcome.

Next steps

I have started recording audio versions of the chapters; drop me a line to tell me whether you think that’s a plan worth pursuing further or not.

Automated accessibility testing gets you on the way, but doesn’t find all the problems

Monday, May 17th, 2021

Testing your products for accessibility issues is a tricky thing. Your users have all kinds of different needs and setups. They use computers in ways that appear daunting to you and are hard to replicate. Any sighted person who has used a screen reader to test their products can verify this. Often we turn off the sound and rely on a text log of the screen reader or an on-screen display.

The problem is that we’re not used to interacting with our products in ways other than our own. There is a high chance that we do things wrong when we emulate how a certain group of users would interact with our product. Assistive technology is complex. There are, for example, dozens of different keyboard interaction patterns to learn when you use a screen reader. And this isn’t even specific to accessibility. It is incredible to see users in tests consume your products in ways you never thought of. And it is painful to see how often things that you consider “obvious” and “easy” are a barrier to a lot of people.

The best-case scenario is that you have access to a diverse testing team who can bring these other points of view and experiences to your product. In my job, I am lucky to have a dedicated team of testers with different abilities and setups. Bug reports come with videos of them using the product and information about where they got stuck. But not everybody has that opportunity. And it still means that it is sometimes hard for me to replicate the process.

Often you find developers using automated systems to run a quick test on their products to find the most obvious problems. Which is good, as it means you find the lowest-hanging fruit and can deal with it. That cuts down the list of bugs that the real test team has to deal with. But accessibility testing is interaction testing, and this is where automated processes fail. Many accessibility issues only show up when you start interacting with the product.

I am explaining this in my Skillshare class on accessibility. You will find that a lot of accessibility issues only show up in interaction.

Take for example the demo page of the class:

Demo page with some accessibility issues

An automated test will discover the glaring accessibility issues: missing alternative text on images, missing form labels, and contrast issues.
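As a rough sketch of what such automated checks do under the hood (with hypothetical markup and a deliberately naive checker — not how axe-core or Lighthouse actually work), a static scan for missing alternative text could look like this:

```javascript
// Naive static check, similar in spirit to what automated audits do:
// flag <img> tags that have no alt attribute at all.
// (Real tools parse the rendered DOM; this regex sketch only
// illustrates the idea on an HTML string.)
function findImagesWithoutAlt(html) {
  const imgs = html.match(/<img\b[^>]*>/gi) || [];
  return imgs.filter((tag) => !/\balt\s*=/i.test(tag));
}

// Hypothetical markup with the kind of issues such tools flag
const page = `
  <img src="logo.png">
  <img src="photo.jpg" alt="A cat sleeping on a laptop">
  <input type="text" id="search">
`;

console.log(findImagesWithoutAlt(page));
// flags only the first image, which lacks an alt attribute
```

Real audit tools apply hundreds of rules like this one to the rendered page, which is exactly why they are good at spotting missing attributes and much worse at spotting broken interactions.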

The Issues panel in the browser developer tools, for example, shows these.

Accessibility reports in the Developer Tools Issues Panel

If you run a Lighthouse audit on the product you also get them.

Lighthouse accessibility audit

And Accessibility Insights fast-pass also flags up these issues.

Accessibility insights report next to page with flagged up problems highlighted in the document

Each of these gives you a false sense of success though. Because when you look deeper, there are still some obvious issues to fix.

While automated systems correctly flag up the search form as a problem, they ignore the donation form on the left-hand side.

Forms in the demo page highlighted

The donation form has a label associated with the form field, which means it passes this test. But all the buttons in that form are DIVs and have no meaning to assistive technology. You can’t even access them with a keyboard; you can test that yourself by trying to tab to them. All you can reach is the form field itself, and you can’t even hit `Enter` to submit the form, as there is no submit button.
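To illustrate the gap (again with hypothetical markup, not the actual demo page): a label check passes this form, and nothing flags the clickable DIV. A manual heuristic would look for click handlers on elements that aren’t native controls — which is essentially what you do by hand when you tab through the page:

```javascript
// Sketch of why DIV buttons slip through: flag elements with click
// handlers that are not native interactive elements. (A sketch only;
// real manual testing means tabbing through and checking name/role.)
function findFakeButtons(html) {
  const clickable = html.match(/<\w+[^>]*\bonclick\s*=[^>]*>/gi) || [];
  return clickable.filter((tag) => !/^<(button|a|input)\b/i.test(tag));
}

// Hypothetical donation form: the label check passes, but the DIV
// "button" has no role, no keyboard access, and no submit behaviour.
const form = `
  <label for="amount">Amount</label>
  <input type="text" id="amount">
  <div class="btn" onclick="donate(5)">5 Euro</div>
`;

console.log(findFakeButtons(form)); // flags the DIV only

// The fix is a native control: <button type="submit">Donate</button>
// is focusable, announced as a button by assistive technology, and
// makes Enter submit the form for free.
```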

Using a keyboard to navigate the document is the best way to find these issues. You can also gain some insights by checking the accessibility tree, or by using the Inspect tool to find out whether the element is keyboard accessible and whether it has a name and a role.

HTML page source shown in the developer tools with the accessibility tree showing up that the role of a DIV is generic and that it has no name, regardless of its text content

The Inspect tool shows in a more obvious fashion that the element has no name and only a generic role and that it isn’t keyboard accessible.

Inspect tool overlay showing the information of the broken button

Another problem that an automated test cannot detect without extra effort is changes in display mode. For example, in the dark colour scheme of the page, the Issues panel reports two contrast issues.

Issues panel of developer tools detecting 2 contrast issues in dark mode

When you switch your OS to light mode (or use the emulation in developer tools) you will find that there are six contrast issues.
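The underlying check is the WCAG contrast ratio, and because a themed page uses different foreground/background pairs per mode, every pair has to be tested in every mode. As a sketch with made-up example colours (not the demo page’s actual palette), you can compute the ratio yourself:

```javascript
// WCAG 2.x relative luminance of an sRGB colour (0–255 channels)
function luminance([r, g, b]) {
  const lin = [r, g, b].map((v) => {
    const c = v / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * lin[0] + 0.7152 * lin[1] + 0.0722 * lin[2];
}

// WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05)
function contrast(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// Example: the same grey text can pass in one mode and fail in the
// other. AA for normal text requires at least 4.5:1.
console.log(contrast([153, 153, 153], [17, 17, 17]).toFixed(2));
// about 6.6:1 on a near-black background – passes AA
console.log(contrast([153, 153, 153], [255, 255, 255]).toFixed(2));
// about 2.8:1 on white – fails AA
```

This is why the same stylesheet can produce two issues in dark mode and six in light mode: each colour pair needs to clear the threshold against each background it is actually rendered on.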

Issues panel of developer tools detecting more contrast issues in light mode

This proves one thing: the earlier in your product’s process you take accessibility into consideration, the better the result will be. Any interaction is a possible accessibility barrier and needs to be flagged up for testing.

Experts know that. In a survey of web accessibility practitioners, 17.6% stated that only half of all accessibility issues can be detected by automated testing.

The interesting thing is that the tools even point that out. If you expand all the information in a Lighthouse audit report, you will see that the tool mentions that a lot of manual testing is needed. The question is: how many people went that deep down the rabbit hole and looked closer when the accessibility score was already high enough?

Lighthouse asking you in a very timid way to also do some manual testing

A few years ago Manuel Matuzovic proved in an – admittedly contrived – example that you can create a fully inaccessible site with a 100% score on Lighthouse.

The main thing to remember is that automated tests are a great way to get started and find the most obvious flaws. But they are not the end of the process if you really want to create accessible products.

Want to learn more? Check out the class on Skillshare where – amongst other things – I talk through this process of moving from automated to manual testing.

Screenshot of the Skillshare course on accessibility testing

I have a new course on Skillshare: Product Management: Tools for Improving Product Accessibility

Tuesday, May 11th, 2021

Introduction photo of the course with me sitting at my desk

I’m super happy to announce that a new class of mine is now live on Skillshare. The title is Product Management: Tools for Improving Product Accessibility, and in it I cover a lot of tools that automatically check your products for accessibility but, more importantly, what you can do in your browser to make sure your products truly are accessible.

The course consists of seven lessons and is 36 minutes overall.

1. Introduction (3:59)
2. Accessibility and Why it Matters (4:21)
3. Automated Accessibility Tools (2:32)
4. Testing Color and Text Accessibility (9:34)
5. Testing Interaction Accessibility (8:48)
6. Testing Media and Image Accessibility (6:19)
7. Final Thoughts (0:35)

Here’s what Skillshare said about the class:

Reach more users of your product than ever before with Microsoft principal program manager Christian Heilmann!
Product accessibility is becoming increasingly essential to meet our ever changing needs and varied uses of the internet. Join Christian by putting yourself in the shoes of internet users, enabling you to develop products that are not only compliant with accessibility regulations, but flexible for a spectrum of uses.
Together, with Christian, you’ll learn:

  • What it means to make your product accessible and why it matters
  • The benefits and restrictions of using accessibility tools
  • How to test for color and text accessibility
  • How to test and fix issues of interaction accessibility
  • How to test and increase media accessibility

Whether you’re a developer who is looking to make your products easier to use or you have an interest in product accessibility in general, this class will get you to think like a user of your product, ultimately meeting the needs of more people.

If you aren’t already a Skillshare member, you can sign up and check out the course with a 14-day Premium Membership trial.

Given the current situation, I didn’t fly to New York to record the class but had a cameraman come to my place. So this course shows you how I work each day which, interestingly enough, is the topic of another, upcoming, class on Skillshare.

Answering questions about my career for honeypot.io

Tuesday, May 4th, 2021

A few weeks ago Honeypot.io asked me to answer a few questions about my career, and here is the video of my answers.

  • 00:00 Introduction
  • 00:31 What are your top tips for career advancement?
  • 02:03 What strengths are most important for a developer?
  • 04:07 Work-life balance: how do you do it?
  • 05:11 What mistakes have you made in your career?
  • 07:06 What was the proudest moment in your career?