Christian Heilmann


Archive for June, 2023

Reading tweets without being logged in

Friday, June 30th, 2023

Update: nitter is broken right now.

OK, Twitter.com no longer allows you to read tweets without being logged in. You can circumvent the need to log in by changing twitter.com to nitter.it, though.

So https://twitter.com/codepo8/status/1674835360516390912 becomes https://nitter.it/codepo8/status/1674835360516390912.
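If you want to automate the rewrite, a tiny snippet (or bookmarklet) can do it. This is just a sketch – nitter.it is only one of several public nitter instances, so the hostname below is an arbitrary choice:

```javascript
// Sketch: swap the twitter.com hostname for a nitter instance.
// nitter.it is just one public instance; any mirror would do.
const toNitter = (tweetUrl) => {
  const url = new URL(tweetUrl);
  url.hostname = 'nitter.it';
  return url.href;
};

console.log(toNitter('https://twitter.com/codepo8/status/1674835360516390912'));
// → https://nitter.it/codepo8/status/1674835360516390912
```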

screencast showing the hack in action

Is it too late to fix the problem of AI clutter on the web?

Wednesday, June 28th, 2023

Cluttered book store with a person trying to find a book

Ethical and compliance issues aside, the biggest problem I see with AI generated content or code is waste. Once again we use a new tool to generate more things quicker, rather than to generate fewer, better things. Even more interesting is that we could use this as an opportunity to analyse our ways and recognise cruft. In other words: if things are that formulaic that a machine can generate them, do we really need them?

A common joke is that people use ChatGPT to turn three bullet points into a well-formed email or cover letter, and the recipient uses it to turn that email back into three bullet points. It’s funny because it’s true. When I applied for jobs, I took the job descriptions and my CV data and asked ChatGPT to generate well-worded pieces for me. Quite a few companies I applied to use AI to screen incoming emails. I am also sure recruiters use AI to distill CVs and cover letters, given the amount of email they have to deal with each day.

Application processes, CVs and cover letters have become formulaic to a degree where there are services to write your CV for you. It feels like hiring a lawyer or tax advisor, as the language necessary to get where you want to go is so far removed from day-to-day communication that it needs an expert. Do people read all that well-worded information though? I doubt it. Personally, I scan more than I read.

AI recommending us content doesn’t mean we have to take it all

Almost every application of AI sold as a way to make us more efficient means creating a lot of content automatically. I started typing this in Visual Studio Code with GitHub Copilot enabled. Copilot tries to be helpful by autocompleting my sentences and offering new paragraphs that complete the thought. Or so it thinks. What it did was annoy me with lots of unnecessary repetition of points made in the first paragraph. So I switched to another editor – Hemingway, which keeps your writing terse and to the point.

I could have let Copilot go nuts and kept all its suggestions. It is tempting, as it feels like you are creating a lot and being a more efficient writer. It is pretty common that people do just that. The amount of generated content is overwhelming the current web. As the Verge put it, AI is killing the old web, and the new web struggles to be born. People generate a lot of articles, and moderators can’t keep up, so they also use AI to automatically detect AI generated content. It is the search engine optimisation arms race all over again, but this time it’s automated and it costs a lot of energy – both human energy and electricity.

This is not only detrimental to the quality of the web as we’re drowning in mediocre, traffic-optimised content. It is also bad for the planet. AI functionality doesn’t come cheap. It is expensive in computation and means a lot of traffic going back and forth.

Trying to be green in an avalanche of generated content

We are currently looking at how software can be greener and use fewer resources, and people like the Green Software Foundation do some amazing work in spreading awareness. And yet, the cost of AI consumption is rarely questioned, as it is the cool thing of the moment.

Sure, with text this isn’t that much of an issue. Generated images, videos and upscaled low-quality media, however, mean a lot of computational power and energy used for, well, what exactly? To prove that we can generate an image from a text prompt saying “a monkey wearing a watermelon as a hat in the style of matisse”? Our few seconds of fame as a funny creator on social media without having to put any craft into it?

Diffusionbee generating an image of a monkey wearing a watermelon as a hat in the style of matisse

It’s pretty likely that this is another fad that will go away in the long run. Much like we stopped doing Simpsons avatars or Elfed ourselves. Younger audiences also consider GIFs “cringe”, which makes me happy, as that was traffic and distraction nobody needed.

If AI generates code it can also optimise it

It is interesting though that the CEO of GitHub announced that soon Copilot and others will generate 80% of the code out there. The optimiser in me immediately saw this as an opportunity to cut the fat from our code bases. If 80% is generated boilerplate code, why should it always be created instead of re-used? Over the course of my career, one thing that annoyed me was that developers have no patience with platforms. Instead of taking part in open source and standardisation to make the platform do what we need, people kept writing their own solutions. Solutions often touted as “stop-gap” measures that, in the end, never go away. JavaScript libraries that made cross-browser development easier are now a dependency that can’t be removed any more. Even worse, they often aren’t even maintained, meaning they not only become unnecessary traffic and code bloat, but also a security and performance issue.

I have to admit that in the last few years I lost some interest in developing code. It’s not that there are no cool challenges or excellent new platform features. It is the way we approach development these days that bores me. We don’t solve problems, we don’t look for native solutions. Instead we include packages and components without even knowing how they work or what they do. We build products from building blocks that other people wrote. It’s not “let’s start a project by looking at the problem to solve”. It is “run this install to get the boilerplate code you might need”. When we release these products we find out they don’t perform. So we hire a performance expert to analyse our products, and they find unused code and cruft. We then write more code to remove this unused code and create optimised code bundles for different use cases.

AI content generation feels the same. We generate a ton of content that isn’t ours and may be bad quality or a terrible idea and throw it out there. And then we use AI to cut it down to something that is understandable again. Seems wasteful, doesn’t it?

If 80% of the code of the future will be generated, this is a great opportunity to optimise that code. We won’t have to argue with engineers who want code to be done in a certain way. We can let machines generate code for machines. And then we have to ask ourselves why this code even exists. AI is great at detecting patterns. So if hundreds of developers keep using the same code to solve an issue, couldn’t that code become part of the platform?

Let’s stop littering the web – it’s already full of rubbish

The web is in danger of being flooded with generated content that nobody needs. Our codebases are likely to get bigger because we get offered lots of code by hitting the tab key instead of writing it by hand. This is the time to be aware of this. Sure, convenience is a lovely thing. But we also need to think about the cost of what we create. For the peace of mind of other people, so we don’t overload them with content they will never read. And also for what it means to the planet and our electricity consumption. AI is here to stay, and it can be used to optimise our workflows and our software products. But it can also help us litter the web even more than it is right now.

Photo by Darwin Vegher on Unsplash

Interview with Goto Unscripted about the present and future of developer tools

Thursday, June 22nd, 2023

Chris Heilmann on stage with his hands in the air

Back in October 2022 I presented my work as a principal product manager of developer tools at GOTO Copenhagen, and we also recorded a half-hour interview. This is part of the GOTO Unscripted series, and Julian Wood talked to me about editors, debugging in context and what AI can do for developers. Here’s how they announced the video:

Are you tired of the traditional coding process of tedious debugging and manual coding? That’s where AI-powered IDEs come in. Christian Heilmann spoke to Julian Wood about why developers must break away from the traditional monolithic IDEs. He highlighted the importance of contextual tooling, where interfaces learn from user behavior and provide automated suggestions and simplified workflows. Hear this new #GOTOunscripted talk and learn to focus on writing rather than navigating complex environments.

You can watch the video on YouTube:

There is also a full transcript available at GOTOPedia.

Turning the screw with AI – ways for developer advocates to debunk the “AI can replace developers” myth

Wednesday, June 21st, 2023

A demonstration in the street with a protester having their hand raised

We are in times of turmoil in the tech world right now. On the one side, we have the AI hype in full swing. On the other, companies are heavily “healthsizing”. I just went through a round of job interviews as a prime candidate and wrote up why I didn’t take some jobs. It made me realise a few things.

Developers are worried

Developers right now worry about whether their jobs are safe or whether they could easily be replaced by AI. They also worry about how to stay relevant as developers and not fall behind in a market that seems to churn out ground-breaking innovations by the hour. Furthermore, they worry about how to adjust their job expectations so as not to be part of the next round of mass layoffs.

Companies turn the screw and make developers feel that they are easily replaceable

All the big companies that developers want to work at right now flex their muscles trying to take away some of the freedoms and perks developers have been enjoying for a long time. To a degree, this is fine, as compared to other jobs we had a ridiculous amount of perks and freedoms. However, there are a few annoying things happening right now.

Companies in general are implementing aggressive “back to office” policies, which smacks of wanting more control over people rather than trusting their employees. Companies also cut costs at all costs. I’ve seen vital employees laid off because they’ve been with the company for a long time and thus cost a lot.

Now, laying someone off with a long track record isn’t that much of an issue in the US. In Europe, however, the severance package you get and how many months of salary you receive are tied to the time you have spent at the company. So laying off an employee of 10+ years is ridiculously expensive.

Generally, there seems to be a movement where companies try to see how far they can cut perks, salaries and benefits and still keep top-notch employees. Often this goes hand in hand with a narrative that “AI is changing everything” and developers are not as needed as they used to be.

A quick reminder about Developer Advocacy

In my developer advocacy handbook, I defined a developer advocate as:

A spokesperson, mediator and translator between a company and its technical staff and outside developers.

So, if you are a developer advocate in a company, it is now the time to ramp up on the first part – ensuring good communication between the company and its developers. And – if need be – be their advocate in these cost-cutting negotiations.

This isn’t about the threat of AI

The argument that developers need to prove their worth to a company because of AI innovation is bogus. This isn’t about disruptive innovation. It is about ad sales plummeting and the stock market taking a punch. Companies are valued on profitability per head. So when profits go down, heads must roll.

Elon Musk showed with Twitter that you can act like a factory owner in the 1920s and people won’t complain – and many others are following that example.

There’s also the problem that we live in a general time of uncertainty and decline. And this is where populism thrives. The technology sector has been called an “elite” by conservative and far-right spokespeople and politicians. A job for only a chosen few, pampered millennials. So anything bad happening to the tech world is considered long overdue – we are finally also vulnerable. When the pandemic hit, the tech world was one of the few sectors that still thrived, as people needed to use the internet to communicate. When AI first came around as a large new movement, many people worried about their jobs. Now it is our turn to feel the same.

Some facts about AI and developer jobs

  • AI will take jobs
  • AI systems are a great way to make people more effective, thus saving the company money
  • Developers should be knowledgeable about AI
  • Most AI systems won’t be applicable to your company
  • AI means a big change to developer careers
  • Open LLMs are bad at teaching coding

Let’s go through them bit by bit and look at what developer advocates – or indeed anyone working in tech companies – should do about them.

AI will take jobs!

There is no doubt that some jobs will fall prey to AI now that it is much more readily available and computing power has caught up with its demands. The false narrative though is that it will replace specialist humans, experts and even highly creative jobs.

Generative AI is great at creating throwaway products. Quick prototypes, proof of concepts, an outline for an article that will need to get fact checked and re-written before publication.

This isn’t new. For quite a while there has been an ongoing demand for predictable, throwaway products that don’t really need to be done by a skilled developer or designer. In the past, these jobs could also be done with a content management system and a template. If we are honest with ourselves, even developers and designers have taken shortcuts for decades, re-using other people’s code, working with design libraries or putting together components we bought and customised.

The cool new thing tech companies try to sell is that we can create a web product by asking an AI the right questions, and it will give us designs, write copy for us and even create the code.

The first two results, writing generic copy and creating a “proven successful design”, are worrying, as they will result in thousands of look-alike products with no personal branding or character. This has been a worry of designers for quite a while, but now we give people the illusion of being a director who can create by asking questions and refining them. The end products will still likely look like the same product with a different colour scheme and a different logo. Bootstrapping a product with a generic design and copy is a great way to get started, but it is not a way to build a brand.

Will this replace designers who know their job? Not really, as there is a lot more to good design and writing than creating one interface. Will this replace writers? Not really, as a good writer works with the brand and the media limitations instead of churning out clever-sounding copy.

The last part – that AI can also create all the code needed to build a web product – is a lie. Sure, you can generate code that creates this design with this copy and maybe add some basic interaction. But a real software product isn’t one design.

It aligns itself with the needs of the user and allows for customisation, letting people change the look and feel to their needs. It also needs to know where the data in the interface comes from and what to do with user interaction regardless of input device. And – even more importantly – it should never make any assumptions about the text content of the product, as that will eventually come from a CMS anyway and have to support multiple languages.

This means though that products that don’t need to adhere to these quality standards or even legal compliance needs can and should be done with a “what you see is what you get” interface. And if people are happy to start with a human question to get there – all power to them.

This kind of AI is currently taking over jobs that have been nothing but frustrating to designers and developers. We all have spent far too many hours building prototypes, slideware and demos for product shows that were eventually for the bin. And we had to deal with dozens of rounds of feedback from people who want “this bigger” or “a nicer font”. If these people can now annoy an AI with these kinds of demands – great.

If anything, this is a good opportunity to see where you apply developer and designer resources for products that will never be released anyway and redirect them to better causes. So here’s what we should do to deal with this threat:

  • Analyse what part of our company’s work could be automated with AI.
  • Provide coaching for teams to use these systems to create materials that can be augmented by the development team if needed.
  • Communicate to upper management that automating these things frees developer resources – and provide new work for these teams rather than replacing them.

Applying AI in context makes people more effective and saves the company money

Where AI systems can help is by making people more effective. This is where the “in context” part comes in. There are three areas where AI can help:

  • Machine-aided code completion
  • Aiding collaboration and learning
  • Design-to-code conversion

Machine-aided code completion comes in the form of “AI code editors” that help you write code. They are not new, but they are getting better. The idea is that you write a few lines of code and the editor suggests the rest. This is not just about auto-completion, but about context recognition and style recognition. The big players are GitHub Copilot and Amazon CodeWhisperer. In addition to auto-completing your code, these tools can also explain code and have granular controls like “make this code more robust” or “generate tests for this function”. GitHub Copilot also offers a chatbot interface that runs in the context of the currently open project, giving your developers the power of a chat client in their code editor. This also means that generated code is automatically validated.

The longer you use these tools, the more relevant the results get. I’ve been using Copilot since its beginning, and whilst it offers me results I didn’t use before, it does mimic the way I write code. This means that it is not just a tool to save time, but also a tool to learn from. It also means that the generated code adheres to a standard I use. This could be incredibly powerful in a team, where you could use an editor like this to enforce a coding standard and have the editor generate code that adheres to it without months of discussion.

Another interesting idea is Copilot for pull requests, an autocompletion for pull request descriptions. You tell it to write a summary and step-by-step instructions, and it takes the code in the pull request to create those. This saves a lot of time and ensures that your pull requests are always up-to-date. Having no readable or actionable pull requests is a big problem in many companies, and this could be a great way to solve it.

There is already evidence that using a tool like this is great for the efficiency of teams. The paper The Effect of AI-Assisted Code Completion on Software Development Productivity describes a study where a team of developers using such a tool was 40% more efficient than a team that didn’t use one.

On the design front, there are some interesting plugins available for Figma and other design suites. Locofy generates React and other framework components from your designs and helps with automated tagging. Deque’s design tools give designers an idea of what a certain component should look like in code and how to make it accessible.

It is up to you to find out how this could boost the efficiency of your teams:

  • Determine what tools your company could implement in-context and coach the teams how to use them.
  • Test out and validate systems out in the market.

Developers should know about AI

There is no doubt that developers should know about AI and get an in-depth look at how these “magical” systems that can seemingly replace them work. This means that companies should now ramp up on getting their employees to understand the workings and limitations of “AI”. Five years ago I wrote a course for Skillshare called Demystifying Artificial Intelligence: Understanding Machine Learning, and whilst it is of course not up-to-date with current changes, it still does a good job of making people understand what “thinking machines” can do and where the limits, ethical issues and technical problems are.

The big players also understand that there is a need for engineers to know the inner workings of the big AI systems. DeepLearning.ai, for example, has a ChatGPT Prompt Engineering for Developers course that explains how to ask the right questions, and Google has a full in-depth Generative AI learning path. Microsoft has a Quickstart Guide on how to build a chatbot for your own data.

The things we should be doing now are:

  • Collect up-to-date materials and courses and distribute them in the team
  • Negotiate with management to get a budget for official training on AI matters

Not all AI systems will work for your company

Whilst it is great to see all the things people do with AI, it is important to understand that not all of these systems will work for your company. Interestingly enough, even the big players like Google and Amazon warn internally about using chatbots, as they are worried about data leaks and the systems being used for malicious purposes. So it seems there is a huge “do as I say, not as I do” going on here.

Companies need to ask themselves whether they can allow AI systems to index their codebase, whether the internal setup and dependencies can even be accessed, and whether they are happy to have their codebase used to train an AI system. Other questions are about usage and compliance. Can you use code without knowing its license or where it came from? How does AI generated code fit into your review and compliance processes? How can you ensure code quality and security?

Steps we need to take now are:

  • Lead team members to review different tools and present them to the company, followed by a discussion of whether the tools are applicable.
  • Assess and review different AI offerings and work with leads and upper management to see what can be applied.

Our careers as developers change

The biggest change we have to deal with right now is that if we get into a world where content generation via AI is the first step, this affects our careers. We need to be aware that we are no longer the ones writing the code, but the ones reviewing it. This traditionally is the job of a senior developer, but it is now the job of all developers. We need to change our career goals and expectations. The role of a junior developer becomes a lot smaller, and people need to ramp up faster when it comes to their assessing and reviewing skills. And senior developers need to do more coaching and training of juniors to spot issues in generated code rather than writing code or reviewing hand-written code. In essence, we are merging two career levels into one. This could be a good thing, and it could save the company money in the long run.

To make this happen we need to:

  • Work with management and lead engineers to define a company strategy to make reviewing code a core skill.
  • Re-assess hierarchy and career goals and expectations accordingly.

Open LLMs are bad at teaching code

There are hundreds of demos and videos where people use ChatGPT to write code, build a web site or write a game. These are a great inspiration, and I am all for people learning about code in a playful and engaging manner. The problem is that this doesn’t teach us how to program. It shows us how to use a tool to generate code. This is a huge difference. Granted, some tools also come with code explanations, but none of them tell you about the shortcomings of the code. None of them tell you about security issues or how to make the code more performant. None of them tell you about the license of the code or whether it is even legal to use it.

This is nothing new. Developers have been copying and pasting code they don’t understand from forums and tutorials for ages, changing some numbers around and, when nothing blew up, submitting it to the project. This, however, was also an incredibly dangerous thing to do.

LLMs are great at making things sound easy to do. They are good at making people feel like great developers. But great developers know why the code they write works; they don’t just write code that happens to work. Even worse, bad actors already use this as an opportunity to make developers use malicious and insecure code. There have been instances where ChatGPT-generated code proposed packages that were misspelled or didn’t exist, and when installed, they installed malware.

The way around that is to either set up your own ChatGPT-like product or limit the sources your developers can use and immediately validate the generated code. GitHub Copilot for docs is an interesting idea that gives you a ChatGPT-like interface but limits it to the official documentation of certain projects.

One of my favourite things about it is that it allows you to customise the level of information you want the system to display. Thus it can be used to teach a junior developer the basics, as well as remind senior developers about syntax they may have forgotten.

Filtering options for GitHub Copilot for docs

If used inside your editing environment, this also means that code generated with these systems will automatically be checked for syntax errors and issues.

In order to use LLMs to teach internal developers, you need to:

  • Define and agree on a “code hygiene” plan with the team for generated code.
  • Limit results of LLMs to validated sources and internal code repositories.
  • Work with management to use saved time to implement best practice trainings to learn how to validate generated code.

This is the time to fight for software development as a craft

These are some ways to counteract the “we don’t need developers, we have AI” argument whilst embracing these new ways of working in tech. I am sure there are many more, and I’d love to hear your thoughts on this. Generally, I think we now have the big task ahead of us of defending software development as a craft rather than a commodity. Sure, we can create a lot of stuff without any designer, writer or developer involved. But much like you can survive on microwave dishes, it is not the same as a home-cooked meal.

New array methods in JavaScript bring immutability

Tuesday, June 6th, 2023

Scarlet Witch saying no more mutants

JavaScript now has a way to change elements, sort, reverse and splice arrays without changing the original, giving it immutability. Four new methods allow you to change arrays without having to create a copy first: `with()`, `toSorted()`, `toReversed()` and `toSpliced()`. No need to create a copy with `[...arr]` first. The only browser currently missing support is Firefox.

const arr = ['f','c','k','a','f','d'];
const newArr = arr.with(2,'m');
// newArr -> ['f', 'c', 'm', 'a', 'f', 'd']
const sortArr = arr.toSorted();
// sortArr -> ['a', 'c', 'd', 'f', 'f', 'k']
const reverseArr = arr.toReversed();
// reverseArr -> ['d', 'f', 'a', 'k', 'c', 'f']
const splicedArr = arr.toSpliced(3, 3, 'it');
// splicedArr -> ['f', 'c', 'k', 'it']