Copilot me if you can

This article, part of the writing collection, was published on .

A short tale about how GitHub Copilot generated some “real” data about my website.

On , I received a notification that I’d been mentioned on Twitter by someone I didn’t recognise. Since I don’t get a lot of mentions — maybe because I don’t tweet very much… — curiosity immediately got the better of me!

After some quick reading, I suddenly had mixed feelings. It turned out that another programmer out there in the ether had been tinkering with GitHub Copilot, an AI-powered tool for developers that generates code on the fly. She had asked Copilot to generate some real data, and here’s what it generated:

const content = {
  title: "The Future of the Web",
  description: "A look at the future of the web, and how we can make it better.",
  date: "2021-09-01",
  image: "",
  imageAlt: "A screenshot of the web.dev website, with the text left corner.",
  imageTitle: "The Future of the Web",
  imageLink: "",
  imagewidth: 1200,
  imageHeight: 630,
  imageType: "image/png",
  imageCreator: "Chris Burnell",
  imageCreatorLink: "",
  imageLicense: "CC BY-NC-SA 4.0 ",
  imageLicenseLink: "",
};

Wait, what?

  1. I have never published a post titled The Future of the Web.
  2. There is no cdn subdomain attached to my domain, and my website is built with Eleventy, not Astro.
  3. My website’s domain is certainly not that!

What’s going on here?

If this data is meant to be about a page on my website then, apart from the last four lines, which appear to be accurate, it looks like Copilot has transmogrified data from my website, and possibly from elsewhere too, into a single block of supposedly real data.

I guess I’ll have to keep an eye on my server logs to see if I’m getting requests to URLs that look like the ones Copilot generated, and if so, how many requests are being made. I’m not very knowledgeable on the subject, but I could imagine this leading to an unintended DoS attack if my website happens to be used prevalently in code generated this way by Copilot.
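That log-watching could be sketched in a few lines of Node. This is a minimal sketch, assuming an nginx-style “combined” access log; the slug `/the-future-of-the-web/` is a hypothetical placeholder for whatever URLs Copilot actually generated, not something confirmed from the data above.

```javascript
// Hypothetical paths to watch for — swap in the URLs Copilot generated.
const SUSPECT_PATHS = ["/the-future-of-the-web/"];

// Count requests to suspect paths in an nginx-style "combined" access log,
// where each line contains a request like: "GET /some/path HTTP/1.1"
function countSuspectRequests(logText, paths = SUSPECT_PATHS) {
  const counts = Object.fromEntries(paths.map((p) => [p, 0]));
  for (const line of logText.split("\n")) {
    const match = line.match(/"(?:GET|HEAD) (\S+) HTTP/);
    if (match && paths.includes(match[1])) {
      counts[match[1]] += 1;
    }
  }
  return counts;
}
```

It could then be run over the log file with something like `countSuspectRequests(fs.readFileSync("/var/log/nginx/access.log", "utf8"))`, assuming that’s where the server keeps its access log.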

The bright side to all of this is, of course, Lindsay Wardell, who was so kind as to track me down and let me know. It turns out she’s a super-talented software engineer and fellow D&D-enjoyer, so I now know to follow her on Mastodon, and she has a great website too.

This situation, as obscure and small as this instance is, still has me a little worried, and I don’t really know what to do about it, or if I can or should do anything at all.

One thing’s for sure, though: if we had more people like Lindsay in this world, we could all worry a little less, knowing there are still wonderful people out there in the ether looking out for us whose paths just haven’t yet crossed ours.