

Author: Adam Salvesen

Let’s start simple. Is AI going to take my job? No. Probably not. At least I hope not. Is AI going to change my job? Almost certainly. And honestly, it already has. 

Do we use AI writing tools at Invanity? Yes, though only at low levels. We’ve experimented with it across the business, but currently only use it for reporting automation and simple, repetitive tasks.

Anyone who’s worked in a creative job over the last two to three years is probably sick of people asking if you’re worried about Artificial Intelligence and what it might mean for you. While it’s nice that people are worried about us, it’s probably unnecessary.

Throughout human history, technological advances have transformed every job under the sun. Well, maybe not flint-knapping, but if you’re still working in that field, I’m afraid I have some bad news for you… It’s not really a growth industry. I don’t care if it’s a family business, you’re really better off doing something else.

AI Image Generation (asked our designer to photoshop this … got told to do one) – thanks Midjourney!

Palaeolithic crafts aside, it’s hard to think of a job that hasn’t changed dramatically over the last two decades, since the widespread adoption of the internet. Even in fields that don’t use the internet directly, like building, farming and mining, there will have been changes to things like scheduling and project management. Plus, as technology has become cheaper, items like laser levels, GPS for tractors and sophisticated sonar have become more widely available.

But what does this have to do with AI? Well, like AI, all of these technologies are tools. They help us complete tasks more efficiently and effectively. AI tools like Google Gemini (formerly Bard) and ChatGPT are no different.

A Historical Example

Imagine this. It’s 1440(ish) and you’re a writer in Europe. First of all, you’re clearly very well-educated, so well done, you. Secondly, you’re probably a monk, so swings and roundabouts, I guess. 

Your job is creating manuscripts – handwritten and hand-copied documents, usually on parchment or vellum. These are mostly copies of the Bible, classical works, or other religious texts. You get up in the morning before dawn and work in a silent room with a bunch of other scribes until the evening bells, around 6pm. Your only day off is Sunday. It will take you around 15 months to copy a Bible, and it’s expected that you’ll make at least one mistake per page.

It’s painstaking and tiresome work. Nothing has really changed in the world of writing for over a millennium, since the general switchover from scrolls to codices (singular “codex”, an early type of book). The process takes so long that a small monastery, such as yours, may only have around a dozen books.

There’s very little new work being created and disseminated, simply because it takes so much effort. You’re certainly not going to be writing anything of your own, either – you’ve got pages to copy. Not that it matters; even if you did write something, hardly anyone would be able to read it.

Then, in 1440…ish (we don’t really know the exact date), a goldsmith in Strasbourg (Straßburg, if you’re German, or want an excuse to use an Eszett) unveils his printing press.

It uses moveable metal type – blocks engraved with letters that can be set in place and inked – allowing you to print exactly the same page again and again. Over a single workday, a printing press of this era could produce over 3,600 pages, while a scribe could manage just three or four.

Now you might think, “Wait, Adam, I thought you weren’t worried about AI taking your job? I don’t see any scribes still plying their trade in 2024. The only badly-lit rooms filled with men hunched over desks all day are tabletop gaming conventions. If AI is writing copy, what’s the point of copywriters?”

Well, firstly, unlike a scribe, my job isn’t copying someone else’s work all day. I leave that to my capable colleagues, Cmd+C and Cmd+V. Though heavily bounded by what our clients want and what won’t give Jack an aneurysm, my work is “original”.

Secondly, AI just isn’t that good… yet.

A Flemish printer’s shop … Not a phone in sight, just people living in the moment


Don’t get me wrong – ChatGPT and Gemini are incredibly impressive, turning out massive quantities of work in a matter of seconds. But the quality of that quantity is… questionable. 

Is it grammatically correct? Yes.

Is it typically in the ballpark of what I was after? …maybe?

Is it good enough to copy and paste straight into whatever I’m doing with no alterations? Hell no.

At the moment, AI chatbots are a bit like the power loom – another invention that revolutionised its industry. They can produce massive amounts of material very quickly, but you’re going to look pretty stupid trying to use that material as a finished product.

There’s still a great deal of cutting, stitching, hemming and adjusting to be done, and that’s where I come in. When I use AI chatbots while creating content, I effectively become the tailor. It’s my job to make sure this mass-produced material is ready for the catwalk.

The news gave us a fantastic example of how bad unsupervised AI can be back in February 2024, with the now-infamous Willy’s Chocolate Experience™! in Glasgow.

A World of Pure… “Imagnation”?

Long story short, a man with a history of… questionable business practices created a website advertising a whimsical experience for children. The landing page was filled with spectacular visuals that portrayed the event much like Willy Wonka’s Chocolate Factory, except definitely not that, and clearly distinct, because even charlatans are afraid of Warner Bros’ legal team. 

All of these visuals were created using AI – likely DALL-E 3 – and were packed with spelling mistakes:

The less said about “catgacating”, the better

What hath British Leyland wrought?

Despite all this – and the website’s copy, which somehow reads like AI, whilst also being full of grammar and punctuation errors – people still bought tickets! 

Upon arrival, these poor saps were greeted by a half-empty warehouse, sparsely decorated with props and the occasional actor working off a 15-page AI-written script. Needless to say, this went over badly, and by lunchtime on the first day the police were called as, “the threat of violence had become quite high”, according to the poor actor responsible for playing “Willy McDuff” – the event’s legally distinct Wonka.

All this is to say that, no matter what incredible material AI churns out, there needs to be a responsible adult present to ensure it has some sort of grounding in reality.

Using unsupervised AI as a substitute for an actual content creator, because it looks good and costs less, is like buying a peacock instead of a racing pigeon. It’s a nicer-looking bird, and it can fly (just), but try winning a race with it.

Walter White’s Chocolate Factory


Just as much as it needs an editor, AI also needs someone to tell it what to do. Prompts are the instructions that direct generative large language models to perform a task, and they need a writer. Depending on the software, prompt crafting can be something of an art form in its own right.

A well-crafted prompt balances specificity with openness to creativity: you need to articulate the task clearly, set the intended purpose and tone, and still leave the model room to generate. Doing that consistently takes some finesse, and a working understanding of the technology’s capabilities and limitations.

But even if you sculpt the perfect prompt, and somehow manage to produce a piece of copy that suits your purposes with minimal revisions, there’s still one gaping pitfall waiting for you: Hallucinations.

Prompt Tip

Tip: Write your prompt like you’re talking with a person
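By way of illustration only – this little helper and its field names are my own invention, not any real tool’s API – here’s roughly what “talking like a person” can look like if you assemble prompts in code rather than typing them out each time:

```python
def build_prompt(task, tone, audience, examples=None):
    """Assemble a conversational prompt for a chatbot.

    There's no magic format here; the point is simply to state the
    task, tone and audience plainly, as you would to a colleague.
    """
    lines = [
        f"I'd like you to {task}.",
        f"Please write in a {tone} tone, for {audience}.",
    ]
    if examples:
        lines.append("Here are examples of the style I'm after:")
        lines.extend(f"- {example}" for example in examples)
    # Inviting questions mimics a real briefing conversation
    lines.append("Ask me if anything is unclear before you start.")
    return "\n".join(lines)


prompt = build_prompt(
    task="draft a 50-word meta description for a blog post about AI and copywriting",
    tone="light, conversational",
    audience="small-business owners",
    examples=["Is AI coming for your job? Probably not - but it is changing it."],
)
print(prompt)
```

The specifics are invented, but the habit is the useful part: a clear task, a named audience, a tone, and an example or two of the style you want.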


Much like humans identifying a face in the moon, or recalling events that didn’t happen, hallucinations (also known as “confabulations”) are when an AI “sees” or “remembers” information that doesn’t exist. 

This happens for a similar reason to human hallucinations – pattern recognition. Both our brains and generative AI work by putting pieces of information together and drawing conclusions from them.

For instance, if I show you this graph that has two lines changing in parallel with each other, you’d probably assume that the two things being measured are linked, because as one falls, so does the other.

[Graph 1: two unlabelled lines changing in parallel]

However, if I revealed that one line is UK divorce rates, and the other is the number of Disney movies released per year, you wouldn’t blame the House of Mouse’s excessive output for your parents’ marriage failing. You’d recognise the idea of coincidence. 

[Graph 2: the same lines, revealed as UK divorce rates and Disney movie releases per year]

The same goes for this graph:

[Graph 3: another pair of coincidentally matching lines]

Ncuti Gatwa is not responsible for US agriculture. 

AI doesn’t always get that, and this can lead to big problems. If an AI powered by natural language processing doesn’t have the information to answer a request, it may fabricate facts that sound plausible, based on recognised patterns.

This has already had real-world consequences – in June 2023, a lawyer was fined for citing six fake case precedents generated by ChatGPT in a personal injury suit. Scientists looking for information within their specialist fields have also found AI fabricating papers that never existed, leading them to contact the supposed authors for copies – and to a lot of resulting confusion.

Using AI for important marketing copy is like having a snake oil salesman write your wedding vows. It will sound nothing like you, promise the world and be really embarrassing to read in public.


There’s also the matter of tone of voice to consider. Sometimes, when my creative muscles are burnt out at the end of a long day, I’ll pop my nearly-completed blog post into Gemini or ChatGPT and ask it to create an introduction, conclusion or meta description, which I’ll then pare down and rework. 

Every time I do this, I specify that the new writing needs to match the style of the rest of the article. But, somehow, every time it comes back to me as either a direct response copywriter’s fever dream, or the driest, most lifeless text you’ve ever read. Like if Jacob’s Cream Crackers wrote Terms & Conditions.

I’ve even given multiple examples of the exact style I want it to imitate, right down to the sections I want it to create and the words it should use, only for it to doggedly ignore my instructions and write in whatever style it had used the previous few times. I almost respected it for that. I wish I had that level of self-assuredness. “What’s that? You don’t like my work? Well, I’m sorry to hear that. That must be very hard for you.”

And yet, despite all these flaws, I don’t think I can 100% dismiss the idea that AI will take my job. The reason for this concern can be summed up in a single word: “Profit”.


Copywriters need more than just caffeine (but I will take a coffee if you’re doing one)


The fact is, copywriters cost money. We’re human beings, and we need more than caffeine and an innate sense of superiority to live, despite what you may have heard. 

AI – for the most part – doesn’t.

Sure, it might cost £40,000 a year for a pre-built chatbot that’s programmed to write essentially unlimited words in the perfect brand voice for your target audience, and that copy may still have flaws. But the volume of content it can produce is far greater than even the fastest copywriter could manage. And therein lies the problem.

For all I’ve said about the weaknesses of AI – its hallucinations, its un-nuanced view of tone and the fact it needs a supervisor to tell it what to do and ensure its product is usable – it is cheaper than employing multiple copywriters. And when it comes to investors and c-suite executives looking to preserve their jobs, that’s often all that needs to be said.

At that level of business, whether the job that’s done is good or not ceases to matter, and the question becomes, “is it good enough, and will it increase our productivity/profits?” When the priority is generating profit for shareholders, the quality of work is irrelevant, as is its impact on staff workloads, so long as it benefits the P&L sheet.


AI might be faster than us, but it certainly isn’t better (…yet?). It can massively speed up some of your more mindless tasks, like endless product descriptions, and help you avoid common mistakes, but it’s still not perfect, and sometimes the writing it produces can stick out like a sore thumb.

But if you’re worried about losing your job, I’d suggest you look at who stands to benefit, not the tool they’re using.
