AI Copywriting - A Review

When creating an actionable A/B test, I set out to answer the following questions:

What do I want to test?

Why do I want to test it?

How much time is it going to take me to run this test?

What am I going to do with the results? (The most important question, in my opinion.)

With the above answered, I can move forward with a solid testing strategy, confident that I'm not wasting time.

Introducing my review of my attempt to run an A/B test between AI- and human-written copy.

Lately, Twitter has felt filled with AI chatter: claims that AI copywriting can save time, boost conversions, and is the next addition marketers should be making to their tech stack.

I had been thinking about taking an AI copywriting tool for a spin for quite some time, so last week at Parcel I created a test to measure the effectiveness of human- vs. AI-written copy. The test runs against our existing onboarding emails.


I’m eager to see whether there are any statistically significant differences between emails written by a human (me) and emails generated by AI (copy.ai).
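For readers curious what "statistically significant" means in practice for an email A/B test, a common approach is a two-proportion z-test on the conversion (or click) rates of the two variants. A minimal sketch, using only Python's standard library; the sample numbers below are purely hypothetical, not results from my test:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a / n_a: conversions and sends for variant A
    conv_b / n_b: conversions and sends for variant B
    Returns the z statistic and the two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical numbers: 120/1000 clicks for human copy vs. 95/1000 for AI copy
z, p = two_proportion_z_test(120, 1000, 95, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below 0.05 is the conventional (if blunt) threshold for calling the difference significant; with small onboarding-email volumes, reaching that threshold can take a while.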

The process 

Signing up for copy.ai was fairly seamless; they offered a free trial and sent me on my way. 

My first impression was that copy.ai may not be meant for email writing. The platform had roughly six out-of-the-box email copy templates. Since I was interested in an initial welcome email for onboarding and a feature-specific follow-up, those are the two templates I focused on. Beyond those, copy.ai was full of social media templates, blog templates, and other options I wouldn’t necessarily need.

To create a welcome email, you enter the product name (in my case, Parcel) and a product description; you can then generate copy and paste any of it into an editor.

The generator spits out a handful of options, which you can save, discard, or use to generate more similar examples.

I ended up splicing a few emails together into one and changed the wording of some of the final sentences to be more action-focused.

My findings

  1. Time-consuming. Overall, as someone who prefers to sit down and write copy and has always done so as part of my job, I found generating and sifting through copy options fairly time-consuming. It took me a total of about two hours to create an AI-generated email, whereas an initial human-written draft would take me 30 minutes. I think it took so long because no single AI-generated email was usable in its entirety; I wanted to take bits and pieces of each to create an email that felt well-suited to my audience.

  2. Lacked action. If you scroll through your inbox, you’ll note that most emails ask you to do something, i.e., they have a call to action. The AI-generated emails lacked one.

  3. Parcel, not package. I often found that the tool interpreted the product name quite literally, which I could see being problematic for other companies with names like mine.

My verdict

I think a tool like copy.ai is fantastic for inspiration, and I’ll definitely use it occasionally when I’m feeling a solid bout of writer’s block. Still, I can’t see myself using an AI tool to fully replace my existing process. Copy.ai spat out multiple great one-liners and a handful of honorable mentions, but I spent more time cackling at potential options than I was wowed by the perfect one.

My A/B test has just begun, so I can’t say with certainty that the extra time I spent crafting AI copy paid off. 

Previous

Gmail Annotations - A breakdown

Next

A/B Testing - Customer feedback strategy