Sometimes less is more – a lot more. Case in point: this case study from 2020, where removing some copy generated a 918% lift in revenue-per-thousand-emails-sent (RPME).
You can see the control on the left below, the test version on the right, and the color key to the content as well.
When you’re thinking about what you might test, you always want to brainstorm and develop some hypotheses.
When I looked at the control, which the company had been using to promote this line of products, I noticed that it was different from the other emails the company was sending.
In most of their emails, they had product images, product names, and a link to learn more or buy.
But in this email, they included bullet-pointed descriptions of the top 3 products. They believed that including these descriptions would motivate more prospects to buy, but they had never tested this belief.
Now was our chance.
We kept the control as-is. To create the test version, we removed the bullet-pointed descriptive copy.
To get rid of the white space this created, we shifted the wireframe to a two-by-three product grid (which was more in line with the wireframes in their other email marketing efforts).
The sample sizes were the same for the control and test versions. The results appear below.
If you’ve been reading my work, you know that we always make a business metric our key performance indicator (KPI). In this case, our KPI is revenue-per-thousand-emails-sent (RPME).
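For anyone new to the metric, RPME is just total revenue divided by emails sent, scaled to a thousand sends. A minimal sketch (the revenue and send figures here are made up purely for illustration, not from this test):

```python
def rpme(total_revenue: float, emails_sent: int) -> float:
    """Revenue per thousand emails sent (RPME)."""
    return total_revenue / emails_sent * 1000

# Hypothetical example: $5,000 in revenue from 100,000 sends
print(rpme(5000, 100_000))  # 50.0, i.e., $50 per thousand emails
```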
The test version bested the control in RPME – it drove more than ten times the revenue per email, a lift of 918% — thanks to removing those descriptive bullet points.
We can look at the sale-related diagnostic metrics to understand why the test won. There were two factors that contributed to the variance:
The test generated a higher conversion-rate-from-sent (CR from sent) than the control, 0.012% vs. 0.008%, a lift of 46%
The test generated a higher average order value (AOV) than the control, $1,019 compared to $146, a lift of 598%
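Because RPME is the product of conversion rate from sent and AOV (times 1,000 sends), those two lifts compound multiplicatively. A quick sanity check using the rounded figures reported above (note: since the published rates are rounded, the computed lift lands near, but not exactly on, the reported 918%):

```python
def rpme(cr_from_sent: float, aov: float) -> float:
    """RPME = conversion rate from sent x average order value x 1,000 sends."""
    return cr_from_sent * aov * 1000

# Rounded figures from the case study:
control = rpme(0.00008, 146)    # 0.008% CR from sent, $146 AOV
test = rpme(0.00012, 1019)      # 0.012% CR from sent, $1,019 AOV

ratio = test / control
print(f"Test drove {ratio:.1f}x the control's RPME")  # roughly a 10x ratio
```

The takeaway is that neither diagnostic lift alone explains the result; it's their product that produces the order-of-magnitude gap in RPME.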
We can also look at the email diagnostic metrics for some insight.
The emails had virtually identical click-through-rates (CTR), at 0.5%; the test had a slightly larger CTR (lift of 6%), but this was within the margin of error.
Interestingly, the open rate on the control was actually higher than the open rate on the test – 17.5% compared to 11.4%, where the test showed a loss of 35% compared to the control. What caused this? I don’t know. Both email messages have the same from line, subject line, and preview text. It’s an anomaly, something that we can’t explain.
Moral of the story: If you have an email that’s copy-heavy, try testing a pared-down version against the control and let me know how it goes. You may find that less is more in your email marketing program, too.
Be safe, stay well, peace,