Tuesday, October 21, 2008

Surprising test results from an email marketing campaign

I don't have a lot of time to get into details on this, but I thought it was worth sharing. I've now been working in the email marketing space for over a decade, and I continue to be surprised by new test variables that turn out to have interesting results, interesting enough that the small amount of time testing takes is well worth it.

In the past month, I've worked with two clients who were doing limited launches. Both were for products outside the Internet Marketing circles, in niche areas. Both followed a string of successive emails building up to launch day. Both had a limited number of products available for sale, and the price points were almost the same. No affiliates or JV partners were used; these were launches done to in-house lists.

Client A followed the normal strategy almost to a T, in that everyone received the same emails. Three hours before launch, the 28,000-subscriber list was split into thirds, and each third received one of the following subject lines:

A. <> opening in 3 hours
B. <> opening at 1 pm
C. <> opening at 1 pm Pacific

A minor difference to most people, but how it translated to sales was significant. Who received which of these three emails was entirely randomized, and the content of the emails, as well as the email that went out at the actual launch, was exactly the same for everyone. The shocker here is that not only did version C recipients open the email more than A and B recipients combined, but over 75% of the orders in the first 30 minutes came from version C recipients. In the end, version C recipients ordered only slightly more units than version A or B recipients; however, it did take almost 6 hours to sell out entirely.
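
For what it's worth, the random split itself is trivial to script. Here's a minimal sketch in Python; the function and variable names are my own illustration, not the client's actual system, and shuffle-and-slice is just one reasonable way to randomize:

```python
import random

def assign_variants(subscribers, seed=42):
    """Shuffle the list and slice it into three near-equal groups,
    one per subject-line variant. Shuffling first keeps the assignment
    random while guaranteeing an even split."""
    rng = random.Random(seed)
    shuffled = list(subscribers)
    rng.shuffle(shuffled)
    third = len(shuffled) // 3
    return {
        "A": shuffled[:third],
        "B": shuffled[third:2 * third],
        "C": shuffled[2 * third:],
    }

# Example: split a 28,000-address list into the three test groups.
groups = assign_variants(f"user{i}@example.com" for i in range(28000))
print({variant: len(members) for variant, members in groups.items()})
```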

Lesson learned: specificity on the timing side is important if you are shooting for a fast sell-out. If you are promoting as a JV partner or as an affiliate, it could help you pull in a few extra sales.

Client B's launch was the following week, and I asked if we could take this testing idea one step further. His database contained state/zip/country for more than half of his subscribers; for the rest, sign-up IP addresses were saved. We broke the database down into 11 segments based on time zone, plus a 12th for subscribers whose time zone we were unable to determine. Beginning the day before launch, we customized the emails for each segment: one email went out the day before launch, one three hours before launch, one at launch, and a final one at sell-out. Here's where things get really interesting: in all 11 time-based segments, between 60% and 75% of orders came in during the first 30 minutes. If we treat the 12th group as a control group (something to compare results against), its overall conversion rate was about the same, and the launch still took 4 hours to sell out, but only 18% of that group's orders came in during the first 30 minutes.
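
Here's a rough sketch of how that segmentation and per-zone wording can work, in Python. The state table is deliberately abbreviated, the field and function names are my own, and the whole thing is an illustration of the idea rather than the client's actual system (zoneinfo needs Python 3.9+):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Abbreviated stand-in; a real table covers every state and country.
STATE_TO_TZ = {
    "NY": "America/New_York",
    "IL": "America/Chicago",
    "CO": "America/Denver",
    "CA": "America/Los_Angeles",
}

def segment(subscriber, ip_to_tz=lambda ip: None):
    """Pick a time-zone segment from state data first, then fall back
    to an IP lookup, then to the 12th, untargetable segment."""
    tz = STATE_TO_TZ.get(subscriber.get("state"))
    if tz is None and subscriber.get("ip"):
        tz = ip_to_tz(subscriber["ip"])
    return tz or "unknown"

def local_launch_line(tz_name, launch_utc):
    """Phrase the launch time in the subscriber's own time zone."""
    local = launch_utc.astimezone(ZoneInfo(tz_name))
    return local.strftime("opening at %I:%M %p %Z")

launch = datetime(2008, 10, 28, 20, 0, tzinfo=ZoneInfo("UTC"))
sub = {"state": "CA", "ip": None}
print(local_launch_line(segment(sub), launch))  # opening at 01:00 PM PDT
```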

Lesson learned: specificity is even more important than I would have thought. Making it a no-brainer for people to know exactly when they need to be ready to order can help boost immediate sales.

A few caveats:
  1. Just because the numbers worked out this way in these tests does not mean they would necessarily work the same way for you. This is why testing is important. (If you do run a test of this kind and would like to share your own results, I'd love to hear about them.)
  2. For time-based launches, it's really important to use a vendor that can actually get your emails out on time. I'm continually baffled by the number of big-time Internet Marketers who use third-rate mailing systems where their emails go out HOURS after a launch has actually sold out, and/or many of their emails end up filtered into spam folders because they are using the same content as other marketers. Use a reliable email marketing service. (Yes, it's my company.)
  3. What if you don't have the data to do geo-targeting? If you are using double opt-in, you likely have the IP address associated with each sign-up stored "somewhere". Matching those IPs back to locations is a laborious process by hand, but you can go to Rentacoder or Odesk and get the work done for about $1-2 per 500 addresses, or script it yourself along the lines of the sketch below.
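
If you do script it, a few lines against an IP-geolocation database are enough. This sketch assumes you have MaxMind's free GeoLite2 City database file and the geoip2 Python library on hand (my choice for illustration; any IP-to-location database works the same way):

```python
import geoip2.database
import geoip2.errors

def timezones_for_ips(ips, db_path="GeoLite2-City.mmdb"):
    """Resolve each sign-up IP to an IANA time zone, or None when the
    address isn't in the database (those stay in the 'unknown' bucket)."""
    results = {}
    with geoip2.database.Reader(db_path) as reader:
        for ip in ips:
            try:
                results[ip] = reader.city(ip).location.time_zone
            except geoip2.errors.AddressNotFoundError:
                results[ip] = None
    return results

print(timezones_for_ips(["128.101.101.101"]))
```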