Email marketing isn’t a matter of simply hitting ‘send’ and hoping for the best. The best email campaigns develop from multiple rounds of A/B testing. Not news to you, right?
While this is basic (and easy) stuff for some digital marketers, many institutions are still struggling to actually put it into practice. Staffing challenges, anyone?
Come with Questions. Leave with a Plan.
How confident are you with your selection of international student recruitment markets right now?
The Intead/San Diego State University One-Day Workshop on December 13th will be a hands-on opportunity to learn from an awe-inspiring international student recruitment faculty.
- A full day of international student recruitment strategy and execution discussion
- Two luminary keynotes
- Luncheon on Social Justice with Dr. Jewell Winn and Dr. Adrienne Fusek
- Dinner on Chinese Student Influencers with Dr. Yingyi Ma and Brad Farnsworth
- At $200 for the day (inclusive of all meals), this learning opportunity is a steal. (Pricing goes up to $350 on October 24, 2022).
And for those of you going to NAFSA Region XI, be in touch so we can chat. Three super Intead presentations are coming your way during that event.
Getting back to the discussion of email marketing, the fact is, most enrollment marketing teams have limited time to review their data and modify their approach and content based on what they see. Those taking these steps are ahead of the curve.
The creative art of email marketing has everything to do with knowing your audience and tapping into your recipient’s curiosity. Your recipient has to think there is something of value to them because of the sender or the content.
This week, we take a closer look at email A/B testing, also known as split testing: the process of sending two slightly varied versions (version A and version B) of an email to two different sample groups of your email list. The email version that receives the most opens and, importantly, the most valuable clicks (conversions) is deemed the "winner" and gets sent to the remainder of your list.
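For the technically inclined on your team, the winner-selection step described above can be sketched in a few lines of Python. This is a minimal illustration, not tied to any particular email platform; the function name and the sample numbers are hypothetical.

```python
def pick_winner(results_a, results_b):
    """Pick the version with the higher click (conversion) rate.

    Each result is a dict like {"sent": 500, "opens": 120, "clicks": 35}.
    We compare click rate rather than raw clicks so unequal sample
    sizes don't skew the comparison.
    """
    def click_rate(r):
        return r["clicks"] / r["sent"]

    return "A" if click_rate(results_a) >= click_rate(results_b) else "B"


# Illustrative test results for two versions of the same email:
a = {"sent": 500, "opens": 130, "clicks": 25}   # 5% click rate
b = {"sent": 500, "opens": 145, "clicks": 40}   # 8% click rate
print(pick_winner(a, b))  # prints "B"
```

In practice your email platform computes these rates for you; the point is simply that the winner is decided by the metric you chose to optimize, not by gut feel.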
This approach is the best (and simplest) way to optimize your email marketing campaigns and quickly pinpoint what's working and what's not. Often it is the subject line that has the most power. But you can also test whether your prospective students click more on buttons or on link text. What color button works best? Do parents respond to subject lines with emojis, or should you leave that to your student segments?
And given that you have so many important student segments to consider (domestic regional, domestic distant, international by country, non-traditional, first-year, undergraduate, graduate, transfer, program of interest, financial capacity, ethnicity), the testing can become a bit complicated. So, it's important to approach it with a bit of rigor and careful tracking.
Yes, of course, we are here to help when you are ready to truly adjust your content to maximize engagement and conversion. The results will justify the effort.
Read on for some of our best practice tips just to validate that your team is on the right track. And, be sure to share this post with the copywriters on your staff. It's a great primer for the newer marketers on your team.
While full-on email A/B testing can seem intimidating when you consider all of your student segments, we promise that it’s not that complicated. In fact, most email marketing software offers a built-in A/B testing feature. You can start with one email, then create the “B” version by tweaking one small aspect. Just follow these 3 easy steps.
- Define what you want to optimize
Perhaps your last open house email campaign had fewer registration clicks than anticipated. Or maybe your newsletter open rates are lagging in your international student list. Decide which metrics you want to optimize, such as open rate and clicks, so you can decide what to measure during your test.
Then, consider whether you want to perform the test on a specific email segment (e.g. students within 25 miles of campus) or test across your lists. We recommend performing segment-specific A/B tests because you may see varying results when working across lists. For instance, your prospective students may respond better to urgency-based subject lines than your current students.
- Develop a hypothesis
Having a clear hypothesis before starting your email experiment can help you stay focused on what you’re testing. Some examples:
- We believe that a hero image at the top of the email will result in higher click rates than an email that starts with text.
- We believe that a subject line personalized with the student’s first name will have a higher open rate.
At the conclusion of your A/B test, your hypothesis may be proven, disproved, or inconclusive. Important: test only one variable at a time. That way, you can attribute any difference in metrics to that one factor.
- Choose your sample size
How many subscribers should you send your test emails to? We recommend using the 80/20 rule: Send email A to 10% of your list, email B to 10%, and then the winning version goes to the remaining 80% of your list.
It’s important to make sure your samples are split up randomly. That way, you can get an unbiased result.
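The steps above, a random 10/10/80 split, can be sketched in a few lines of Python for the team members who handle your list exports. The list size and email addresses here are illustrative, and most email platforms will do this split for you automatically.

```python
import random


def split_for_ab_test(subscribers, sample_frac=0.10, seed=42):
    """Randomly split a subscriber list into sample A (10%),
    sample B (10%), and the remaining 80% that will receive
    the winning version.

    Shuffling first gives a random, unbiased assignment; a fixed
    seed just makes this example reproducible.
    """
    pool = subscribers[:]               # copy so the original list is untouched
    random.Random(seed).shuffle(pool)   # random assignment avoids bias
    n = int(len(pool) * sample_frac)
    return pool[:n], pool[n:2 * n], pool[2 * n:]


# A hypothetical list of 1,000 subscribers:
emails = [f"student{i}@example.edu" for i in range(1000)]
group_a, group_b, remainder = split_for_ab_test(emails)
print(len(group_a), len(group_b), len(remainder))  # prints 100 100 800
```

The key design point matches the advice above: the samples are drawn at random from the same pool, so any difference you measure between A and B reflects the email variant, not who happened to be in each group.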
Now, what to test…
It's a key question. Next week we will take a closer look at what to test and why you might choose one aspect over another. Stay tuned to our blog and be sure to share it with the members of your team responsible for the nuts and bolts of your email campaigns.
In the meantime, if you want to discuss specific ways you can increase email engagement, be in touch. We’re here for you.