We’ve all been there. Our inboxes are overflowing, and we can’t keep up. Sometimes we delete the emails that don’t interest us, sometimes we simply ignore them, and sometimes we unsubscribe altogether.
In its report “The Social Break-Up,” ExactTarget found that 67% of respondents actually take the time to unsubscribe when they’re no longer interested in a company’s content. The top three reasons for unsubscribing are email frequency, waning interest, and overall email volume.
An email preference center that lets people subscribe based on their specific interests — as well as determine the frequency of communication (think LinkedIn groups that you can join and then choose daily or weekly notifications or none at all) — helps to ensure subscribers only receive content they’re interested in and at a frequency they can manage.
If you’re with one of the roughly 40 percent of companies that offer an email preference center, congratulations! You’re already ahead of the curve. (And if you’re not, we can help.)
But do you know how well it’s performing?
I recently worked with a client who wanted to redesign their global preference center with the goal of enhancing subscriber profiles and increasing overall subscribers to 40% of active contacts.
For anyone interested in redesigning their preference center, or just learning how their current preference center is performing, we recommend A/B testing.
First, however, it’s important to establish benchmarks so you have a point of reference. By analyzing historical visitor traffic to your current preference center, along with engagement, you can better predict the number of visitors you’ll see during your A/B test. You’ll also learn whether you can expect enough visitors to achieve statistically significant results.
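One way to sanity-check whether your historical traffic can support a statistically significant test is a standard two-proportion sample-size estimate. Here’s a minimal sketch in Python; the baseline and target submit rates below are hypothetical placeholders, not benchmarks from any real preference center:

```python
from math import ceil, sqrt
from statistics import NormalDist

def visitors_needed(p1, p2, alpha=0.05, power=0.80):
    """Approximate visitors needed *per version* to detect a change in
    form-submit rate from p1 to p2 with a two-sided two-proportion test."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for significance level
    z_beta = z.inv_cdf(power)            # critical value for desired power
    p_bar = (p1 + p2) / 2                # average of the two rates
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Hypothetical example: current submit rate is 10%, and you want to
# reliably detect an improvement to 12%.
print(visitors_needed(0.10, 0.12))
```

If your preference center historically draws fewer visitors per month than this estimate requires per version, the test will need to run longer (or target a larger lift) before the results mean anything.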
When you’re ready to test, make sure you only test one variable at a time so you can more accurately interpret the results.
Test Example #1: Subscriber Profile Completeness
Hypothesis: The less profile information requested, the more preference information provided.
| Version A | Version B |
| --- | --- |
| Show full set of required Profile fields and all Preference information. | Show bare minimum of required Profile fields (e.g., Country, Preferred Language) and all Preference information. |
Test Example #2: Focused Products of Interest
Hypothesis: Subscribers will be more likely to choose preferences if they are initially only shown products relating to the email they received.
| Version A | Version B |
| --- | --- |
| Show Products relevant to the email the subscriber came from. Remaining Products should be collapsed, with the ability to expand them if desired. | Show all Products. |
Note: Do not provide one large list with checkboxes. Products should be grouped visually into logical groups to make it easier for the prospect to identify them. Also, clearly indicate if they can make multiple selections.
Measuring the results
For each test, be sure to evaluate the following topline metrics:
- Preference Center form submits by version (this will determine your winner!)
- Overall Profile completion by volume of Preference selections
- Preference Center landing page abandonment rate
And for an even better understanding, dig into the minutiae:
- Landing page visits (how many came?)
- Abandonment stage and rate (how many left?)
- Landing-to-completion rate (what percentage actually submitted the form?)
- Global opt-out rate (how many decided to unsubscribe?)
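All of these metrics reduce to simple ratios over your raw counts, and the winner on form submits can be confirmed with a two-proportion z-test. Here’s a sketch under made-up numbers; every count below is a hypothetical placeholder:

```python
from math import sqrt
from statistics import NormalDist

def funnel_metrics(visits, submits, opt_outs):
    """Compute completion, abandonment, and opt-out rates from raw counts."""
    return {
        "landing_to_completion": submits / visits,
        "abandonment_rate": (visits - submits) / visits,
        "global_opt_out_rate": opt_outs / visits,
    }

def submit_rate_p_value(visits_a, submits_a, visits_b, submits_b):
    """Two-sided two-proportion z-test: is the submit-rate gap between
    Version A and Version B bigger than chance would explain?"""
    p_a, p_b = submits_a / visits_a, submits_b / visits_b
    pooled = (submits_a + submits_b) / (visits_a + visits_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical counts: Version A vs. Version B
print(funnel_metrics(1200, 540, 36))
print(submit_rate_p_value(1200, 540, 1180, 610))  # below 0.05 means significant
```

A p-value under your chosen threshold (commonly 0.05) means the difference in submit rates is unlikely to be noise, and you can declare a winner with some confidence.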
Once you’ve completed A/B testing, analyze the results and use this information to design your new and improved Preference Center. But don’t stop there. Follow up with a simple subscriber survey to evaluate its success, which you can do in the following ways:
- Add the survey to the preference center confirmation page
- Email those who submitted preferences, but did not globally opt out
- Email those who provided an email address, but did not submit their preferences
The responses may be subjective, but it’s still good data to have. Keep it simple by asking something like, “Did you accomplish/find what you were looking for?” It’s okay to ask more than one question, but keep it short and sweet. And encourage completion by offering a small gift in return, such as a $5 gift card.
Well-designed preference centers help improve database health and list segmentation, both of which can enhance targeting, engagement, and ultimately revenue. Is yours working hard enough for you? If not, we can help!
Carolyn Acker is a marketing operations and customer experience management professional. As a DemandGen Sr. Project Manager, she drives consistent project management practices across all DemandGen products to create a unified and effective customer experience. In her previous role as a DemandGen Campaign Manager, Carolyn provided a strategic campaign framework for our clients to help them develop, implement, and enforce workflow processes. She has extensive knowledge of campaign generation, execution tactics, and best practices that help clients achieve campaign success. Other articles by Carolyn Acker: Campaign Process Optimization: A Deep-Dive into Key Success Factors