Welcome back to our Client Spotlight, where we highlight some of the amazing work that ShareProgress clients have done, and where they get a chance to share testing tips, best practices, and strategies. This week, we chatted with Bill Gordon, Digital Director for Compassion & Choices. He told us what Compassion & Choices has learned about working with older readers and bigger font sizes, how to prioritize your testing, and one killer piece of advice for someone just starting a testing program.
Are there any tests you’ve run that stand out in some way (different results than expected, very definitive results, confirmed something you were wondering about, etc.)?
One of the first things we found when we started our testing program remains the most significant. We knew that our user base tended to be older, and we had some anecdotal evidence supporting the theory that the font size in our emails was too small. Granted, this was a standard 12-point font, but we wanted to see if making it larger would have a positive effect on click-throughs and end-of-chain actions. So we started testing. We tested 12 vs. 14, and 14 won. We tested 14 vs. 16, and 16 won. We considered testing 16 vs. 18, but our art director almost had a heart attack when we suggested it, so we took the gains we’d seen and standardized on a 16-point font for our entire email program. It has been working ever since.
Are there a few best practices that have emerged from running a robust testing/analytics program?
It’s important to remember that testing everything isn’t always the best use of your time. In smaller programs, where you might have to make hard decisions about staff allocation, testing every whim can result in lost productivity. We keep a long-term list of things we’d like to test and rank them by how hard it will be to get meaningful results and how much time it takes to set the test up. We go for the low-hanging fruit first and work our way up to more complex tests down the road. It’s also important to know where your organization’s boundaries are. If you’re testing language that is antithetical to your brand, then even if it wins you’re not going to be able to convince the organization to adopt it wholesale. Test within your brand and push the boundaries, but don’t create tests whose findings have no chance of being adopted.
What has your organization’s experience been with testing social share language?
Social testing is the latest analytical tool we’ve added at C&C. It wouldn’t have been possible without ShareProgress. One of the really interesting things we’ve found came from testing various lengths of Facebook share posts. Initially we thought that short and sweet would be better, but we’ve consistently seen posts of more than 50 words perform better. I think this is because when you’re prompting someone to share, they want to feel like they are putting something out into the world that has some weight and substance to it, especially if it’s going out under their name. Also interesting is the difference we’ve seen in testing first-person vs. third-person possessives in our share copy. Share posts written in the first person don’t perform as well as those written without first-person pronouns. We think this is because, while people want to share great content, trying to speak in their voice turns them off from taking action.
The winning Facebook image version in a test of longer and shorter descriptions.
What’s one piece of advice that you would give someone just starting out running testing for their organization?
Start small. Rack up some wins. Best practice tests are often much more useful to an organization than subject line tests. Always go into a meeting willing to test bad ideas that are floated. Always respond with data. Your greatest strength as someone running a testing program is that you can be “right” once you have significant data to support a conclusion, so leverage that internally to build the program you want.
Also, try not to get personally attached to your existing best practices. If you look at the state of email in the political sphere right now, there are hundreds of email writers throwing their hands up in the air because testing revealed that big bold type with red fonts and yellow highlights sometimes performs better than emotional, story-driven emails. Maybe this will be true for your organization too, maybe it won’t, but don’t fight the data.
Are you running awesome tests on ShareProgress? We knew it! We’d love the chance to highlight the kickass work that you’re doing. Shoot an email to anna@shareprogress.org to talk about featuring your organization in an upcoming Client Spotlight post.