Does it make sense to A/B Test Tweets?
Posted on Monday, May 27th, 2013 by Nancy Huynh
If you’re reading this, you very likely know what A/B testing is. If you don’t, this article should catch you up nicely.
Most commonly, A/B tests are run on website designs and email marketing (e-casts) with the goal of increasing conversions. I wanted to find out whether it makes sense to A/B test tweets. As a test case, my goal is to increase conversions for a subscription database. The difference with tweets is that you can’t set the test up once and then simply collect the conversion data: an email gets sent out once, and a website design gets tweaked once before you wait to see whether it generates conversions. Neither scenario fits the nature of the twitterverse.
With tweets, I would have to send out the same tweet repeatedly over a specified timeframe. The life of a tweet is not very long, even when you account for potential retweets and favourites. And if you send out the same tweet a number of times, will your followers be annoyed?
For this experiment, I’m going to bring up the model for conducting chemistry experiments I learned in the 11th grade. Who knew that “Chemistry 11” would somehow be useful in my life?
The Question: Which of two tweets will increase subscription conversions?
The Control: I will use one of my old tweets that asked followers to subscribe to my mailing list.
The Method:
- Write a second tweet with a noticeably different call-to-action style, and tag it with a different Google Analytics campaign link so I can actually track each version’s conversions properly.
- Based on the current subscription rate, I would have to run the test for at least a month. I will send out both tweets twice a day at different times of day; that way I’m unlikely to annoy followers, since nobody wants to see the same tweet every hour. My aim is to collect at least 30 conversions by the end of the experiment; anything fewer probably won’t tell me anything statistically significant.
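To keep the two tweets distinguishable in Google Analytics, each one needs its own campaign-tagged link. Here’s a minimal sketch of how those links can be built, using Python’s standard library; the landing-page URL and parameter values are hypothetical, not the ones from my actual experiment:

```python
from urllib.parse import urlencode

BASE_URL = "https://example.com/subscribe"  # hypothetical landing page

def utm_link(variant):
    """Build a UTM-tagged URL so Google Analytics can attribute
    conversions to a specific tweet variant."""
    params = {
        "utm_source": "twitter",
        "utm_medium": "social",
        "utm_campaign": "tweet_ab_test",  # one campaign for the whole experiment
        "utm_content": variant,           # distinguishes tweet A from tweet B
    }
    return BASE_URL + "?" + urlencode(params)

print(utm_link("tweet_a"))
print(utm_link("tweet_b"))
```

Both links then show up under the same campaign in Google Analytics, with `utm_content` separating the two variants.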
You might be thinking: timing will probably influence the conversion rate as well. I agree, but from my understanding, when you change more than one variable at a time, the data becomes less conclusive for answering your original question. I learned that in “Statistics 110” in undergrad! So for this experiment, I’m not going to measure the effect of tweet timing. Wish me luck, and check back at the end of June to see how my experiment went.
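Once both tweets have run for the month, the two conversion rates can be compared with a standard two-proportion z-test. Here’s a sketch using only the standard library; the click and conversion counts below are made-up numbers for illustration, not real results:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is tweet A's conversion rate
    different from tweet B's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical results: conversions and total clicks per variant
z = two_proportion_z(conv_a=22, n_a=400, conv_b=12, n_b=410)
print(round(z, 2))  # |z| > 1.96 would suggest a real difference at the 5% level
```

This is also where the “at least 30 conversions” target comes from: with too few conversions, the standard error swamps any difference between the two rates and the test can’t tell the tweets apart.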
Have you done A/B Testing with tweets? How did it turn out?