November 17, 2011 by George Eberstadt
This is such an important validation of the effectiveness of social merchandising that, if we’d thought of it, we would have commissioned a market research firm to write this study for us. But, even better, it’s actually a peer-reviewed article produced by a team of university marketing professors and published in the journal of the American Marketing Association, the Journal of Marketing Research. It’s titled: Online Social Interactions: A Natural Experiment on Word of Mouth Versus Observational Learning. (There’s also a nice write-up and interview with the lead author on Red Orbit.)
The findings are straightforward: Online, as in the physical world, people are more likely to buy things that they see other people bought. There’s no word of mouth here. This isn’t about customer ratings and reviews. This is just about seeing the purchases of other people. The merchandising lessons are simple:
- You can improve conversion rates by showing shoppers that other people have really bought a product (on the product detail page)
- You can encourage consideration by showing the purchases other shoppers made (in your product discovery/recommendation/cross-sell merchandising)
The study looked at a period when Amazon put up and took down the “what other people bought” section on its digital camera product pages, measuring the effect that having (or not having) this information had on sales. Using these data, the authors found:
The authors observe a herd behavior among consumers when the OL or sales information is positive, but surprisingly, they observe no herd behavior when consumers face negative OL or sales information. [OL stands for “Observational Learning”, which in this case means “seeing what other people bought”.]
In other words: when shoppers saw that other people were buying a particular item, they became more likely to buy it. But if an item didn’t have peer-purchase information, that absence didn’t hurt sales. So you don’t need sales coverage for your whole catalog – show purchase information where you’ve got it, and don’t worry about it where you don’t.
Here are a couple of examples of stores using tools that deliver the OL effect. For lesson #1 (on the product detail page, showing shoppers that other customers are also buying the item), have a look at the 98 check-out comments on these shoes at GoJane (scroll past the Q&A). For lesson #2 (showing products that other customers are buying to encourage consideration and cross-sell), have a look at the “See what your friends bought” tab on the right edge of the window here at emitations. What effect do these tools have on you? Does this sort of merchandising make you feel like buying?
If you want to take advantage of the OL effect to improve your sales, give us a call.
September 27, 2011 by George Eberstadt
[For a downloadable version of this study, click here.]
To date, Q&A on ecommerce sites has been primarily a tag-along application to customer reviews, provided by vendors that specialize in customer reviews. This approach results in a Q&A model that works more like customer reviews than a true social exchange between shoppers and past customers, missing the benefits that a truly social approach to ecommerce Q&A provides.
The key to Social Q&A is that shopper questions should reliably and quickly get answered by real customers, and participants should have the ability to go back-and-forth beyond the initial question, if they choose to. If shopper questions receive customer answers only rarely or after an extended period, the shopper is disappointed and the store has missed the chance to provide a fast reminder to the shopper about the purchase she was considering. Further, getting past customers to share their experience with real shoppers is a great way for stores to keep their relationships with the customer base fresh. The rise of social networks has conditioned people to expect a high level of interactivity from social applications – so if a Q&A tool isn’t providing that, it’s not really Social.
On many online stores’ Q&A systems, we’ve observed that most answers come from store staff. That can be an OK supplement to social answers (especially if the staff are really experts), but the store may be better off directing those questions to a live chat or phone line so the staff can interact with the shopper in real time. And if a shopper wants to know something subjective – like how the product held up after 3 months, or how it felt, or just if it’s really as fabulous as they hope it is(!) – they may only want an answer from someone like them who really bought the item. A Q&A system that relies heavily on staff answers also isn’t really Social.
That’s why TurnTo created an approach to Q&A for ecommerce that reliably provides a true Social experience – multiple, fast answers from real purchasers with continuing back-and-forth dialog. To measure the difference between the TurnTo approach and that provided by the leading customer reviews vendors, Bazaarvoice and PowerReviews, we conducted a simple test. We asked 16 shopper questions on a range of sites with Q&A powered by TurnTo and these other vendors, and we tracked how long it took for the answers to arrive. Here are the aggregated results:
Methodology: In our test design, we tried to keep the playing field level:
- We asked general questions that could easily be answered by anyone with experience with the product.
- We asked the identical question about identical products wherever possible. Where that was not possible, we picked featured items on the Bazaarvoice and PowerReviews sites likely to have high traffic and to have been purchased many times (no new-arrival items were used).
- We picked sites where the Bazaarvoice and PowerReviews Q&A tools were implemented in a highly visible way on the page. That meant the PowerReviews and Bazaarvoice sites were not always the largest in each vertical (in particular, in the photo gear category), but the Bazaarvoice and PowerReviews sites generally had far more traffic than the TurnTo sites, both individually and in aggregate.
- We checked the item page where each question was asked at exactly the specified intervals and counted posted answers.
- We also provided our email address with each question and counted answers received by email. (The Bazaarvoice and PowerReviews stores often emailed answers well before those answers appeared on the sites, in some cases even before the questions appeared on the sites.)
- None of the sites were alerted in any way about this test.
- All questions were submitted on Wednesday, August 10, 2011 between 9am and 11am Eastern time.

Here were the test sites that we used:
On each site, we asked 4 questions. So in total, we asked 16 questions per vendor. Here are the details of the answers received, by individual site. (All numbers are for social answers – answers from customers – except those in parentheses, which are answers from store staff.)
Staff answers: We also tracked answers from store staff. These are shown in parentheses in the table above. At the end of the two week test period, the questions on PowerReviews sites received a total of 10 staff answers vs 7 social answers. The questions on Bazaarvoice sites received a total of 5 staff answers vs 9 social answers. No staff answers were received on the TurnTo sites – note that 15 out of 16 questions on TurnTo sites received at least 1 social answer within 24 hours.
We encourage you to try this test for yourself.
The raw data: Here are the URLs for all the item pages for all questions in the test. The asker is “Andrew P”, “Andrew RP” or “Anonymous” – also look for a submit date of August 10th where that is shown. Note that on the Bazaarvoice and PowerReviews sites, we counted answers received by email, even though some of those answers – in some cases, even the questions – were not posted on the site by the end of the test period.
Sierra Trading Post (PowerReviews)
Johnston & Murphy (PowerReviews)
Abes of Maine (PowerReviews)
Bass Pro Shop (Bazaarvoice)
Cameras Direct (Bazaarvoice)
Bazaarvoice is a registered trademark of Bazaarvoice, Inc. PowerReviews is a registered trademark of PowerReviews, Inc.