Portland Marketing Analytics (PortMA) | Portland, Maine

Reviewing Past Research: What Worked Last Time?

Tactical vs. Strategic Data

We're gearing up to launch an experiential research project for a spirits brand we've researched for the past couple of years. One of the areas we emphasize in the project launch process is how to add more value to the analysis than in years past, so the content stays fresh year after year. Two easy questions in our project launch process get the ball rolling toward adding value.

Part one of this blog post will cover the first key question: What worked last time?

Common sense says that any practices that generated favorable results in years past should be repeated and improved upon the next. I took some time to review the 2013 spirits project and identified some best practices to apply in 2014.

Reviewing last year’s consumer profile

First, the field teams effectively collected a balanced number of surveys from consumers who sampled the brand and from those who did not, so we could measure the direct impact of the campaign.

For instance, we found 71% of consumers who had never heard of the brand and who sampled at the event said they would purchase the product for their home, compared to just 28% of newly educated consumers who did not sample. If the teams had collected too many surveys from one group or the other, those results would be less robust.
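The comparison above is simple arithmetic: the share of likely purchasers among samplers versus non-samplers. A minimal sketch, using hypothetical survey tallies (the counts below are illustrative and assume 100 respondents per group, not the actual 2013 data):

```python
# Hypothetical survey tallies -- illustrative only, not the actual 2013 data
surveys = {
    "sampled":     {"likely_to_purchase": 71, "total": 100},
    "not_sampled": {"likely_to_purchase": 28, "total": 100},
}

def purchase_intent_rate(group):
    """Share of respondents in a group who said they'd buy the product."""
    return group["likely_to_purchase"] / group["total"]

# "Lift" attributable to sampling: samplers' rate minus non-samplers' rate
lift = purchase_intent_rate(surveys["sampled"]) - purchase_intent_rate(surveys["not_sampled"])
```

With roughly balanced group sizes, the lift (here, 43 percentage points) is a direct read on the sampling experience itself; a lopsided sample would make one side of the subtraction much noisier than the other.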

How did impact vary by venue?

Second, we segmented the results by the type of venue at which each event was activated.

Differences in impact between consumers attending on- and off-premise events are very important in the spirits industry, but segmenting within those two categories provides even more insight.

Off-premise events were split into liquor stores, grocery stores, and other venues. We found that the most significant campaign impact occurred at liquor stores, where 94% of those who sampled said they were likely to purchase a bottle for home, versus 57% of those who did not sample.

Key drivers of ROI

Third, we calculated the ROI of this program in 2013 and provided a sensitivity analysis of the ROI by on-premise and off-premise results.

Overall, the program was in the black with an ROI of 227%, but what were the drivers of that figure?

Breaking down the ROI into separate figures for on-premise and off-premise data indicated that off-premise activity was the key driver. Consumers left off-premise events more likely to purchase the brand than those who attended on-premise events.

Since costs were close to the same for on- and off-premise activations in 2013, we concluded that off-premise events were more successful and that the teams should develop best practices based on how they executed in that venue.
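The channel breakdown above reduces to the standard ROI formula, profit over cost, computed once overall and once per channel. A minimal sketch with hypothetical cost and attributed-revenue figures (chosen so the overall number matches the 227% in the post, with equal channel costs as described):

```python
# Hypothetical figures: equal channel costs, revenue chosen so overall ROI = 227%
channels = {
    "on_premise":  {"cost": 50_000, "attributed_revenue": 120_000},
    "off_premise": {"cost": 50_000, "attributed_revenue": 207_000},
}

def roi(cost, revenue):
    """ROI as a percentage: profit relative to cost."""
    return (revenue - cost) / cost * 100

# Overall program ROI across both channels
overall = roi(
    sum(c["cost"] for c in channels.values()),
    sum(c["attributed_revenue"] for c in channels.values()),
)

# Per-channel ROI shows which side of the program drives the blended figure
per_channel = {name: roi(c["cost"], c["attributed_revenue"]) for name, c in channels.items()}
```

Because costs are equal across channels here, the blended ROI is just the average of the two channel figures, and the gap between them is exactly the sensitivity the analysis is meant to expose.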

That’s what worked last time. Part two will cover areas of improvement.

Photo Source: https://www.flickr.com/photos/37676753@N08/
