
SYNCHRONY CONNECT Making an Impact with Direct Marketing Test and Learn Strategies that Maximize ROI

Word on the street is that direct marketing is all but dead. Digital and social media marketing have taken over, with many saying that direct marketing will soon fade into the sunset. To the contrary, the data shows that the demise of direct marketing has been greatly exaggerated. According to the Data & Marketing Association, marketers still spent nearly $50 billion on direct mail in 2016. That spend is not coming from just a few organizations: nearly 60% of companies said they used direct mail for promotional marketing activities, and 87% of organizations leveraged email for promotional marketing activities.¹

As a result, companies conducting direct marketing campaigns need to carefully consider the impact of their investments while identifying best practices to measure and quantify each campaign's return on investment (ROI). The ability to measure ROI was cited as one of the significant barriers to the adoption of data-driven marketing in a survey conducted by Ascend2 and Research Partners.¹

Let's say your new marketing analyst has a great idea for a marketing campaign. Instead of offering a 10% off coupon on a $50 spend, let's offer a $10 off coupon on purchases of $50 or more. This will save the company money because most customers spend more than $50, and it will drive traffic to the store. And let's say it drives $1 million in sales. Great, we have a winner, right?

But hold on. How much would customers have spent with the old campaign? How much would the same customers have spent if they didn't get an offer at all? Setting up a proper test-and-learn campaign structure helps answer these key questions and ensures a company gets the most out of its marketing dollars.

To effectively measure the impact of direct marketing initiatives, each campaign should be designed so its impact can be isolated and analyzed relative to impacts brought on by other factors. A properly designed campaign will generate critical insights about:

1. Customer: the customers who respond to the campaigns.
2. Channel: the channels that drive response.
3. Creative: the creative and messages that prompt customers to action.
4. Offer: the offers that drive the biggest returns.

Given ever-changing budgets and priorities, an organization needs to understand each element to identify which ones have the greatest impact on results. Let's look at two structures that are used to test the effectiveness of a marketing campaign:

Section I: Test vs. Control. This section outlines the design of a direct marketing campaign to assess its direct impact when control groups are available and outside influences are eliminated.

Section II: No Control or Biased Control Group. This section focuses on situations where it is not possible to hold out a control group, or where the selected control group is biased.

Quick definition: A control group comprises potential targets who are not given the marketing offer, so the impact of the campaign can be measured against the non-marketed population.
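To make those "hold on" questions concrete, here is a minimal sketch in Python of how incremental sales and ROI fall out of a test-versus-control readout rather than from gross sales alone. The helper function and all figures below are illustrative assumptions, not numbers from this paper.

# Minimal sketch: incremental lift and ROI from a test vs. control readout.
# All figures are illustrative, not from the paper.

def campaign_readout(test_sales, test_accounts, control_sales, control_accounts,
                     campaign_cost):
    """Compare sales per account in test vs. control to isolate the lift."""
    test_spa = test_sales / test_accounts           # sales per test account
    control_spa = control_sales / control_accounts  # what they'd have spent anyway
    incremental_per_account = test_spa - control_spa
    incremental_sales = incremental_per_account * test_accounts
    roi = (incremental_sales - campaign_cost) / campaign_cost
    return incremental_sales, roi

# A campaign that "drives $1,000,000 in sales" can look far less impressive
# once the control group's spend is subtracted out.
incremental, roi = campaign_readout(
    test_sales=1_000_000, test_accounts=20_000,
    control_sales=230_000, control_accounts=5_000,
    campaign_cost=60_000)
print(f"Incremental sales: ${incremental:,.0f}, ROI: {roi:.0%}")   # $80,000, 33%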

Section I: Test Design

1. Set aside a random control group

Measuring and quantifying results is critical to any marketing campaign. Accurate measurement allows the marketer to understand which customers are responding to which offers over which channels, so future campaigns can be refined. To get an accurate read of a campaign, it's critical to separate a randomly chosen control group, which will not receive the offer, from the test population, which will. Comparing the two groups allows the marketing team to isolate the impact of the direct marketing campaign.

2. Select the right sample size

To enable a statistically significant read on the campaign, selecting the right sample size is a critical step. For example, if 50% of those receiving the direct mail offer shopped and 45% of those not receiving the offer also shopped, it may be difficult to determine whether that difference is statistically significant unless each group is large enough. Broadly speaking, two types of sample size calculators can be used to help size a campaign: one based on the estimated response to the campaign, and the other based on the estimated sales. Either or both methods can be used while designing a marketing campaign (a response-based sizing sketch appears at the end of this section).

3. Ensure true randomization

After the required sample size has been determined and customers for the test and control groups have been chosen, checks need to be performed to ensure the two groups are similar. Test and control groups should be compared on key metrics such as sales and transactions in the months leading up to the campaign, and sales during the same period in prior years (to account for seasonality). The two groups may also need to be compared on other factors, such as tenure and risk profile. If you're working in an industry where the use of demographic attributes is permissible, the analyst should also confirm that the test and control groups have similar demographic profiles (a simple randomization check is sketched after step 5).

4. Set up a universal control group

Universal control groups are typically selected at the beginning of the year, with customers in the group removed from all marketing campaigns for that year. By comparing test customers against the universal control group, analysts can calculate the cumulative impact of all marketing activities. Universal control groups also assist in lift attribution and in calculating the impact of each incremental touch point. While selecting the group, analysts should ensure that its size is appropriate and that it is representative of the organization's customer base.

5. Multiple factors being tested at once? Consider Design of Experiments

The marketing team may want to test the impact of multiple factors at several points in a campaign. For example, they may be interested in testing the impact of offering customers $10 off as compared to 10% off, as well as testing different creatives or different channels. In each case, the number of test and control groups required multiplies rapidly, so the team may run into an inadequate sample size in each cell. Design of Experiments (DOE) is a technique that can be leveraged in such cases to limit the number of groups needed to effectively measure the impact of multiple factors (see the fractional factorial sketch below).
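Step 2 mentions response-based sample size calculators. As a rough sketch of how such a calculator works, the Python below uses the standard two-proportion normal approximation; the 45% and 50% response rates, significance level and power are illustrative assumptions, not figures from this paper.

# Sketch of a response-based sample size calculation for a two-group test.
# Uses the standard normal-approximation formula for comparing two proportions.
from statistics import NormalDist

def sample_size_per_group(p_control, p_test, alpha=0.05, power=0.80):
    """Accounts needed in each group to detect p_test vs. p_control."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_beta = NormalDist().inv_cdf(power)            # desired power
    variance = p_control * (1 - p_control) + p_test * (1 - p_test)
    n = ((z_alpha + z_beta) ** 2 * variance) / (p_test - p_control) ** 2
    return int(n) + 1

# 45% shop without the offer vs. an expected 50% with it (illustrative).
print(sample_size_per_group(0.45, 0.50))   # about 1,560 accounts per group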
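For step 3, one simple randomization check is a t-test on a pre-period metric. The sketch below (Python with SciPy; the simulated data stands in for real pre-period sales) compares pre-period sales across the two groups; similar checks would be repeated for transactions, tenure, risk profile and, where permissible, demographics.

# Sketch of a randomization check on pre-period sales (simulated stand-in data).
from scipy import stats
import numpy as np

rng = np.random.default_rng(7)
test_pre_sales = rng.gamma(shape=2.0, scale=25.0, size=5000)     # stand-ins for
control_pre_sales = rng.gamma(shape=2.0, scale=25.0, size=5000)  # real pre-period data

t_stat, p_value = stats.ttest_ind(test_pre_sales, control_pre_sales,
                                  equal_var=False)  # Welch's t-test
print(f"Test mean: {test_pre_sales.mean():.2f}, "
      f"Control mean: {control_pre_sales.mean():.2f}, p-value: {p_value:.3f}")
# A large p-value (and similar means) is consistent with a clean random split;
# a small p-value suggests the groups already differ before the campaign starts.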
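For step 5, a common DOE device is the fractional factorial design, which tests several factors without fielding every possible combination. The sketch below is one illustrative formulation (the factor names and levels are assumptions, not taken from the paper): a half-fraction of a three-factor design covering offer, creative and channel in four cells instead of eight.

# Sketch of a 2^(3-1) fractional factorial design for three campaign factors.
# The third factor's level is set by the defining relation C = A * B,
# halving the number of cells needed (4 instead of 8).
from itertools import product

factors = {
    "offer":    {-1: "10% off $50", +1: "$10 off $50"},
    "creative": {-1: "creative A",  +1: "creative B"},
    "channel":  {-1: "direct mail", +1: "email"},
}

design = []
for a, b in product((-1, +1), repeat=2):
    c = a * b                        # defining relation: channel = offer x creative
    design.append({"offer": factors["offer"][a],
                   "creative": factors["creative"][b],
                   "channel": factors["channel"][c]})

for i, cell in enumerate(design, 1):
    print(f"Cell {i}: {cell}")
# Main effects of all three factors can still be estimated, at the cost of
# confounding each main effect with a two-factor interaction.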

Section II: Test Measurement Without a Control Group

While it's preferable to have a randomly chosen control group for every campaign, it may not always be possible to hold one out. Examples of these types of scenarios include:

- Analyzing the benefits of a multi-tender loyalty program to get an accurate measure of the program's impact.
- Quantifying the benefit of an e-commerce platform launch.

There also may be cases where a control group was selected but was not large enough, or was corrupted. Under these scenarios, analysts typically perform a pre- versus post-period or prior-year versus current-year analysis. These techniques may not paint an accurate picture of the campaign because of the following gaps:

- They do not factor in the seasonality associated with retail spend.
- Natural attrition behavior is not factored in (e.g., spend among existing customers tends to decrease year over year as some customers migrate away from the brand while those who still shop spend less than they did a year ago).
- They do not adequately address the question of causality versus correlation (e.g., how can the analyst be confident that any observed increase or decrease is due to the marketing initiative and not to other factors, such as changes in merchandise or pricing?).

To enable an accurate read of these types of campaigns, the following techniques can be used, presented in order of increasing complexity.

1. KPI Decile Approach

A simple but effective technique for measuring lift when the control group is biased involves grouping customers in both the test group (those who received the offer or shopped) and the control group (those who did not) into deciles based on spend or another key performance indicator (KPI) in the pre-period, prior to the campaign. You can then compare test customers and control customers within each of these pre-period spend deciles. This allows for an apples-to-apples comparison of the test and control groups and a better read of the impact of the marketing effort, as illustrated in the chart below (and sketched in code after this section).

[Chart: Average Sales per Account (Illustrative). Test: $55.7; Biased Control: $52.6, a +5% lift; Normalized Control: $50.4, a +10% lift.]

In this illustration, measuring against the normalized control shows a 10% lift, compared with the 5% lift measured against the biased control.

2. Greedy Match Algorithm

The KPI decile approach, while simple and effective, matches the test and control groups on only one metric (e.g., sales). The problem might warrant matching test and control groups on more than one metric (e.g., sales, tenure and transaction frequency). The greedy match algorithm is a technique that can be used to match test and control accounts on more than one attribute. Its goal is to produce matched samples with similar characteristics across the test and control groups (a matching sketch follows the KPI decile sketch below).

[Chart: Average In-Store Spend per Account (Illustrative), comparing Test vs. Synthetic Control across the pre-period, the marketing initiative and the post-period, January through the following January.]
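As an illustration of the KPI decile approach, the sketch below (Python with pandas; the column names are hypothetical) bins customers into deciles of pre-period spend, compares average post-period spend between test and control within each decile, and rolls the decile-level differences up into an overall lift.

# Sketch of the KPI decile approach (hypothetical column names).
import pandas as pd

def decile_lift(df):
    """df columns: 'group' ('test'/'control'), 'pre_spend', 'post_spend'."""
    # Decile boundaries are computed on all customers' pre-period spend.
    df = df.copy()
    df["decile"] = pd.qcut(df["pre_spend"], q=10, labels=False, duplicates="drop")

    # Average post-period spend by decile for each group.
    summary = df.pivot_table(index="decile", columns="group",
                             values="post_spend", aggfunc="mean")
    summary["lift"] = summary["test"] - summary["control"]

    # Weight each decile's lift by the number of test customers in it.
    weights = df[df["group"] == "test"].groupby("decile").size()
    overall_lift = (summary["lift"] * weights).sum() / weights.sum()
    return summary, overall_lift

# usage: decile_summary, lift_per_account = decile_lift(customer_df)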
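The greedy match step can be sketched as follows (plain Python; the attribute names are hypothetical, and this is one simple formulation rather than a specific production algorithm): each test account is paired with the nearest unused control account across several standardized pre-period attributes.

# Sketch of a greedy match on multiple standardized attributes.
# Each test account takes the closest remaining control account; once used,
# a control account is removed from the pool (hence "greedy").
import math

def greedy_match(test, control, keys=("pre_sales", "tenure", "txn_freq")):
    """test/control: lists of dicts; requires len(control) >= len(test)."""
    # Standardize each attribute with pooled statistics so that distances
    # are comparable and no single metric dominates the match.
    pooled = list(test) + list(control)
    stats = {}
    for k in keys:
        vals = [a[k] for a in pooled]
        mean = sum(vals) / len(vals)
        std = math.sqrt(sum((v - mean) ** 2 for v in vals) / len(vals)) or 1.0
        stats[k] = (mean, std)

    def z(a):
        return {k: (a[k] - stats[k][0]) / stats[k][1] for k in keys}

    z_test = [z(a) for a in test]
    z_control = [z(a) for a in control]

    available = set(range(len(control)))
    pairs = []
    for i, t in enumerate(z_test):
        # Closest remaining control account by squared distance over all keys.
        best = min(available,
                   key=lambda j: sum((t[k] - z_control[j][k]) ** 2 for k in keys))
        pairs.append((i, best))   # (test index, matched control index)
        available.remove(best)
    return pairs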

3. Propensity Scoring

You can also leverage modeling techniques to form a synthetic, or artificial, control group in situations where a control group is either not available or biased. Using classification algorithms such as logistic regression, the propensity scoring technique scores and matches test and control accounts on pre-period attributes. A propensity scoring model is an alternative to specifying multiple attributes individually when matching customers in the test group to those in the control group (an illustrative sketch appears at the end of this document).

[Table: Lift by segment (Illustrative). High: Test minus Control = $30; Medium: Test minus Control = $15; Low: Test minus Control = $6; Weighted Average lift = $17.]

4. Expected Sales

An alternate approach involves developing a model to predict expected sales (or another KPI) in the post-campaign period. Built on customers who are not part of the test group, the model's intent is to predict what sales (or another KPI) would have totaled without the offer or treatment. The model can either be linear, with the post-period sales of the control group as the dependent variable, or formulated so that control customers with high sales are classified as responders. The independent attributes are the customers' behaviors in the pre-campaign period. The resulting model is then scored against the test population to develop baseline sales for each customer in the test group. Each customer's actual sales are then compared against that baseline to calculate the lift due to the campaign (a companion sketch also appears at the end of this document).

Conclusion

Many direct marketing professionals spend a great deal of time and effort designing the marketing campaign that will be most attractive to their customers. But it is the setup of a proper test-and-control structure that determines which campaign has the most impact on overall results, and whether a campaign is truly impactful and delivers the best ROI to the business. Setting up the right test-and-learn structure starts with identifying a representative control group with an adequate sample size, which is critical to measuring the true lift of the campaign. Universal control groups are helpful when seeking to quantify the cumulative impact of all marketing initiatives, while DOE techniques allow marketing teams to quantify the impact of various offers, channels and creatives. In some cases it is not possible to hold out a control group; in those situations, or when a control group is biased, the analytical and modeling techniques presented in this paper can be leveraged to help measure campaign results.

¹ Data & Marketing Association Statistical Factbook, 2017.

For more information or to connect with an expert, contact us at synchronyconnect@synchronyfinancial.com.

Synchrony Connect is a value-added program that lets partners tap into our expertise in areas beyond credit. It offers knowledge and tools that can help you grow, lead and operate your business.

synchronyfinancial.com

This content is subject to change without notice and offered for informational use only. You are urged to consult with your individual business, financial, legal, tax and/or other advisors with respect to any information presented. Synchrony Financial and any of its affiliates (collectively, "Synchrony") make no representations or warranties regarding this content and accept no liability for any loss or harm arising from the use of the information provided. Your receipt of this material constitutes your acceptance of these terms and conditions.

0717-138
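As a closing illustration of the propensity scoring technique described in Section II, the sketch below (Python with scikit-learn; the feature names are hypothetical and this is one possible formulation, not Synchrony's implementation) fits a logistic regression on pre-period attributes and pairs each test account with the control account whose propensity score is closest.

# Sketch: propensity scores from logistic regression, then 1:1 matching.
# df is a pandas DataFrame; feature names are hypothetical pre-period attributes.
from sklearn.linear_model import LogisticRegression

def propensity_match(df, features=("pre_sales", "tenure", "txn_freq")):
    """df columns: the features plus 'treated' (1 = test, 0 = control)."""
    model = LogisticRegression(max_iter=1000)
    model.fit(df[list(features)], df["treated"])
    df = df.assign(pscore=model.predict_proba(df[list(features)])[:, 1])

    test = df[df["treated"] == 1]
    control = df[df["treated"] == 0]   # assumed at least as large as test

    matches, used = [], set()
    for _, row in test.iterrows():
        gaps = (control.loc[~control.index.isin(used), "pscore"]
                - row["pscore"]).abs()
        j = gaps.idxmin()                  # nearest unused control account
        used.add(j)
        matches.append((row.name, j))      # (test index, matched control index)
    return df, matches

# Post-period KPIs of the matched control accounts then serve as the synthetic
# control against which test-account lift is measured.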
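The expected sales approach can be sketched in the same spirit (Python with scikit-learn; variable names are hypothetical, and the linear formulation shown is just one of the two options the paper mentions): a model trained only on control customers predicts baseline post-period sales, and each test customer's actual sales are compared against that baseline.

# Sketch: baseline ("expected") sales model trained on control customers only.
import numpy as np
from sklearn.linear_model import LinearRegression

def expected_sales_lift(control_pre, control_post, test_pre, test_post):
    """Inputs: pre-period feature matrices and post-period sales arrays."""
    baseline_model = LinearRegression()
    baseline_model.fit(control_pre, control_post)   # learn "no-offer" behavior

    expected = baseline_model.predict(test_pre)     # baseline sales per test customer
    lift = np.asarray(test_post) - expected         # actual minus expected
    return lift.sum(), lift.mean()

# usage (arrays would come from the campaign data):
# total_lift, avg_lift = expected_sales_lift(X_control_pre, y_control_post,
#                                            X_test_pre, y_test_post)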