
Thursday, May 07, 2009

Survey: 27% of Marketers Suck

A survey conducted by the Committee to Determine the Intelligence of Marketers (CDIM), an independent think-tank in Princeton, NJ, recently found that:

· 4 out of 5 respondents feel that marketing is a “dead” profession.
· 60% reported having little if any respect for the quality of marketing programs today.
· Fully 75% of those responding would rather be poked with a sharp stick straight into the eye than be forced to work in a marketing department.

In total, the survey panel reported a mean response of 27% when asked, “on a scale of 0% to 100%, how many marketers suck?”

This has been a test of the emergency BS system. Had this been a real, scientifically-based survey, you would have been instructed where to find the nearest bridge to jump off.

Actually, it was a “real” “survey”. I found five teenagers loitering around the casual restaurant chain in a local shopping mall and asked them a few questions. Seem valid?

Of course not. But this one was OBVIOUS. Every day we marketers are bamboozled by far more subtle “surveys” and “research projects” which purport to uncover significant insights into what CEOs, CFOs, CMOs, and consumers think, believe, and do. Their headlines are written to grab attention:

- 34% of marketers see budgets cut.
- 71% of consumers prefer leading brands when shopping for .
And my personal favorite:
- 38% of marketers report significant progress in measuring marketing ROI, up 4% from last year.

Who are these “marketers”? Are they representative of any specific group? Do they have anything in common except the word “marketing” on their business cards?

Inevitably such surveys blend convenience samples (i.e., those willing to respond) of people from the very biggest billion-dollar-plus marketers to the smallest $100k annual budgeteers. They mix those with advanced degrees and 20 years of experience in with those who were transferred into a field marketing job last week because they weren’t cutting it in sales. They commingle packaged-goods marketers with those selling industrial coatings and others providing mobile dog grooming.

If you look closely, the questions are often constructed in somewhat leading ways, and the inferences drawn from the results conveniently ignore the statistical margins of error, which frequently wash away any actual findings whatsoever. There is also a strong tendency to draw year-over-year conclusions when the only thing in common from one year to the next was the survey sponsor.

As marketers, we do ourselves a great disservice whenever we grab one of these survey nuggets and embed it into a PowerPoint presentation to “prove” something to management. If we’re not trustworthy when it comes to vetting the quality of research we cite, how can we reasonably expect others to accept our judgment on subjective matters?

So the next time you’re tempted to grab some headlines from a “survey” – even one done by a reputable organization – stop for a minute and read the fine print. Check to see if the conclusions being drawn are reasonable given the sample, the questions, and the margins of error. When in doubt, throw it out.
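As a quick sanity check on that fine print, here is a minimal back-of-the-envelope sketch in Python. It uses the standard normal approximation for the margin of error on a reported proportion; the 38% figure and the sample sizes are invented for illustration, not taken from any actual survey.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a reported proportion p
    drawn from a simple random sample of size n (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical headline: "38% of marketers report progress"
for n in (5, 100, 400, 1000):
    moe = margin_of_error(0.38, n)
    print(f"n={n:>4}: 38% +/- {moe * 100:.1f} points")
```

With a sample of a few hundred respondents, a year-over-year move of 4 points sits comfortably inside the noise, and that is before accounting for the fact that a convenience sample isn’t random in the first place.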

If we want marketing to be taken seriously as a discipline within the company, we can’t afford to let the “marketers” play on our need for convenience and simplicity when reporting “research” findings. Our credibility is at stake.

And by the way, please feel free to comment and post your examples of recent “research” you’ve found curious.

Tuesday, March 24, 2009

It's All Relative

Brilliant strategy? Check.

Sophisticated analytics? Check.

Compelling business case? Check.

Closing that one big hole that could torpedo your career? Uhhhhhhh.......

Most new marketing initiatives fail to achieve anything close to their business-case potential. Why? Unilateral analysis, or looking at the world only through your own company's eyes, as if there were no competition.

It sounds stupid, I know, yet most of us perform our analysis of the expected payback on marketing investments without even imagining how competitors might respond and what that response would likely do to our forecast results. Obviously, if we do something that gets traction in the market, they will respond to prevent a loss of share in volume or margin. But how do you factor that into a business case?

Scenario planning helps. Always "flex" your business case under at least three possible scenarios: A) competitors don't react; B) competitors react, but not immediately; C) competitors react immediately. Then work with a group of informed people from your sales, marketing, and finance groups to assess the probability of each of the three possibilities, and weight your business case outcomes accordingly.
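As a rough illustration of that weighting, a probability-weighted business case might look something like the sketch below. The probabilities and payback figures are hypothetical placeholders, not a recommended model.

```python
# Probability-weighted business case across competitive-response scenarios.
# All figures and probabilities are hypothetical placeholders.
scenarios = {
    "A: no competitive reaction":        {"prob": 0.20, "payback": 5.0},  # $M incremental
    "B: delayed competitive reaction":   {"prob": 0.50, "payback": 3.0},
    "C: immediate competitive reaction": {"prob": 0.30, "payback": 1.0},
}

# Probabilities should sum to 1 before weighting the outcomes.
assert abs(sum(s["prob"] for s in scenarios.values()) - 1.0) < 1e-9

expected_payback = sum(s["prob"] * s["payback"] for s in scenarios.values())

for name, s in scenarios.items():
    print(f"{name}: P={s['prob']:.0%}, payback=${s['payback']:.1f}M")
print(f"Probability-weighted payback: ${expected_payback:.1f}M")
```

In this made-up example the weighted payback is $2.8M, well below the $5M "no reaction" case you might otherwise be tempted to present.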

If you want to be even more thorough, try adding other dimensions of "magnitude" of competitive response (low/proportionate/high) and "effectiveness" of the response (low/parity/high) relative to your own efforts. You then evaluate eight to twelve possible scenarios and see more clearly the exact circumstances under which your proposed program or initiative has the best and worst probable paybacks. Then, if you decide to proceed, you can set in place listening posts to get early warning of your competitors' reactions and hopefully stay one step ahead.

In the meantime, your CFO will be highly impressed with your comprehensive business-case acumen. Check.

Tuesday, January 06, 2009

Research Priorities Are All Wrong

I got an email today from the Marketing Research Association spelling out the "top 6 issues for protecting the profession". Included were:

  1. Increasing difficulty of reaching consumers via cell phones
  2. Consumer fears of behavior tracking
  3. State and federal government interest in shady incentive practices used to entice medical professionals
  4. Unpopularity of "robocalls" and automated dialing
  5. Public backlash to "push-polls"
  6. Data security and breach protocols

Wrong.

While each of these dynamics is a threat to the future of the research industry, the bigger threat is the increasing irrelevance of research to senior management. More and more companies have outsourced their strategic marketing research functions to suppliers. The suppliers have been consolidating, often being acquired by bigger agency or marketing services holding companies. Not surprisingly, there is a serious degradation of objectivity that occurs in the process. And the more junior marketers now left client-side to direct the research program within their companies are not generally as politically senior/influential as one needs to be to push through the right research agenda - especially in times of immense cost-cutting pressure. (see Rebuilding Trust in Research as a Measurement Tool)

Sure, there are many executional threats facing the research industry today. But unless research is conceived in an appropriate strategic/financial context and prioritized for the value it potentially holds, the methodological threats will be but ice cubes floating in an ocean of icebergs.

It's time the research profession rose to the occasion again. I hope it does.

Wednesday, September 19, 2007

Knowing Is Believing

Now that 2008 budget season is upon us, it’s time to identify knowledge gaps in the assumptions underlying your marketing plan – and to lay out (and fund) a strategy for filling them.

We recently published a piece in MarketingNPV Journal which tackles this issue. In “Searching for Better Planning Assumptions? Start with the Unknowns” we suggested:

A marketing team’s ability to plan effectively is a function of the knowns and the unknowns of the expected impact of each element of the marketing mix. Too often, unfortunately, the unknowns outweigh the hard facts. Codified knowledge is frequently limited to how much money lies in the budget and how marketing has allocated those dollars in the past. Far less is known (or shared) about the return received for every dollar invested. As a result, marketers are left to fill the gaps with a mix of assumptions, conventional wisdom, and the occasional wild guess – not exactly a combination that fills a CMO with confidence when asked to recommend and defend next year’s proposed budget to the executive team.


Based on our experience and that of some of our CMO clients, we offer a framework to help CMOs get their arms around what they know, what they think they know, and what they need to know about their marketing investments. The three steps are:

1. Audit your knowledge. The starting point for a budget plan comes in the form of a question: What do we need to know? The key is to identify the knowledge gaps that, once filled, can lessen the uncertainty around the unknown elements, which will give you more confidence to make game-changing decisions.

2. Prioritize the gaps. For each gap or unanswered question, it’s important to ask how a particular piece of information would change the decision process. It might cause you, for example, to completely rethink the scope of a new program, which could have a material impact on marketing performance (a rough way to score gaps on this dimension is sketched after the list).

3. Get creative with your testing methods. Marketers have many methods for filling the gaps at their disposal; some are commonly used, others are underutilized. The key is determining the most cost-effective methods – from secondary research to experimental design techniques – to gather the most relevant information.
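To make the prioritization step concrete, here is one simple, hypothetical way to score and rank gaps by decision impact, current uncertainty, and cost to fill. The gap names, scores, and dollar figures are invented for illustration; the Journal piece doesn’t prescribe a formula.

```python
# Hypothetical sketch of step 2 ("prioritize the gaps"): rank knowledge gaps
# by how much a better answer could change the plan, how uncertain we are
# today, and how much it would cost to find out. All values are placeholders.
from dataclasses import dataclass

@dataclass
class KnowledgeGap:
    name: str
    decision_impact: int   # 1-5: how much would the answer change our plan?
    uncertainty: int       # 1-5: how little do we know today?
    cost_to_fill: float    # $K estimate to research the question

    @property
    def priority(self) -> float:
        # Higher impact and uncertainty raise priority; higher cost lowers it.
        return (self.decision_impact * self.uncertainty) / self.cost_to_fill

gaps = [
    KnowledgeGap("Lift from sponsorship spend", 4, 5, 150),
    KnowledgeGap("Email frequency saturation point", 3, 4, 25),
    KnowledgeGap("Price elasticity in segment X", 5, 4, 80),
]

for gap in sorted(gaps, key=lambda g: g.priority, reverse=True):
    print(f"{gap.name}: priority score {gap.priority:.2f}")
```

However you weight the factors, the point is to make the ranking explicit so the cheapest, highest-impact unknowns get funded first.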

Don’t let the unknowns persist another year. Find ways to identify them, prioritize them, and fund some exploratory work so you’re legitimately smarter when the next planning season rolls around.