
Thursday, May 07, 2009

Survey: 27% of Marketers Suck

A survey conducted by the Committee to Determine the Intelligence of Marketers (CDIM), an independent think tank in Princeton, NJ, recently found that:

- 4 out of 5 respondents feel that marketing is a “dead” profession.
- 60% reported having little if any respect for the quality of marketing programs today.
- Fully 75% of those responding would rather be poked straight in the eye with a sharp stick than be forced to work in a marketing department.

In total, the survey panel reported a mean response of 27% when asked, “on a scale of 0% to 100%, how many marketers suck?”

This has been a test of the emergency BS system. Had this been a real, scientifically based survey, you would have been instructed where to find the nearest bridge to jump off.

Actually, it was a “real” “survey”. I found five teenagers loitering outside the casual restaurant chain at a local shopping mall and asked them a few questions. Sound valid?

Of course not. But this one was OBVIOUS. Every day we marketers are bamboozled by far more subtle “surveys” and “research projects” which purport to uncover significant insights into what CEOs, CFOs, CMOs, and consumers think, believe, and do. Their headlines are written to grab attention:

- 34% of marketers see budgets cut.
- 71% of consumers prefer leading brands when shopping for [product category].
And my personal favorite:
- 38% of marketers report significant progress in measuring marketing ROI, up 4% from last year.

Who are these “marketers”? Are they representative of any specific group? Do they have anything in common except the word “marketing” on their business cards?

Inevitably such surveys blend convenience samples (i.e., those willing to respond) of people from the very biggest billion-dollar-plus marketers to the smallest $100k annual budgeteers. They mix those with advanced degrees and 20 years of experience with those who were transferred into a field marketing job last week because they weren’t cutting it in sales. They commingle packaged-goods marketers with those selling industrial coatings and others providing mobile dog grooming.

If you look closely, the questions are often constructed in subtly leading ways, and the inferences drawn from the results conveniently ignore the margins of error, which frequently wash away any actual findings whatsoever. There is also a strong tendency to draw year-over-year conclusions when the only thing the two surveys have in common is the sponsor.
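
To see how easily a margin of error can swallow a headline, consider the “up 4% from last year” claim above. Here is a minimal back-of-the-envelope sketch in Python, assuming a hypothetical simple random sample of 200 respondents (the writeups rarely tell you the real number):

import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion from a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical inputs: the 38% figure and the 4-point change come from the
# headline above; the sample size of 200 is an assumption for illustration.
p, n = 0.38, 200
moe = margin_of_error(p, n)
print(f"Single-year estimate: 38% +/- {moe:.1%}")  # roughly +/- 6.7 points

# Comparing two independent samples of the same size roughly multiplies
# the margin on the year-over-year difference by sqrt(2).
moe_diff = moe * math.sqrt(2)
print(f"Year-over-year margin: +/- {moe_diff:.1%}")  # roughly +/- 9.5 points
print(f"Is a 4-point change distinguishable from noise? {0.04 > moe_diff}")  # False

In other words, unless each wave sampled well over a thousand people, that triumphant 4-point gain is statistically indistinguishable from no change at all.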

As marketers, we do ourselves a great disservice whenever we grab one of these survey nuggets and embed it into a PowerPoint presentation to “prove” something to management. If we’re not trustworthy when it comes to vetting the quality of the research we cite, how can we reasonably expect others to accept our judgment on subjective matters?

So the next time you’re tempted to grab some headlines from a “survey” – even one done by a reputable organization – stop for a minute and read the fine print. Check to see if the conclusions being drawn are reasonable given the sample, the questions, and the margins of error. When in doubt, throw it out.

If we want marketing to be taken seriously as a discipline within the company, we can’t afford to let the “marketers” play on our need for convenience and simplicity when reporting “research” findings. Our credibility is at stake.

And by the way, please feel free to comment and post examples of recent “research” you’ve found curious.

Friday, December 19, 2008

Blogging On (or is it "Blogging In"?)

OK. I'm back.

I actually got quite a few requests to resume this blog, even though there were very few comments posted during the year I ran it originally. Plus, it seems to do REALLY well in Google organic search results.

So what have I learned?

1. Blogging on a subject like marketing measurement is less about the number of engaged readers than about the quality of engagement of a few.

2. Blogging is one part of building a well-rounded web marketing presence. No single piece of the puzzle puts you over the top in search results; it takes constant experimentation. Having dropped the blog for a while, I can tell you we saw a clear drop in our organic search performance.

3. Social media is so immature at this point that we're experimenting with many platform components, from Twitter (follow me as "measureman") to Feedster to several dozen other elements. The cost of experimentation is high, and I used to think we weren't making sufficient progress toward any real insight. Then I had a bit of an epiphany: the experimentation process really IS the marketing process. Experimentation isn't just what we do to get to a marketing plan. The marketing plan is a summary of how we're experimenting with various methods, tools, and messages to get the desired results.

If you're interested in how we're measuring our own results here at MarketingNPV, shoot me an email and we can talk about the specific metrics.


Wednesday, September 19, 2007

Knowing Is Believing

Now that 2008 budget season is upon us, it’s time to identify knowledge gaps in the assumptions underlying your marketing plan – and to lay out (and fund) a strategy for filling them.

We recently published a piece in MarketingNPV Journal that tackles this issue. In “Searching for Better Planning Assumptions? Start with the Unknowns,” we suggested:

A marketing team’s ability to plan effectively is a function of the knowns and the unknowns of the expected impact of each element of the marketing mix. Too often, unfortunately, the unknowns outweigh the hard facts. Codified knowledge is frequently limited to how much money lies in the budget and how marketing has allocated those dollars in the past. Far less is known (or shared) about the return received for every dollar invested. As a result, marketers are left to fill the gaps with a mix of assumptions, conventional wisdom, and the occasional wild guess – not exactly a combination that fills a CMO with confidence when asked to recommend and defend next year’s proposed budget to the executive team.


Based on our experience and that of some of our CMO clients, we offer a framework to help CMOs get their arms around what they know, what they think they know, and what they need to know about their marketing investments. The three steps are:

1. Audit your knowledge. The starting point for a budget plan comes in the form of a question: What do we need to know? The key is to identify the knowledge gaps that, once filled, can lessen the uncertainty around the unknown elements, which will give you more confidence to make game-changing decisions.

2. Prioritize the gaps. For each gap or unanswered question, it’s important to ask how a particular piece of information would change the decision process. It might cause you, for example, to completely rethink the scope of a new program, which could have a material impact on marketing performance.

3. Get creative with your testing methods. Marketers have many methods at their disposal for filling the gaps; some are commonly used, others underutilized. The key is determining the most cost-effective methods – from secondary research to experimental design techniques – to gather the most relevant information. (A quick test-sizing sketch follows below.)
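
On the experimental-design point, cost-effectiveness often comes down to how big a test must be before it can detect the effect you care about. Here is a minimal sketch of the standard two-proportion sample-size arithmetic, using hypothetical numbers (a 2% baseline response rate and a half-point lift worth detecting):

import math

def cells_needed(p_base: float, lift: float, z_a: float = 1.96, z_b: float = 0.84) -> int:
    """Approximate sample size per cell to detect an absolute lift in response
    rate at 95% confidence and 80% power (two-proportion z-test)."""
    p_test = p_base + lift
    variance = p_base * (1 - p_base) + p_test * (1 - p_test)
    return math.ceil((z_a + z_b) ** 2 * variance / lift ** 2)

# Hypothetical: a control cell responding at 2%, and a new offer we hope
# lifts response by half a point.
print(cells_needed(0.02, 0.005))  # roughly 13,800 names per cell

If 13,800 names per cell is more than the budget allows, that by itself is useful knowledge: it tells you to test for a bigger swing, use a cheaper proxy metric, or fall back on secondary research before funding the experiment.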

Don’t let the unknowns persist another year. Find ways to identify them, prioritize them, and fund some exploratory work so you’re legitimately smarter when the next planning season rolls around.