Kevin J Clancy - Marketing Consultant

Focus Groups May Kill Your Brand

The most popular form of death wish research, widely used by the Dr. Kevorkians in marketing management and marketing research, is the ubiquitous focus group interview.

In a focus group, a handful of people under a moderator’s direction focus their discussion on a certain topic.  It could be a new product, an ad campaign, how people use a product, what caused them to buy it, or what they think about it.

In the mid-1960s, focus groups spread like a contagious disease from advertising agencies and package goods companies into financial services, hard goods, and industrial applications. Ad agencies particularly liked them because they could be videotaped and later edited for use in client and new-business presentations.  By the late 1980s they had become the most widely used type of market research, says Thomas L. Greenbaum, an executive vice president of Clarion Marketing Communications and author of a handbook on groups.

Focus groups appeal to marketers because they are easy, cheap, and sometimes conducted in interesting places.  They can represent a junket for the marketers and researchers—a surprising number are held on the outskirts of Las Vegas and near Disney World. 

Users of groups are probably not even aware of the published academic research finding that a focus group yields fewer and lower-quality ideas for ad campaigns, new products, and the like than individual interviews with the same number of people. In other words, interviewing 8 people separately will generally produce more new ideas and better ideas than interviewing 8 people together in a focus group.

Too many marketers and researchers believe the myth that qualitative research tools, particularly focus group interviews, are serious, helpful marketing research tools.  Many actually believe that the information produced by focus group research is as accurate and useful as the results of survey research at less than half the cost.  I’ll always remember a breakfast meeting that the late Robert Shulman and I had with Dan Quayle weeks after he and George Bush Sr. lost the presidential race to Clinton and Gore.  Quayle reported that the Bush team was so committed to focus groups that they were running them continuously throughout the campaign. And every time a group would seem to suggest that a change in strategy was called for, they would change their strategy.  Quayle lamented that he believed that this lack of strategic consistency was the reason they lost the election.

This discovery hasn't stopped other politicians from misusing focus groups. More groups were conducted during the last presidential campaign by both the Obama and the McCain camps than at any other time in American history.

Unfortunately, focus groups are no substitute for other, more expensive and more reliable research techniques.  Although people use them for this purpose, focus groups cannot tell a marketer how prospects will really react to a new product, a new positioning, or a new advertising campaign.

Staying with the same context, consider presidential election surveys.  Why don’t research companies use focus groups to forecast election results?  Both political parties choose candidates who try to appeal to more than 50 percent of the electorate because a candidate has to have a majority to win.  To obtain an acceptable level of sampling precision, researchers typically poll 600 to 1,200 people nationwide, so they can say that the study shows Candidate Obama preferred by 52 percent of the voters, plus or minus three or four percentage points.  So why can’t we obtain the same results with focus groups?
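The "three or four percentage points" figure follows from the standard margin-of-error formula for an estimated proportion, z·sqrt(p(1−p)/n). A minimal sketch, assuming the conventional 95 percent confidence level (z = 1.96) and the worst-case proportion p = 0.5:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p estimated from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# The 600-to-1,200-respondent range quoted for national polls:
print(f"n=600:  +/- {margin_of_error(600):.1%}")   # about +/- 4 points
print(f"n=1200: +/- {margin_of_error(1200):.1%}")  # about +/- 2.8 points
```

The same formula applied to a typical focus group program of 40 people gives a margin of error of roughly 15 points, which is why no pollster would forecast an election that way.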

1)  Volatile results.  A typical focus group has eight to ten respondents.  A typical marketer does about four focus groups on a given topic to answer a question, so the total sample is thirty-two to forty people.  As Dan Quayle learned, this sample is too small to give stable results; do the research again and the results may be entirely different.

If a research company conducted four focus group sessions and everything else were held constant, it might conclude that product X, or positioning Y, or promotion Z is the most appealing.  If another research company replicated the research—did it exactly the same way—it is just as likely to conclude that product A, positioning B or promotion C is the most appealing.  This volatility can lead to disaster.
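The volatility can be quantified with an exact binomial calculation. Suppose, purely for illustration, that 55 percent of the population truly prefers option X. With a focus group program of 40 people, the sample will point to the wrong winner a substantial fraction of the time; with a 1,000-person survey it almost never will:

```python
from math import comb

def prob_wrong_winner(n, p=0.55):
    """Probability that fewer than half of n respondents pick the option
    truly preferred by a p-majority, so the sample names the wrong winner."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n // 2))

print(f"n=40:   {prob_wrong_winner(40):.0%} chance of a reversed result")
print(f"n=1000: {prob_wrong_winner(1000):.2%} chance of a reversed result")
```

With 40 respondents, roughly one replication in five reverses the conclusion, which is exactly the "do it again and get a different answer" problem described above.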

2) Not only are the results unstable, they are not representative of any segment of the population.  When a reputable research company does a poll, it makes sure that the sample includes people in proportion to their incidence in the population.  So if 20.7 percent of the U.S. adult population lives in the Northeast, approximately 20.7 percent of those sampled live in the Northeast; if 60 percent of the people are registered Democrats, 60 percent of the sample are registered Democrats.  Political researchers make sure a study is weighted geographically, by urban area, by gender, and sometimes by religion because all these factors can affect the study’s outcome.
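The weighting idea described above can be sketched as a simple post-stratification step: each respondent's weight is their group's population share divided by its sample share. The 20.7 percent Northeast figure is from the text; the other shares below are invented for illustration:

```python
# Post-stratification: weight = population share / sample share, so that
# over- and under-represented regions count in their correct proportions.
population_share = {"Northeast": 0.207, "Midwest": 0.209,
                    "South": 0.381, "West": 0.203}
sample_share = {"Northeast": 0.150, "Midwest": 0.250,
                "South": 0.350, "West": 0.250}

weights = {region: population_share[region] / sample_share[region]
           for region in population_share}

for region, w in weights.items():
    print(f"{region:10s} weight = {w:.2f}")
```

Under-sampled regions (here, the Northeast) get weights above 1 and over-sampled ones below 1; no such correction is possible for mall-intercept focus group recruits, whose selection probabilities are unknown.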

Focus groups are not representative of any population. Even if a company holds 100 focus groups so that it includes 1,000 people (an admittedly crazy idea), focus group researchers do not choose representative samples.  After all, many researchers today conduct focus groups in suburban malls, among people who have the time to participate in the research.  We’ve said, half seriously, that most focus groups are made up of semi-comatose people with time on their hands who are roaming through shopping malls searching for some excitement, which comes when an attractively dressed woman offers $25 and a sandwich to participate in a group.  These are not representative samples.

3) Dominant voices affect the group. Even if a company conducted many focus groups and designed the groups so they were representative of the population, the groups are so dominated by a few voices—sometimes just one—that what the research picks up is not many voices but relatively few, because a small number of voices in each group color the expressed beliefs of the other group members. This is one of the reasons why individual interviews generate more ideas and better ideas than group interviews.

Also, what people say in a group setting with strangers is not necessarily what they might say to an individual interviewer, whether the conversation is at home or in a mall or on the telephone.  Many people will not contribute their opinions and views in a group with one or several dominant voices, whether the topic is one they have never thought about before, one they are embarrassed to discuss, or one they regard as confidential.  So some people say too much, and others too little.

4)  The moderator’s abilities, interests, predilections, and predispositions color or temper the group’s response.  We recently observed some groups for a client that used three different male moderators.  They came to three different conclusions.  Perhaps the variety came about because the sample was small or was not representative or because one voice dominated the groups, but in fact one moderator was psychoanalytically oriented, one was consumer-behavior oriented, and the third was simply dumb.  Each came to conclusions based on his orientation, training, and talent.

5) Different observers often interpret the same group differently.  A number of years ago, a major packaged goods company with excess capacity in its potato chip division decided to develop and test fruit-flavored potato chips—cherry flavored chips, lime chips, lemon chips, etc.  The research director commissioned a large number of focus groups in which people first talked about the concept and then tried the flavored chips.  He attended some of the sessions and had to watch only two or three to conclude that the concept was an unequivocal flop.  Talking about the concept, people tended to grow green, and they became even more green after they tried the product.

The research director was dumbfounded when about three weeks later the associate research director’s memo summarizing the focus groups suggested that in fact a fruit-flavored potato chip was a wonderful idea that showed great promise.  She reported that consumers not only liked the concept, but, after tasting the product, said they would buy it. He called her into his office and said:

"This is just nutty.  I went to some of those focus groups, and people hated the concept.  I don’t know how you came to this conclusion, but I want you to transcribe all of the comments.  Then go through those transcripts yourself with a red and green felt-tip marker, and any time anyone said anything negative about the concept or the product, underline it in red, and when you find something positive, underline it in green.  Then add up the number of comments, and you will demonstrate that that concept isn’t worth the time we’re giving it."

A week or so later, the associate research director returned with the interesting news that the numbers of red and green comments were just about equal across the groups.

We suspect that the research director and his associate observed what they wanted to observe, heard what they wanted to hear, and, through a process of selective perception and forgetting, took away from the groups what they brought to them.   This is a trap everyone—not just researchers—should avoid.

In twenty years of observing focus groups, we have learned something significant from them fewer than a dozen times.  Usually the focus group reveals what has already been turned up by analyzing copy and product claims, generating ideas from copywriters, or discussing the situation with corporate management.  We cannot think of a single thing a focus group “discovered” that a client eventually adopted.

Focus groups can be helpful when a company wants to explore a topic, obtain some suggestions (or wild ideas), or provoke opinions.  They can be an interesting vehicle for picking up the language of consumer behavior—for learning how people talk about different things.  But a company cannot use them to reach any conclusions, to draw any real inferences, or to make any decisions.  And we find it frightening that so many companies today are doing just that.  As we’ve said elsewhere, focus groups are to serious research what bumper stickers are to existential philosophy.  Focus groups are a virulent form of death wish research, one that is frequently used to accidentally kill brands and marketing programs.

