Wilson Research Group Blog

Market Intelligence at Market Speed

We need to “comprehend, shape, adapt to, and in turn be shaped by an unfolding, evolving reality that is uncertain, ever-changing and unpredictable.”
John Boyd, creator of the “OODA Loop.”

Have you ever seen a market research report and thought, "Wow, that is interesting. I only wish I had known that six months ago, when we made the decision to develop our product line"?

In this time of revolutionary changes in market after market, business intelligence needs to change. It needs to speed up. Adaptive business intelligence systems need to be built and turned on constantly. It is not good enough to take the pulse of your customers every year or even every six months. Doing it now and doing it constantly is the only way to stay ahead of competitors.

If you wonder how to do this, there are new approaches and new tools that can keep your decision teams constantly informed of recent market movements and competitive changes. We can suggest a few. Adopting this new type of intelligence system will separate those who can adapt to change from those who are mired in old paradigms, where research is more of a self-assuring prop than a dynamic part of the process.

If you think of market research as stuffy, eye-glazing, hard to use, even harder to understand, put-it-on-the-shelf kind of stuff, you are surely not using it. And if you are not using it, you are not bringing relevant and current market data into your everyday product marketing decisions.

For more on speedier data-driven decisions see our blogs on:
1. The Market Intelligence OODA Loop: A Dynamic Business Model
2. DDD vs. HIPPOs

The Market Intelligence OODA Loop: A Dynamic Business Model

Our whole approach — market research at market speed — centers on putting our clients in front of the data they need for everyday decisions and actions. Speeding up the cycle of information means executing at a faster pace than competitors and paying even more attention to proven research processes. The data must be accurate in the first place.

This idea is summed up in a concept known as the OODA Loop (http://en.wikipedia.org/wiki/OODA_Loop), developed for winning wars on the battlefield. It was a factor in winning the first Gulf War, and it has since been used in business management courses to shape strategy for winning marketplace skirmishes. Such a model has to be used with caution, but its main ideas are simple and relevant.

The OODA Loop is shorthand for a series of short steps that many businesses and competitive teams take naturally but don't have a name for.

Observation (Means observing unfolding events and gathering research and intelligence, including actively collecting intelligence from the battlefield or market where the action is taking place, the so-called frontlines; in business, the place where the customer decides which product or service to buy, the moment of customer action.)

Orientation (Means synthesizing, analyzing, and empathizing with the overall traditions, backgrounds, settings, previous experiences, successes, rejections, and observations, combining them into a whole picture of the environment in which the action is taking place.)

Decision (Means arriving at a plan of action based on all known intelligence, observation, and orientation.)

Action (Means executing the plan of action and feeding positive and negative results back into the O-O steps above.)
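The four steps above can be sketched as a simple feedback loop. This is a hypothetical illustration of the cycle, not anything from Boyd's own work; all the function names and the example "signal" are placeholders.

```python
# Minimal sketch of the OODA cycle as a feedback loop over market data.
# The helper functions are hypothetical stand-ins for real intelligence work.

def observe(environment):
    # Observation: gather raw signals from where the customer decides.
    return environment["signals"]

def orient(observations, history):
    # Orientation: synthesize new observations with prior experience.
    return {"signals": observations, "past_results": list(history)}

def decide(picture):
    # Decision: arrive at a plan of action from the oriented picture.
    return "lower_price" if "price_complaints" in picture["signals"] else "hold_course"

def act(plan):
    # Action: execute the plan; the outcome feeds back into the next loop.
    return {"plan": plan, "outcome": "pending"}

def ooda_cycle(environment, history):
    picture = orient(observe(environment), history)
    result = act(decide(picture))
    history.append(result)  # feedback into future Observe/Orient passes
    return result
```

The point of the sketch is the last two lines: each pass appends its results to the history that the next Orientation step consumes, which is what makes the loop adaptive rather than a one-shot analysis.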

Those who execute the OODA Loop faster and more efficiently than their competitors are more responsive to current market needs and market forces, and will win the battle for the hearts and minds of their audiences.

Without the OODA Loop…

“we will find it impossible to comprehend, shape, adapt to, and in turn be shaped by an unfolding, evolving reality that is uncertain, ever-changing and unpredictable.”
John Boyd, author of the OODA Loop.

Pilots in air combat are an extreme case in point: they are trained to execute the OODA Loop faster than their counterparts. Here the importance of the Observation step comes into sharp focus: the data fed to the pilot must be accurate, must be easy to assemble and make sense of, and must be current.

Colonel John Boyd, who developed the concept of the OODA Loop, placed the highest importance on Step 2, Orientation, the place of synthesis and analysis of the total environment. It is what separates what informs your decisions from what informs your competitor's.

Not unlike the pilot, the product marketing manager or entrepreneur must have a similarly systematic way to arrive at valid market decisions that will outperform her market counterparts. Although the product marketing manager is not in a life-threatening position like a pilot, her product line is. Her product line will continue to perform well or will wither away based on her ability to execute her market intelligence loop in a timely manner.

Bottom Line: The OODA Loop is a simple, easy-to-remember model of a common-sense idea. Accurate, up-to-the-minute, easy-to-access market intelligence should be in the arsenal of every business manager who wants to make customer-sensitive decisions that ensure the survival and success of their product lines and services. As any combat pilot will tell you, flying by the seat of your pants is not an option.


1. Wikipedia background on the OODA Loop by John Boyd (http://en.wikipedia.org/wiki/OODA_Loop)
2. http://www.chetrichards.com/modern_business_strategy/boyd/essence/eowl_frameset.htm


Is your company’s direction based on DDD or HIPPOs? You are likely losing ground if it is not DDD-based.

Translation: MIT's Sloan School has researched and reported that "data-driven decision-making" (DDD) is more effective than the "highest-paid person's opinion" (HIPPO). Finally, we have objective evidence that DDD works: it increases profits and sales, and it provides better guidance for investments in new product lines and markets than HIPPOs do. Not that we didn't already know this from Deming's pioneering work years ago, and from what market researchers basically do.

Of course, as researchers, we have been saying this for years. Our ExecStats tool makes DDD not only possible but slick, easy, and interactive for the executives running the show. Putting current audience data into your front-line thinking on a regular or daily basis DOES indeed make a difference to your bottom line.

It does not mean that you can simply let the data decide. Intuition and gut are still factors, but for the most favorable impact, intuition comes into play AFTER you have the data in front of you.

And finally, consider the cost of DDD vs. HIPPO: DDD is far less expensive. It seems to be a no-brainer, yet companies still run with the HIPPOs.

For more depth on this DDD research, see the report "Data Two, Gut One" in E-Week:


“Sweet Spot” Pricing Research: Too Much, Reasonable, Bargain and Steal.

Do you need to know the “best” price to charge for a product or service? Here is a practical way to get after this issue without spending a mint.

It is simplistic and misleading to list a series of attributes and prices and ask for a rating on a 1-to-7 scale of which are most important. All items will likely skew toward the higher ratings, because the person doing the rating does not have to trade anything off; she loses nothing by rating every attribute high.

On the other hand, going to advanced "conjoint analysis" or "max diff" analysis, where trade-offs among various product attribute mixes are analyzed ad infinitum to reach a price point, has its shortcomings as well. It is a costly research maneuver requiring specialists and research expertise beyond most clients. It can suck up all the time and energy of a survey, crowding out other important questions for this audience, and it can still be misleading.

More reasonably, if you can explain a product fairly simply through its main attributes, you can ask a question that tries NOT to find out which attributes are most desired, but at what package price people "tune out", "tune in", are "attracted", or are "smitten". These are human dimensions that we all experience when we shop. These reactions send (or suppress sending) dopamine across the synapses. (See our "Brainy Decisions" blog.)

Buyers almost always rank "cost" among the most important considerations in a purchase. But based on our 20 years of research experience, they almost never consider cost the "most" important consideration. In our experience it usually ranks third or fourth, after attributes like "quality", "performance", "style", "durability", "reliability", and "brand".

Respondents in real buying situations can seriously consider only four or five key attributes of any product or service. After that, differences among attributes start to blend together or fade completely. So one way to keep pricing questions real, and to keep the respondent from low-balling them, is to ask the following type of questions.

1. For (product A), at what price would you consider this product…
a. To be too much or “overpriced” and would definitely not purchase this product $_____
b. To be a “reasonable” price and would likely consider purchasing this product $_____
c. To be a “bargain” price and would definitely consider purchasing this product $_____
d. To be a no-brainer “steal” at a price that is almost too good to be true $_____

2. For (product B), at what price would you consider this product…
a. To be too much or “overpriced” and would definitely not purchase this product $_____
b. To be a “reasonable” price and would likely consider purchasing this product $_____
c. To be a “bargain” price and would definitely consider purchasing this product $_____
d. To be a no-brainer “steal” at a price that is almost too good to be true $_____

This style of questioning, for two or more combinations of attributes, can provide a very accurate picture of what the audience values. It also fits the respondent's way of thinking about pricing: most people apply these categories (overpriced, reasonably priced, bargain, steal) in almost any buying situation where they have done even minimal shopping.
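Responses to questions like these can be summarized in a few lines of code. The sketch below is one plausible aggregation (medians per price point, with the "sweet spot" taken as the midpoint of the typical reasonable and bargain prices); it is an illustrative assumption, not the analysis the article prescribes.

```python
# Sketch: summarize "too much / reasonable / bargain / steal" price answers.
# The aggregation rule (medians, midpoint sweet spot) is an assumption,
# not a prescribed method.
from statistics import median

def price_summary(responses):
    # responses: list of dicts with keys "too_much", "reasonable", "bargain", "steal"
    summary = {key: median(r[key] for r in responses)
               for key in ("too_much", "reasonable", "bargain", "steal")}
    # One plausible "sweet spot": midway between the typical reasonable
    # price and the typical bargain price.
    summary["sweet_spot"] = (summary["reasonable"] + summary["bargain"]) / 2
    return summary

# Hypothetical answers from three respondents for one product:
responses = [
    {"too_much": 120, "reasonable": 80, "bargain": 60, "steal": 40},
    {"too_much": 100, "reasonable": 70, "bargain": 50, "steal": 30},
    {"too_much": 140, "reasonable": 90, "bargain": 65, "steal": 45},
]
```

With these made-up numbers, the median "reasonable" price is $80 and the median "bargain" price is $60, suggesting a sweet spot around $70, comfortably below the median "overpriced" threshold of $120.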

Bottom Line:
In designing questions to understand the "sweet spot" in a market (not too much and not too little), we first consider how most people think when they purchase products or services. Although "price" is usually not the foremost attribute people consider ("quality", "performance", "reliability", "durability", "style" and "brand" are often rated higher), when they do get down to price (almost always ranked among the top four attributes), buyers are turned on or off by their own internal measuring system of "too much, about right, bargain or steal."

(See also our blog “Brainy Decisions”, our web page on Startups where we have actually used these techniques, and our page on the Data Robotics Drobo startup.)

For more information on pricing, tell us about your project, or email Larry@wilsonresearch.com for a complimentary discussion of your pricing research needs.

Brainy Decisions

I just finished a book I highly recommend, Jonah Lehrer's "How We Decide", something that market researchers, among others, should look into. There is good stuff in there on focus groups; on how distressful situations demand prefrontal-lobe concentration; on how emotions can sometimes interfere with and other times aid decision-making; on how complex decisions that might overload the prefrontal lobes are aided by "time outs" that give the emotional brain time to "catch up" and provide support; and on how a world-class gambler hones his intuitions in making betting decisions. So when a tough decision has you frozen in decide-a-phobia, take a shower, or a nap, or a walk, and give it a rest, that is, if you are not in a jet plane at 36,000 feet with no way to steer.

Sully glided his jetliner gently onto the Hudson River with great help from the prefrontal lobes taking over, a miracle in itself. Lehrer describes how, years ago, a DC-10 jumbo jet flying from Denver to Chicago suffered a broken disc in its tail engine, severing all three redundant hydraulic systems and leaving the pilots with no means of steering the plane. Yet somehow they found a unique decision-making path through this unprecedented situation (nothing in the pilot's manual covered it, and no expert pilots could help by phone), and although 126 lives were unfortunately lost, 186 were saved that would almost certainly have perished were it not for the pilots' decisions. That pilot was on his own, as was Sully, and his decisions led to new training to help pilots cope with this harrowing situation.

It turns out that focus group research for shows like Seinfeld, Hill Street Blues, and The Mary Tyler Moore Show all indicated flops in the making. How did this research go wrong? You have to understand what it is that is happening. Remember Bob Dylan's "Ballad of a Thin Man"? He was talking about researchers (Mr. Jones) when he sang, "You walk into a room and you know there's something happening, but you don't know what it is, DO YOU, Mr. Jo-o-o-ones?" You really have to understand what exactly you are asking, and of whom, and whether you are missing the point by asking someone to explain why they like the taste of this better than that. When you start focusing on one trait or another, you can skew or muddle the overall experience, the big picture, the most important picture.

Nuf said. Related to this decision-making is research on the brain that is now illuminating all kinds of interesting things. Go to www.CharlieRose.com and watch the first four episodes (of what will be twelve hour-long segments when completed) of his Brain Series with Eric Kandel, Nobel laureate for neurological work on memory. (Lehrer was a student in Kandel's lab.) This is truly a great use of the airwaves for educational advancement in understanding the neurological workings of our moods, social behavior, empathy toward others, visual fields, memory, facial recognition, and much more.

Here is a blog by Jonah Lehrer that I also highly recommend. Personal disclosure: none! I have no relation to Jonah whatsoever. He just does good stuff worth reading and thinking about. http://scienceblogs.com/cortex/2009/12/free_will_and_ethics_1.php#more

Americans Are Giving, Research Shows

As of January 18, 2010, nearly two-thirds of U.S. adults (64 percent), and an even higher percentage of African-Americans (81 percent), had given or planned to give to relief efforts following the earthquake in Haiti, according to a survey conducted by Zogby Interactive, a Utica, N.Y., research company.

Interesting aside: In "How We Decide" (see next blog), Jonah Lehrer shows through the "ultimatum game" that decisions researchers might expect to be "selfish" are actually based on a kind of moral sense of fairness. Our brain synapses get boosts of dopamine when we receive rewards or gifts, but we get almost as much of a boost when we give to charities and participate in altruistic causes. So we have, to some degree, a "hardwired" predisposition to give aid to others.

Don't let this take away from your own personal decision and free will to give. It is just that biology and neurology help us be what we really want to be anyway. Two of my year-in, year-out favorite charities, always hard at work, are www.oxfam.org and www.doctorswithoutborders.org, but I am sure you have your own.

Communicating With Designers

As a former product marketing manager (acquisitions editor) in the publishing world, I have worked with many a designer on book jackets, book design, and artwork. I have always wondered why it was so difficult to get the designer to see what I wanted in a design. Of course, I was usually trying to convey functional ideas about what the design should do, and designers always seemed to speak a different kind of language. Eventually, through many hands-on experiences, I learned a whole lot from the designers I came into contact with. I am not sure they learned anything from me.

The following link is to a post by a designer who appreciates that designers (himself included) have some quirky ways. Getting the communication channels uncluttered and working between quirky designers and geeky developers is an interesting challenge, and there are some good suggestions and thoughts in this blog. I hope they smooth the waters between two worlds that sometimes seem like two unbridgeable brane universes, and two types of brains.


Sample Size Calculator

We have a new sample size calculator up on the site. It is an Excel spreadsheet that you can download and use as you wish. You can find it at http://www.wilsonresearch.com/main/market_research_company_articles.php
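For reference, the textbook sample size formula for estimating a proportion can be expressed in a few lines of code. This is a sketch of the standard formula with a finite population correction; we have not verified that it matches the spreadsheet cell for cell.

```python
# Standard sample size formula for estimating a proportion, with a
# finite population correction. Defaults: 95% confidence (z = 1.96),
# +/-5% margin of error, and the conservative p = 0.5.
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    # n0: required sample for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    # finite population correction shrinks n for small populations
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)
```

At the defaults, a population of 100,000 needs about 383 completed responses, while a population of only 500 needs about 218, which is why the correction matters for small customer bases.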

New Article

I got my latest article published on EzineArticles. Here it is if you are interested. Here is the original link: http://ezinearticles.com/?id=2552698

Top 10 Mistakes in Conducting Online Market Research

1. Not knowing what you don’t know

It's easy to do online surveys these days. Too easy. It may be so cheap and easy that you do it without understanding the basics and end up with misleading answers that send your business down the wrong path. That is worse than never doing any research at all. Spend a little time and get to know what you don't know about market research. A basic review of the following topics is a great start.

  • Sampling and sampling error
  • Quantitative vs. qualitative research
  • Question bias / question design
  • Response rates / confidence levels
  • Questionnaire coding
  • Why people take surveys (social contract)

Some great books on these subjects are:
"Mail and Internet Surveys: The Tailored Design Method" by Don A. Dillman
"Asking Questions: A Definitive Guide to Questionnaire Design" by Norman Bradburn, Seymour Sudman, and Brian Wansink

2. Not eliminating sampling errors

Now that you know what sampling error is, you can understand why it is critical to meaningful market research. Many of the online surveys you see today are full of potential sampling errors; don't be one of them. Take the time to develop a good sample, and then make sure you get as many of those people as possible to your survey. This is probably the biggest difference between professional market research and the do-it-yourselfers: the pros take the time and money to develop good samples and then make sure they get good response rates. You can too, if you put in the effort.

  • Always use a true random sample
  • Track your respondents (PINs)
  • Program the survey to eliminate duplicates and respondents with bad intentions
  • Check the data for oddities (clean out illegitimate records)
  • Use incentives (they do not have to be monetary; see social contract)
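The tracking and cleaning steps above can be sketched in a few lines. The field names ("pin", "seconds") and the 30-second "speeder" threshold are illustrative assumptions, not a standard.

```python
# Sketch: deduplicate respondents by PIN and drop suspicious records.
# Field names and the speeder threshold are hypothetical choices.

def clean_records(records, min_seconds=30):
    seen = set()
    cleaned = []
    for rec in records:
        if rec["pin"] in seen:
            continue                      # duplicate submission, keep the first
        if rec["seconds"] < min_seconds:
            continue                      # "speeder": finished too fast to be a real read
        seen.add(rec["pin"])
        cleaned.append(rec)
    return cleaned

# Hypothetical raw survey records:
records = [
    {"pin": "A1", "seconds": 240},
    {"pin": "A1", "seconds": 250},   # duplicate PIN
    {"pin": "B2", "seconds": 12},    # suspiciously fast
    {"pin": "C3", "seconds": 180},
]
```

Run on the sample records, only the first "A1" and "C3" survive, which is the point: the duplicate and the speeder never reach your analysis.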

3. Making decisions with inaccurate information

If you never understood any of #1 and #2, it is a good bet your survey is useless. Worse, you may think it is telling you what to do with your important business decisions. Making decisions with inaccurate information is worse than taking a guess.

4. Writing bad questionnaires

You might get everything else right and then go and write a bad questionnaire. Lots of online surveys have at least one bad question. What is a bad question? It’s any of the following:

  • Biased questions
  • Unanswerable questions (impossible to know the answer)
  • Questions with two meanings
  • Hard-to-understand questions (way too long, strange use of words)
  • Dumb questions (asking about something the researcher should already know, or has already asked)

5. Programming a hard to take survey

After you have spent all that time creating a good sample and writing good questions, don't ruin it by programming a hard-to-take survey. One of my top gripes is forcing respondents to complete every answer. Too much of this will get you either a contrived answer or a respondent who leaves. Neither is good.

  • Don’t force non-critical questions
  • Don’t have non-standard buttons
  • Don't use non-standard technologies (Java applets, etc.)

6. Going cheap

The good and bad thing about online market research is that it can be much less expensive than in the past. The bad part is that it is just too easy to conduct flawed research. Many of the items above cost time and money (sampling, questionnaire design, etc.). Spend the time and money to do it right. Even better, hire a quality market research firm like Wilson Research Group to do it for you. Either way, you will save money in the long run by conducting quality market research.

7. Confusing social networking with quantitative market research

Talking with lots of people (social networking) might gain you valuable qualitative information, but it is not quantitative market research. The difference: qualitative information rarely represents your whole audience; it gives you individual opinions and ideas. Quantitative research, on the other hand, is designed to represent your whole audience and gives you answers you can know reflect all of your customers. Don't confuse the two. Social networking can be useful, but understand its limitations.

8. Being overly “cute” with the survey tool

Your market research is supposed to gather meaningful information about your target audience, not impress them with all the high technology you can muster. Keep your survey technology as simple as possible to avoid excluding respondents who are not up to speed with the latest and greatest.

  • Keep Flash and JavaScript to a minimum (use them, but not in critical areas, and always provide alternatives)
  • Use tried and true web technologies

9. Relying on only one source of information

Market research is a snapshot of opinions at a certain time. If your research produces wildly different answers than you were anticipating, it is wise to confirm the conclusions with more data.

  • Conduct another survey
  • Look for corroborating data

10. Ignoring your market research

If you go to all the trouble to conduct a good study, have a plan to do something with the information. Too many organizations conduct market research for one reason or another and then just sit on the results. Don't be the one who ends up saying, "Wow, if we had just done what our market research told us, we wouldn't be in this bad position." Before you conduct any online research, have a plan for what you will do with it.