I’ll start this piece on surveys with one of my own – do surveys still work?
Ben Wright, Business Editor of the Telegraph, said in London last week that “most surveys with predictable responses by predictable types of people won’t cut it. We like questions asked in unusual ways, for example.” Andrew Hill, the FT’s management editor, told me that his team likes to see raw data, as this gives them the chance to write their own story.
Reporters have become more knowledgeable and sceptical about inadequate sample sizes and dodgy methodology. They see many, even most, surveys from businesses as transparently self-serving.
Moreover, a media industry under pressure is disintermediating classical “survey producers” by doing its own surveys. The FT’s monthly City Network gives insight into business leaders that virtually any individual business – or market research firm working for them – would find hard to match. Bloomberg, Reuters and others have an impressive survey output, and trade publications try to exploit their niche connections to stay in the game.
Big data analytics and the “nowcasting” it enables have made the survey that takes months to produce by interviewing, say, CFOs, look old fashioned. Social networks – LinkedIn especially – are now being used as data sources for compelling storylines, whether analysing WPP executives moving to the FAANGs, or London bankers moving to Frankfurt.
Trade associations, pressure groups and think tanks, eager to demonstrate value, now provide huge competition to single-firm surveys. Their access to senior audiences can be better, and they can have a credibility with journalists that few individual firms match.
Fewer column centimetres
I have no firm figures on what percentage of surveys designed for the media get meaningful coverage, but my hunch is that this is falling. I examined several major outlets’ recent coverage – concentrating on surveys by a single commercial company and disregarding public sector and quasi-official surveys, such as IHS Markit purchasing manager surveys.
The Economist, unsurprisingly, has only a couple of mentions for commercial surveys since the start of the year (though arguably providing cogent analysis can help you get cited). In the same time period, the New York Times hardly mentions any either, other than a couple on recruitment topics. The FT has more, though not as many as it once did. In April, it had only five or six stories based on commercial one-company “PR surveys”. Examples included BAML’s well established investor survey, one on diversity from consultants BoardIQ, and one on supply chains from Deloitte. It’s more like one a week than one a day. Depending on how high you set the credibility bar, that is perhaps one in ten, or more likely one in a hundred, of the surveys that the FT receives.
Where do the main types of survey stand today?
Myriad types of survey exist, but I’d pick out four main buckets:
The first is the fact-based survey – counting up and presenting facts that already exist, indeed are often publicly available. One of the big four accountancy firms used to add up the number of IPOs in the UK every quarter. While such surveys often gain from being fact-based, the stakes have changed. More of this type of information can be counted up faster online, often in near real-time, by media outlets and by specialist business information providers. It is hard for a company to keep up. Hijacking official data to secure media comment is often a more cost-effective approach now.
The second I would call the business model/market sizing survey. The strategy consultancies are especially good at this. Based on some quantitative data (generally not revealed), they tell us that the widgets market has revenues of X and can be broken down into eight different segments they estimate at X, Y and Z. The most used symbol in such “market sizing” is ~. Combined with an interesting take on market dynamics, such content can be attractive, especially if coming from a recognised source. But most firms that are not management consultancies lack the chutzpah and resources to do this well. A related fad is the chart showing 100 fintech logos mapped by sub-sector. Without some numbers about market size, I see little evidence that these fly with major media.
The third category is the Index. It’s either based on some facts – how many slides CFOs use in investor presentations, say – or some opinion research, or a mix of both. Done well, it can be powerful. Most of the time, it further obscures weak underlying data, and can even hide away something interesting. An index needs to be done regularly to show change. Most commercial index providers in financial markets know just how hard it is to make an index famous.
Fourth, and most common, is the opinion survey. In the B2B world, this generally means the opinions of influential, often hard-to-reach, business people. This category is arguably the most challenged by the trends discussed above, with big data allowing correlation (not the same as causation) to be more compelling than opinion sampling.
What to do?
- Consider your motivation. If major media coverage is the main objective, assess carefully whether it is likely to work. Question your PR advisors, whether in-house or agency, and discount further any success probabilities they suggest. Look and see if there are similar surveys getting media traction – this can both show that there is appetite, and that there is competition. Ask some relevant journalists what they think of your survey idea but remember, they’re not the editor, and they don’t work for you.
- If you accept that your survey won’t always get media attention, it can still be worth undertaking as marketing content for your audiences, supporting your reputation and sales. Just remember, in a time-crowded world, if the relevant media don’t find your survey interesting, it’s quite possible your target audiences won’t either. Measurement here is vital.
- The old adage about “write the headline” and then work out the survey questions to fit can be found on the websites of several Cognito competitors. It has some validity, but is simplistic. Surveys get covered when they are interesting/controversial, credible/important and topical. Most companies can’t undertake surveys at the scale or speed of a political campaign, so need a balanced assessment of how interesting a survey is going to be in several months for a variety of audiences. Long-term commitment – including spokespeople who can refer to a survey over a prolonged period – is essential. Beware of silver bullets.
- Make sure there are some cross-tabs. Don’t only tell me what 100 CEOs think, but how 30 US CEOs differ from 30 in Europe. Such comparisons are often the best differentiator in securing media coverage, and in making survey materials interesting for marketing. Cross-tabs suffer from lower sample sizes and consequently larger margins of error, so thinking about cross-tab construction during planning is vital.
- Specialist market research firms can give a survey some more credibility in terms of the quality of the sample reached, and are often an efficient outsourcing of the research. Remember, they are not communicators. A business survey undertaken by a credible market research firm (as opposed to a political opinion poll) does not increase its attractiveness to media by very much – there are simply too many surveys around.
- Undertaking research directly with your clients is certainly cheaper – you ask their views at an event, or you send them a SurveyMonkey questionnaire. There are methodological issues here by definition, but such results can be reasonably credible, especially if you avoid any transparently leading questions.
- Can you use your own data, possibly at a meta-data level that does not identify your clients? It’s a growing trend, especially if you have credible financial market data. If you are telling a business story based on things you say your clients do, its appeal to media depends on your authority in the field and your market share: a 30% market share is very different from 3% as an indication of a market.
- “Alternative data” is hot, from weather feeds to drones counting cars in parking lots. Anything that you have that looks like interesting alternative data may be attractive. If you’re making money out of it, it’s probably interesting to others.
- Finally – step out yourself and look in. Try to imagine whether you would find your survey and its data credible and interesting.
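The cross-tabs point above can be made concrete with a little arithmetic. A minimal sketch of why a 30-person sub-sample carries a much wider margin of error than the full sample of 100, using the standard normal-approximation formula for a proportion – note the 60% agreement figure and the sample sizes are illustrative assumptions, not findings from any real survey:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate margin of error for a proportion p observed in a
    simple random sample of size n, at 95% confidence (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical headline finding: 60% of 100 CEOs agree...
full_sample = margin_of_error(0.6, 100)
# ...but the US cross-tab rests on only 30 respondents.
us_crosstab = margin_of_error(0.6, 30)

print(f"n=100: ±{full_sample:.1%}, n=30: ±{us_crosstab:.1%}")
# prints: n=100: ±9.6%, n=30: ±17.5%
```

A ±17.5% margin on the cross-tab means a reported 10-point US/Europe gap may be pure noise – which is exactly why cross-tab construction belongs in the planning stage, not the write-up.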