From time to time I will be providing commentary on research practices that make my blood boil, especially when questionable findings are published by the press or presented in meetings without any regard to the methodology being used.
Back in my agency days I was given the nickname "B.S. Detector" (hence the title of this column) because of my penchant for calling out organizations and presenters who report findings that defy reality. Worse, when these wacky findings are published by the trade press as fact, they do even more damage and can have a disastrous effect on our business. My fellow media research pros and I are the ones who have to deal with the fallout, mop up the mess and set the record straight for our clients.
Several months ago, the trade press (you know who you are) published top-line findings from a study claiming that weekly time spent with TV and the Internet had reached parity at 13 hours each, a finding that is just plain ludicrous to anyone who knows anything about media usage patterns and trends.
This is not the first time this company whose name starts with the letter "F" has gone public with questionable information, nor the second or the third.
Here are two other instances:
1) Back in 2000, company "F" predicted that DVR penetration would climb to 40% of U.S. households by 2004. Many content companies and marketers altered their business plans based on this wildly off-the-mark forecast. By the way, as of April 2011, DVR penetration had finally approached the level that was supposed to occur seven years earlier.
2) During a recent ANA TV conference, an executive from company "F" claimed that advertisers have cut back on their investment in TV, which was blatantly false. He was called on the carpet by David Poltrack, who proceeded to tear him a new one and went so far as to call his statements "B$L$H*T!" Here is the link to the Ad Age article, which also features the audio clip of the incident: CBS Network Chief challenges Forrester
David is not the only media research executive who has expressed outrage at poor research practices. Steve Sternberg (author of The Sternberg Report) also had plenty to say regarding the peddlers of bad information. Please click on the link for his commentary regarding how the press reports TV research studies.
Not too long ago, Jack Wakshlag, Chief Research Officer of Turner, unleashed his fury during a session at the Outfront Conference when he caught the presenter from Networked Insights using yet another widely panned piece of "B.S." research that depicted time spent online as being at parity with time spent with TV. When I questioned the presenter as to how the research was conducted, he did not have the slightest clue. Of course I chastised him for presenting information he was unable to explain.
Were Jack and I, or Steve and David, disruptive? Probably. Were we provocative? Yes. Were we out of line? Absolutely not!
For those of you who use or report research information but are not researchers, here are four basic guidelines that will help you determine whether a given piece of research has any merit.
1) The Sample: Is it based on probability sampling, or is it self-selected? Surveys and studies based on voluntary participants are fundamentally flawed.
2) The Media Measurement Technique: Is the research method well suited for the task at hand? Recall (self-report) is acceptable for reporting the incidence of a media activity ("Did you watch TV last week?"), but it is not well suited for measuring time spent with a given medium, as in "How many hours did you watch TV last week?" A more structured recall method that has participants keep track of time spent by time interval (electronic diaries) is somewhat better. However, it falls way short of techniques that measure media use as it happens.
Historical studies have shown that when recall techniques are used, consumers consistently under-report time spent with TV while overstating time spent with more socially acceptable media choices such as online, mobile and print. In fact, the CRE's Video Consumer Mapping Study documented the severe limitations of the recall method for measuring time use. When follow-up interviews were conducted with VCM participants the day after they were observed, once again time spent with TV was wildly under-reported, while time spent online and with mobile devices was frequently overstated. One participant told us that he remembered using his iPhone for over two hours, when in fact he was observed using it for just 17 minutes!
3) For those of you in sales: take responsibility for the quality of the research you use in pitches and presentations. If you can't explain the data on the chart, then do not present it to me.
4) For those of you who are reporters: don't automatically regurgitate vendors' press releases unless the vendors are forthcoming about the methods used. I don't care that you are not researchers. That is not an excuse. There are plenty of us in this business who are. I, for one, would be happy to consult with you to vet the information and see whether the findings have merit, and so would my industry brethren.
It has now become my personal mission to call out the purveyors of Bull$#&! research along with those of you who publish it or present it without question.
Trust me. You don't want to be next.