Which Songs Fit on Your Station?
June 8, 2015
Some researchers will try to convince you that you can get a useful reaction from respondents concerning which songs fit on your station. At NuVoodoo we believe that's wrong thinking.
The people who participate in any type of consumer research are called respondents because they respond - not everyone in the population will. They'll answer questions for which they really don't have a useful answer, because it's in their nature to respond. Even people who begin an interview but ultimately hang up or opt out because the questions are frustrating to answer are non-responders. They're not cooperative. Arbitron used to study non-responder bias, but in measured media it's only the responders who get a vote.
If you've participated in an online or telephone study for another business (you'd certainly opt out of any radio research, right?), you've likely ended up answering questions that really don't make sense to you as a consumer of that product or service. You may have been asked questions about features or attributes of a product that play no role whatsoever in why you do or don't use that product. But, since you're a respondent in such a setting, you respond.
After decades of collecting and analyzing music research data on zillions of songs, we believe asking about fit is useless, because respondents don't know whether a song fits on your station. They'll give you an answer, because they're respondents. But they can't tell you whether the song would fit ... because they really don't know. They know what songs and types of songs they've heard on your station. So they'll answer as best they can within that framework, but they can't make the critical judgment of whether a song fits on your station. Best case, they'll respond to what you've already taught them fits, so you're creating a tautology. Truth is, they don't know. It's not their job. That's your job. You decide what fits.
What's worse is that asking the additional questions gets respondents back into their heads instead of giving immediate responses to questions they can answer instinctively. In the car, listeners aren't making reasoned choices concerning whether to turn up a song or switch to another station. They're reacting. Immediately. No analysis.
We try to elicit that same kind of muscle-memory, automatic response in our music testing. Make it simple. Don't recognize the hook? Fine, it's unfamiliar. Love it? Great. Like it. Fine. So-so? Okay. Tired of it? Happens. Don't like it? We understand. Keep the interview simple. Give them easy-to-answer questions. Make it pleasant for the respondent. Get actionable information for programmers.
Music programming has always been a blend of art and science; a combination of data and intuition. Online services like Pandora do the best they can with algorithms and rubrics. But human-curated playlists and schedules, augmented with actionable consumer information that grounds programmers in how people feel about songs, have created magic for many stations over the years.
We think it's important for people to try to be fit, but we're not fans of asking respondents about which songs fit on which stations.