Content marketing can get your company attention if it’s done well, but publishing original research can earn something much more elusive: authority. Producing original research means you have insights about your industry that no one else owns. Given how hard it is to say something truly new and compelling with content marketing, it’s a wonder more companies don’t publish original research.
But not all research projects are created equal, and over the years I’ve learned to spot research inexperience. Here are 8 signals that tell me whether a study was competently designed and produced.
- Telling stories vs. taking inventory: In my 15+ years writing and publishing research, this is the number one thing people get wrong. When someone undertakes a research project, they hear a siren call that says, “Let’s find out all the things!” After all, you’re spending all this time and money on a survey. Why not ask … everything? The problem with this approach is (a) your survey takers won’t complete your survey, and (b) reading through all the things is downright boring. Get focused! For example: a study should not be about artificial intelligence broadly, but about the challenges of adopting AI tech in healthcare. Narrow your area of study so you can extract meaningful, never-before-seen insights.
- A clearly stated methodology: A competent study — whether survey-based or otherwise — will publish a clear methodology that shares the sample size, how respondents were recruited, and demographic summaries relevant to the study (e.g., gender, years of experience, role, geography). These details help your reader gut-check whether your findings are worthwhile.
- Competent sampling (or, when lacking, transparency about potential bias): How did you source respondents for your survey? Did you use a panel and try to get an accurate sampling based on the underlying population? While ideal, this approach isn’t always cost-effective. For niche target groups, panels can cost well over $50 per complete (that is, per finished survey), a price many companies simply can’t pay. Sampling your own audience is free and effective, but you should be transparent about the biases that may surface based on that audience. (For example: If you’re trying to determine how SEO-savvy new business owners are, polling members of an SEO tech platform means you’re getting opinions from people who are by definition more advanced.)
- Questions that don’t push a point of view: You think you’re being sneaky, but I see you. Don’t design questions to push your product or service. The funny thing: If you ask a question that’s patently self-promotional, your survey takers will kick your butt. I was once involved in a study in which the client really wanted to ask a barely disguised question about whether people preferred their product or their competitors’. Our smart survey takers smelled a rat and chose the more “primitive” product from the lineup. It was a bit of a “&%$# you and the horse you rode in on.” And it made me giggle.
- A report-writer who speaks data: Not all good writers have the skills to report your research findings. That’s because you want a writer-analyst, not just a writer. You want someone who won’t just regurgitate the basic facts of the survey, but will put those findings in context and explain why they’re fascinating. Experienced journalists are excellent analysts, and I seek them out for tough research reporting assignments.
- Third-party research that’s relevant, timely, and properly cited: Excellent research reports tie in other people’s data to make their case. For example, if you designed a survey about telemedicine in healthcare, your audience is going to be even more taken by your findings if you can show that other respectable research organizations are uncovering similar issues. But for the love of all things nerdy, please don’t quote a third-party study if it’s more than two years old. And quite frankly, in COVID times, I’m loath to quote studies more than three months old. Be smart about what data are truly relevant.
- Charts & graphs that can stand alone: This is a personal pet peeve, but it merits a place on this list. If you publish a blog post or any type of report that contains your original charts and graphs, each one of those visuals should be able to stand alone. In other words, if someone takes a screenshot of your chart, it should contain all the information a viewer needs to make sense of that insight, including the key finding, the question asked, the sample size, and the source. Find out more about designing templates for charts and graphs in a blog post I wrote for the Content Marketing Institute.
- Data visualizations designed for clarity, not sparkle: We all get taken in by cool visualizations. Which of us hasn’t swooned over Edward Tufte’s books? But let’s get real: few of us (I daresay none) need to channel Beautiful Information to publish thought leadership research. Your mantra should be KISS (keep it simple, stupid). Always design for clarity, and don’t be afraid of the good ol’ bar chart or line chart if you’re inexperienced. My favorite book for chart and graph design is The Wall Street Journal’s Guide to Information Graphics. There is time, young grasshopper, to graduate to spider charts and heat maps.
Writing all that out is a bit painful because I sound like a research scold, but these are good early signals of whether a project is being run competently. And I share them with a tremendous amount of humility, having made plenty of mistakes in my years designing research.
What would you add to the list?