In health research, clinical and lab data only go so far. Researchers often want to collect data on people’s awareness and attitudes, too — how individuals feel about a particular topic in the broader context of everyday life.
Surveys are the primary method to get those insights and, more importantly, to see how they change over time. A person’s perceptions are dynamic and fluid in a way that, say, genomic data is not. What a participant thinks today may differ from what they think tomorrow, next month, or next year, and all of those are worth tracking.
So says Megan Ruxton, Ph.D. As a survey methodologist on our Digital Health Services team, Ruxton spends her days refining survey strategy and questions for Vibrent collaborators, such as academic centers, consortiums, and life science companies.
Ruxton’s is a tough job because people aren’t always forthcoming about their lives, particularly regarding their health. So the methodologist’s goal is to ask the questions behind the questions — to get inside the very psychology of survey takers — for better, more meaningful data collection.
What makes for a good survey, a really good one where people are honest and insights are rich? We sat down with Ruxton to learn what she wishes every research team knew.
Ruxton: Researchers tend to make assumptions about how much their participants know. And instead of leading up to the more complex questions, they’ll launch right into that complexity. You have to use the survey as a learning tool for participants so they understand what you’re asking. That often means avoiding jargon and thinking critically about what’s common knowledge and what isn’t.
Ruxton: I like to think of it as the grandparent test. Whether it’s a grandparent who never graduated high school or someone with a Ph.D. — will they understand the question?
If we want to get more scientific, one tactic I recommend is cognitive interviewing. This is when you recruit a focus group of “practice” participants whom you interview while they take the survey. That way, they can verbalize their thought process for each question. If you don’t have the time or resources for that, you can always run the survey by colleagues who aren’t close to the research to see if the material makes sense.
Ruxton: Sensitive questions are similar to complex questions in that respondents typically need a gradual lead-up to get there. For example, you don’t want to ask questions about sexual history immediately — try to ask other things first. And always include a “trigger warning” for sensitive topics. This warning should not only let people know those questions are coming but also reassure them that they can skip anything they don’t want to answer. This is preferable to having people opt out of the rest of the survey because a question made them uncomfortable.
Ruxton: No, the best format is a diversity of questions. This keeps people from getting bored or frustrated, and it also supports the integrity of responses. You don’t want all multiple-choice, open-ended, or matrix questions. It’s best to have a mix when possible, but the format also needs to make sense for the question you’re asking.
Beyond that, keep it short and sweet and make things as easy as possible. Ultimately, we’re aiming to avoid cognitive burden when participants take surveys. You can mitigate that burden by limiting the number of questions per page and also limiting questions that require too much work from participants, such as open-ended responses.
Ruxton: Yes, and they often are. Survey methodologists have access to a library of validated question sets that have been tested against certain survey objectives. For example, say you want to assess depression. You don’t want to ask the participant, ‘Are you depressed?’ Instead, you might ask a series of things such as: ‘Do you still take joy in the things that used to bring you joy?’ and ‘Do you have contact with friends or family?’ and ‘Do you have hobbies?’ At Vibrent, we have many of these multi-question collections that have been verified to work for various research purposes.
Ruxton: Break questions into chunks so that people can complete the survey as they have time. With Vibrent’s platform, for example, participants can come and go, and it saves the data from each session.
Another advantage to dividing questions into these “mini-surveys” is that you can add features between sets to keep people hooked — like a fun animation, facts about the study, or other interactive media. You can also display a comparative analysis, such as charts showing how other people have answered the same questions. People are generally interested in seeing how their answers stack up to everyone else’s, so that’s a good way to keep them going.
Ruxton: Maybe. It’s a judgment call that depends on how early in the project you are and what value could come of any changes. You want consistency between one survey and the next, but at the same time, there’s no use in getting a lot of consistent data if it doesn’t tell you anything. Follow the responses from the beginning and establish frequent checkpoints to ensure things are going as you intend. If they’re not, yes, pivot — but be transparent about those changes to all stakeholders.
Ruxton: As early as possible. You want to build a relationship that not only entices them to take the initial survey but also reengages them regularly through email, SMS, and phone outreach for future data collection.
We recommend establishing a recruitment and registration page with information about the study, the principal investigators, and ancillary research that may be taking place. For those who sign up, it becomes an education campaign after that to walk people through the basic expectations and why their contributions matter. There may also be a pre-survey or baseline survey to screen participants before they start.
Ruxton: It ideally continues. You may want to reengage participants over time to complete multiple surveys. Even if you don’t have a new survey for them to take, I still recommend some light engagement just so that participants don’t forget about you. It could be just a “mini” check-in survey with one or two questions to snag a quick pulse on something. It could be you asking whether a certain response has changed. Or it could be strictly informational, telling people how the research is going and how their participation is making a difference.
Ruxton: You should plan ahead and engage a survey methodology team in advance. It’s much easier to make recommendations from a clean slate than to react to a bunch of pre-developed questions. When you can be more strategic from the get-go, you get better data collection that you can combine into the fuller ecosystem of EHR and other insights. And that’s what helps researchers paint more comprehensive pictures of real-life people.
Vibrent’s Digital Health Research Services team includes researchers and specialists with diverse expertise to help our collaborators with all stages of the research life cycle, and one of those skill sets is survey methodology. Our team also includes experts in IRB consulting, eConsent design, protocol workflow collaborative design, brand and logo design, diversity / UBR strategies, communication strategies, data collection, and custom analytics.
If you want to learn more about our team, or are interested in collaborating, request a pilot so we can begin discussions about your research aims.