Summary
Research is how we learn, but our findings depend on robust data derived from the populations we seek to serve. As teams build digital environments for research, they must put effort into ensuring that those environments support the populations under study, and they must think broadly about the full research experience: every email or SMS sent, every website, webapp, or native app provided, and any devices included or issued. In addition, it cannot be stressed enough: the work of creating these environments is never done!
Understanding Your Participants
Digital interventions are here to stay! But how do you ensure your digital interventions are welcoming, inclusive, and supportive environments for research participants? The answer is simple: each and every facet of the research experience should be examined from several key perspectives, and a number of tools employed to address deficits. What follows is not an exhaustive list, but a set of examples from various teams who work in this space.
Medical (and Research) Mistrust
If your population includes those who have historically been marginalized or mistreated by the medical field or by research, your team must accept that no matter how prestigious the University of Michigan is in this space, or how eminently qualified and recognized the research team members are, you are facing an uphill battle to engage. Recruitment and retention will require that your materials and processes build trust and respect autonomy. Research is always voluntary, and taking the time and effort to engage those who, by default, have diminished desire for or faith in the process is simply not optional if you want valid data.
Considerations:
- For those who have been marginalized or mistreated in health care settings, their experiences have shown them that doctors do not, in fact, "know best!" and may be dismissive of an individual's needs, concerns, and wishes. Rather than leaning on authority, communicate health-related topics with information that rests on proven or generally accepted fact.
- Avoid medical terms and unnecessary jargon. Clear writing is paramount, but language is flexible. "Antihypertensives" may be the correct classification for medication that lowers blood pressure, but participants are far more likely to recognize and personally use "blood pressure medicine/medication/pills" when communicating with their doctor or others. Seek to communicate clearly and in a way that is fully recognizable.
- Hyperlinks should be labeled and direct to trusted sources. A "CLICK HERE!" link isn't fully transparent, and implies that the participant doesn't need to know where it's going because your team knows best.
- Images used in your digital environments should reflect your population. Representation of your population in terms of race, ethnicity, gender, sexual orientation, age, and disability demonstrates to participants that you welcome those populations.
- Questions, whether they come from study surveys in research visit contexts or are administered through your digital environments, should be culturally sensitive and reflect what we know to be the range of possible responses, which may mean adding multiple-choice options. This could mean deviating from validated instruments, but here at the University of Michigan, we are leaders. Lead!
- Run your materials - recruitment, intervention content, and the digital experience - not only through subject matter experts, but through an appropriate Advisory Board. Boards made up of representatives of your population will give you critical insight into how your study will be perceived by participants so that you can make adjustments (and get IRB approval for them) before you enroll a single participant, and before you lose one because your environment has failed to build that critical trust.
- Very commonly, our digital environments are constructed with, and integrated heavily with, 3rd party applications and vendors. As a reminder, these 3rd parties should be carefully vetted by you, possibly with help from University groups such as Information Assurance, and when appropriate should have fully executed agreements (service contracts, data use agreements, etc.) to ensure that participant data (and your study data) are as protected as possible. Terms of Service (TOS) documents are commonly provided to participants, and we need to be fully transparent with participants about what those documents include. We need to be upfront when we know the TOS are locked for the duration of the study, or when updates to them are simply out of our control. Be prepared to answer questions.
Technology Literacy
Our teams love our technology, and we build and conduct research in digital environments with our shiny toys because we believe, passionately, that they will have a profound impact on health and quality of life. But our populations of study may be nowhere near as tech-savvy, and may simply not trust the fancy gadget you hand them, or even the fancy phone in their own pocket! While the truly tech-illiterate may be a shrinking number in our study cohorts, they are still there, and the range of literacy can vary widely. Think carefully about how you ensure that your digital environments are accessible to those on both ends of the digital divide.
Considerations:
- Bluetooth® has been a staple method for gathering data from peripheral devices such as blood pressure monitors, pedometers, and weight scales. However, there are multiple standards, they change, and devices can lose pairing. Any team with experience using these devices knows, intimately, the pain of trying to talk a participant through troubleshooting a device that's not syncing or re-pairing a failed connection. In addition, Bluetooth® requires a device of some sort to transmit the data on to the study team or cloud service for retrieval, which adds another device to the mix. SIM-enabled devices are becoming more common and more affordable; they rely instead on the cellular network to send data and skip the intermediary device. In our studies these devices are preferred by both the research team AND the participants for their simplicity. Consider building your budgets accordingly.
- Digital environments that rely on one device type (e.g., iPhone) and/or the latest operating system will automatically exclude those who use other and/or older devices. Even if you provide devices to participants as part of your protocol, adding yet another (and different) device into their routine can lead to participant burden, confusion, and drop-out, not to mention devices which go "missing." Build your budgets to ensure that you cover a wide range of digital devices, including older models or operating systems. This could also mean making some compromises in your digital design. A webapp approach is not as integrated as a native app, but it allows you to enroll a much larger cohort of people when you're operating a "bring your own device" study.
Disability
One of the promises of digital interventions relates to how we can expand access for populations who may struggle to receive traditional medical care. Technology may help reduce in-person clinical care visits for those with disabilities that make traveling to or attending clinical appointments challenging, or provide additional support in managing activities of daily living, which could facilitate greater independence. You do not need to be an expert in this space to do good work here, but you do need to bring in the experts.
Considerations:
- Digital environments should be as accessible to screen readers as possible. There are some great online tools you can use to check yourself (a minimal automated check is sketched after this list), and experts in the Equity, Civil Rights, and Title IX Office are also available for consultations here at UM. Use the incredible resources at your disposal and evaluate your environments and digital documents thoroughly.
- Photographs and illustrations may be worth a thousand words, but that carefully chosen image may be utterly incomprehensible to someone who is colorblind. There are online colorblindness simulators you can run your images through to approximate what colorblind participants will actually perceive, so you can confirm the image communicates what you intend.
- Any devices issued should be carefully considered for how accessible (or not) they could be. Are display screens exceedingly small? Are audio prompts loud enough and clear enough, and can those prompts be repeated? If a device must be applied to the skin or wrapped around a limb, can that be done properly by those with compromised dexterity or range of motion? Understandably, the devices available in the marketplace for you to choose from may be limited, but do your best.
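As one illustration of the kind of quick self-check you can run before a formal accessibility review, the sketch below flags images in a page that lack alt text, one of the most common screen-reader barriers. The sample HTML and the helper name are hypothetical, and Python with BeautifulSoup is just one convenient option; automated checks like this supplement, and never replace, testing with real screen readers and expert consultation.

```python
from bs4 import BeautifulSoup  # third-party package: pip install beautifulsoup4

def find_images_missing_alt(html: str) -> list[str]:
    """Return the src of every <img> tag with empty or missing alt text."""
    soup = BeautifulSoup(html, "html.parser")
    missing = []
    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        if not alt:
            missing.append(img.get("src", "<no src>"))
    return missing

# Hypothetical fragment of a study web page
page = """
<html><body>
  <img src="study-logo.png" alt="Study logo">
  <img src="consent-diagram.png">
</body></html>
"""
print(find_images_missing_alt(page))  # -> ['consent-diagram.png']
```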
Other Topics/Elements to Note
Depending on what your research is seeking to do, there are other factors you should consider in building your digital environments. It's my hope that this article has your gears turning to evaluate your specific situation, but here are some more prompts:
- Behavioral interventions in a digital realm often rely on feeding data collected from participants directly back to them to spur or reinforce behavior change. Are the data outputs and/or visualizations truly appropriate for those purposes? Participants with low data literacy may struggle to parse complicated graphs, may struggle with numeracy, and may not understand statements about relative risk. Use terms and tools that make the delivered data as understandable and accessible as possible. For example, a step-count graph with a y-axis fixed from 0 to 10,000 steps will visually flatten the progress of someone making slow and gradual (and therefore safer!) improvements in their walking. Instead, use visualizations that are adjusted to appropriate ranges, or (much better) scaled relative to the submitted data for the displayed time points (see the first sketch after this list).
- It's not just about jargon or medical terms in the language you use; it's also about reading grade level. Template forms can make it exceedingly challenging to hit the IRBMED-recommended target of a 6th to 8th grade reading level, but do your best (a quick way to estimate reading level is sketched second after this list). At study visits, have scripts ready to paraphrase those documents into language more accessible to your population.
- Don't assume your population is urban, or that elements of urban living are relevant. Encouraging people to add walking "an extra city block" to their routine may be utterly useless advice for increasing step counts where city blocks don't exist. Recommending that people switch grocery stores to increase their intake of fresh fruits and vegetables ignores the fact that in rural settings only one viable market may be within reasonable driving distance. Consider also your study calendar and visits: can a study visit be fully viable as a phone or video visit? Reducing the amount of travel to campus can ensure that rural participants have the same opportunities to participate fully in your study.
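To illustrate the visualization point above: a minimal sketch, using Python and matplotlib with made-up step counts, of scaling the y-axis to a participant's own data so that gradual gains stay visible. The numbers, labels, and margin are all hypothetical choices, not a prescribed design.

```python
import matplotlib.pyplot as plt

# Hypothetical week of daily step counts for a participant improving
# slowly but steadily (far below the classic 10,000-step goal).
days = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
steps = [2100, 2300, 2250, 2600, 2800, 2750, 3000]

fig, ax = plt.subplots()
ax.plot(days, steps, marker="o")

# Scale the y-axis to this participant's own data (plus a small margin)
# instead of a fixed 0-10,000 range, so gradual gains remain visible
# rather than flattening near the bottom of the chart.
margin = 0.1 * (max(steps) - min(steps))
ax.set_ylim(min(steps) - margin, max(steps) + margin)

ax.set_xlabel("Day")
ax.set_ylabel("Steps")
ax.set_title("Your daily steps this week")
plt.show()
```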
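And for the reading-level point: a short sketch of estimating grade level programmatically, here assuming the third-party textstat package and a made-up snippet of recruitment text. Automated scores are rough estimates and are no substitute for review by your Advisory Board and the IRB.

```python
import textstat  # third-party package: pip install textstat

# Hypothetical recruitment blurb to check before sending to participants.
recruitment_text = (
    "You are invited to take part in a research study about blood "
    "pressure medicine. Taking part is your choice, and you can stop "
    "at any time."
)

grade = textstat.flesch_kincaid_grade(recruitment_text)
print(f"Estimated reading grade level: {grade:.1f}")

if grade > 8:
    print("Consider simplifying this text to reach the 6th-8th grade target.")
```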
Evaluation
Evaluation of your digital environments absolutely cannot be neglected, and it should be robust. A single question asking whether your study would be recommended to another is a simply useless metric in isolation (and, in our experience, always results in an overwhelmingly positive endorsement, even in studies where engagement with the digital environment was abysmal). Instead, build your study team with, or plan to consult heavily with, mixed-methods researchers such as those in the UM Mixed Methods Program, who can help you develop a thorough evaluation of each element of your digital environment. This will not only help you answer your research question and understand retention and engagement, but also help you plan your 2.0 build to be even more inclusive. Plus, the results of that evaluation, properly included in your next grant submission, will help you obtain the funding for your 2.0 efforts!
Notes
- This article was developed to support a poster presented at the MeTRIC Symposium, held on November 10th, 2023.
- A special thanks to Dr. Lorraine (Laurie) Buis, PhD, as well as the other outstanding faculty and staff I've collaborated with here at UM and at other institutions.
About the Author
Reema Kadri is a Project Manager at the Department of Family Medicine and a staff co-lead with the Behavioral Research Innovation and Support Program (BRISP) in the Michigan Institute of Clinical and Health Research (MICHR). Reema has managed or consulted on numerous research studies, but primarily those related to technology-mediated behavior change and telehealth.