Suboptimal adherence to data collection procedures and/or a study intervention is a frequent cause of failed clinical trials. Recently, Koneksa’s Chief Scientific Officer, Dr. Elena Izmailova, a co-founder of the Digital Medicine Society (DiMe), co-authored a DiMe systematic review of what is known about adherence to wearables (also known as Biometric Monitoring Technologies, or BioMeTs) used in clinical research. The authors reviewed 100 studies published over five years (2014-2019) that used BioMeTs. Let’s talk about what they found, why it matters, and who they are!
What did they find?
The authors found that 30% of the studies did not use a sensor-based, non-surrogate, quantitative measure of adherence. (For example, a study might rely on a self-reported assessment instead, even though self-reported data introduce far more room for inaccuracy.) The authors also found that, across the 100 studies in their analysis, there were 37 different definitions of the term “adherence.” They also rated the “resolution” of each definition: a continuous, time-based variable (such as hours of device wear per day) would be considered “high resolution,” while a simple categorical variable (such as adherent vs. non-adherent) would be “low resolution.” More uniform definitions were associated with higher resolution. As a result, the authors recommended that sensor-based, non-surrogate, quantitative adherence data be reported for all BioMeTs.
What does that mean in practice?
Well, as an example: one clinical trial might use a wearable to measure the number of minutes a patient exercised per day, while another trial might ask patients to check boxes describing how much they exercised each day. Based on this paper’s findings, the first trial would be more likely to use a consistent, high-resolution definition of adherence, whereas the second would be more likely to end up with one of many varying, lower-resolution definitions.
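To make that distinction concrete, here is a minimal, hypothetical sketch (ours, not from the paper) of how the same sensor-derived wear-time data could yield a high-resolution adherence measure versus a low-resolution one. The 12-hour daily wear target, the sample values, and the variable names are assumptions chosen purely for illustration.

```python
from statistics import mean

# Hypothetical example (not from the paper): one participant's daily device
# wear time, in hours, over a week, as derived from sensor on-body detection.
daily_wear_hours = [14.2, 11.5, 0.0, 9.8, 13.1, 12.4, 7.6]

WEAR_TARGET_HOURS = 12  # assumed per-protocol wear target, for illustration only

# High-resolution, sensor-based, quantitative definition:
# adherence as the fraction of the daily wear target actually achieved.
daily_adherence = [min(h / WEAR_TARGET_HOURS, 1.0) for h in daily_wear_hours]
mean_adherence = mean(daily_adherence)  # a continuous value between 0 and 1

# Low-resolution, categorical definition of the same behavior:
# each day is simply labeled "adherent" or "non-adherent".
daily_labels = ["adherent" if h >= WEAR_TARGET_HOURS else "non-adherent"
                for h in daily_wear_hours]

print(f"Mean adherence (high resolution): {mean_adherence:.2f}")
print(f"Daily labels (low resolution): {daily_labels}")
```

The continuous measure preserves how much of the target was met each day, while the categorical labels discard that detail; that loss of information is the kind of difference the authors’ “resolution” distinction captures.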
Why does that matter?
This is important because consistency can, as the authors of the paper wrote, “improve the reliability and comparability of adherence measurements to support further BioMeT evaluation and decision-making in both research and clinical care settings.”
In short: the more precise your definitions are, the more your data will be able to tell you.
Who did this study?
The research paper was published in the Journal of Medical Internet Research under the title “Recommendations for Defining and Reporting Adherence Measured by Biometric Monitoring Technologies: Systematic Review.” Its authors, including Dr. Izmailova, are all members of the Digital Medicine Society (DiMe), a 501(c)(3) nonprofit professional society for the digital medicine community, under whose auspices the research was conducted.
Want to learn more?
You can read the paper in full at https://www.jmir.org/2022/4/e33537, and learn more about the Digital Medicine Society (DiMe) at https://www.dimesociety.org/