5 Tips for Evaluating that New Research Study on Cannabis


You’re scrolling through your news feed and you come across an announcement about a new study that claims cannabis use causes heart disease.

Should you believe it? Is it trustworthy? How can you tell?


Determining the reliability of cannabis studies can be challenging, as research in this area frequently exhibits bias and methodological flaws. Compounding this issue, media coverage tends to sensationalize and exaggerate findings, often aiming to attract attention rather than to represent them accurately.

Fortunately, there are strategies you can use to evaluate a study like this and determine whether it is credible. Here are five tips you can use in your evaluation process:

Check your own biases and expectations going in


We all have preconceived ideas and opinions that can cause us to either disregard a well-designed study or overlook glaring red flags, both of which get in the way of properly evaluating a study and determining its validity. The key is being aware of your preconceptions so that you can overcome them.

Do a quick emotional scan to check for biases. Does the research claim make you feel excited? Or does it make you feel angry? If so, this means that you are likely carrying a bias for or against the study that can potentially cloud your judgement about it. Keep this in mind and try to stay neutral about the validity of the study until you can look at it more closely.

Go to the source and read the published study itself


Don’t just stop at the headline or trust someone else’s reporting of the study. Whenever possible, access the full text of the scientific article and read through it yourself – from the very top, with the title, publication information, and authors and their affiliations, all the way down to the bottom, with the acknowledgements, funding sources, and references. Throughout, watch out for common red flags, like:

  • Flaws in the experimental design. For example, a small sample size, inadequate blinding/masking, poorly designed interventions, a lack of appropriate controls, or a failure to account for confounding variables.

  • Assuming causation when only association was found. This is unfortunately very common. Be especially careful of this when reviewing observational studies, such as case-control and cohort studies, as these designs can’t show direct evidence of causation. The only type of study that really can is the randomized controlled trial (RCT), and even then, you want to see the effect replicated across multiple well-designed studies before definitively concluding cause and effect.

  • Inappropriate extrapolation of results. For example, inferring that results from a study on mice also apply to humans, or generalizing conclusions to larger populations when the study has a small sample or lacks diversity in its sample.

  • Conflict of interest. This can occur when individuals or organizations involved in the research have a financial, personal, professional, or institutional stake in the results, meaning they have an outside interest in seeing the results come out a certain way.

As you read through the study, keep these red flags in mind. At the very least, their presence should cause you to pause and consider how they might influence the validity and quality of the study in question.

Additionally, try to start off from a place of skepticism, especially if you already harbor a positive bias toward the study. Go in expecting that the study will need to convince you, rather than you giving it the benefit of the doubt.

Determine the study’s quality of evidence


There are several different types of study designs, including randomized controlled trials (RCTs), cohort studies, and case reports, and they all have their own advantages, limitations, and place, depending on the research context. While no study design is perfect, some are inherently better quality than others, so it’s important to recognize which type is being used.

Generally speaking, the highest-quality studies are RCTs and systematic reviews, as these study designs are more likely to minimize bias, control for confounding variables, and offer reliable conclusions. Observational studies, like case-control and cohort studies, tend to provide medium-quality evidence, while case reports or case series and animal or in vitro studies provide the lowest quality of evidence.

However, it’s important to remember that just because a study is a randomized controlled trial or a systematic review does not automatically make it credible or high-quality. You really need to read through the entire study and make sure the design is solid, the data is analyzed correctly, and there aren’t red flags that compromise its quality.

Consider the study within the larger context of available research


Don’t be too swayed by any one study, especially if it offers lower-quality evidence. One study is just one study, and it must be replicated before you can have sufficient confidence in the results. So don’t stop at that one study and take it as fact; look at what other studies in the field are showing. Does this study correspond with what is being found elsewhere, or is it way off by itself in its findings? It can be hard to tell whether a study is credible on its own, but when you look at it within a larger context and compare it with other studies, the picture becomes clearer. If it is way off from what you are finding elsewhere, that could be a big red flag.

And finally…

Have a trusted anchor source to go to when you are unsure

This can be helpful, especially if you are not very familiar with the given field or study system. An anchor source could be an expert who has proven their credibility and whom you trust, or it could be a reputable organization, such as the Association of Cannabinoid Specialists. If you’re unsure, see if they have commented on the study on their social media, or reach out to them to get their take on it. They can likely help you better understand how valid it is and how it fits within the larger body of evidence.

 

Want to learn more about how to effectively evaluate sources of information about medical cannabis? Check out the new course on Evaluating Sources of Information about Medical Cannabis.