The rise of social media platforms has reprogrammed the way we access and interact with new information. We can share personal experiences with millions of users around the world and engage with theirs, allowing for discussions with a wider reach than ever before. Social media has also revolutionized the way we engage with health information. In the past, health information was distributed primarily by physicians and used by their patients directly (1).
However, the ‘modern medical era’, which began at the end of World War II, challenged the physician-patient hierarchy and emphasized patient autonomy (2). Before the modern medical era came thousands of years in which the paternalistic model - or the ‘doctor knows best’ model - defined physician-patient interactions. Paternalism assumes that the physician knows what is best for the patient, and can therefore withhold important information from the patient and make health decisions on their behalf (3).
With the shift away from the paternalistic view of medicine, as well as the ability to access health information online, this dynamic has changed. Patients are more in charge of their own health than ever before, and play a more collaborative role with health professionals (1). People are able to share their health journeys with others online and seek peer-to-peer recommendations (15). Learning from lived experience is a great way to expand your knowledge beyond the clinic and gain insight into the day-to-day lives of folks with different health experiences.
While this is beneficial to patient autonomy, knowledge exchange, and peer support, it can be a double-edged sword. There are amazing influencers (see missINFORMED approved list) who responsibly share credible health information and lived experience in a way that empowers their followers, and we LOVE them. However, there are just as many sharing incorrect and often downright dangerous information for views or to push products - it is important to watch out for such influencers on social media and talk about the harm they cause their viewers.
In this blog post I will provide an overview of how social media algorithms work, why misinformation thrives on social media, and the impact it has on consumers like you and me.
Social media algorithms function as echo chambers
The scientific community has grown increasingly concerned about health misinformation on social media - and for good reason (4). Social media can be a cozy home to many unfounded claims and conspiracies designed to instill fear or sell bogus health products. This not only harms our individual health but can also give rise to social movements that have serious consequences for our collective, public health.
Of course, the most fitting example of this right now is the anti-vaccination movement. The spread of misinformation by anti-vaxx groups on social media has played a role in the resurgence of previously eliminated diseases (4, 5). Measles, for example, was declared eliminated in the United States in 2000 thanks to vaccination efforts. However, the measles, mumps, and rubella (MMR) vaccine has been a hot topic for anti-vaxx activists ever since a fraudulent - and since retracted - paper linked the vaccine to autism (6). This false claim made the MMR vaccine a target of anti-vaxx rhetoric in social media groups, and it was demonized by popular celebrities - most notably Jenny McCarthy.
The result? MMR vaccination coverage has been decreasing consistently, and over the past three years has finally slipped below the ‘herd immunity’ threshold of 95% (7). This has led to measles outbreaks among unvaccinated people - such as the 2014 Disneyland outbreak in California, which spread from unvaccinated children to infect over 50 people (5, 7). This isn’t just a US problem - Canada has also seen a substantial increase in mumps cases since 2016 (8). The spread of misinformation has real-life implications, especially in the context of public health.
To understand how misinformation spreads so easily on social media, I think it’s useful to understand how social media algorithms work to personalize your experience. Social media platforms use inferential analytics - a statistical method that allows the software to make educated guesses about the properties of its users (9). These properties include things like race, gender, sexual orientation, political opinions, and other aspects of your personality. Platforms tailor content to your profile based on these properties, and get feedback from your interactions with content to create a personalized feed (15).
The ultimate goal? To make your feed as relevant as possible to you, to keep you online as long as possible. As you interact with things you like, Instagram shows you more of that content. This is harmless when it comes to things like funny cat videos and TikTok dances, but potentially dangerous when it comes to health information. It is very easy for your feed to become an echo chamber - an environment where praise and value are placed on information that supports one idea, while information that doesn’t fit that narrative is ignored or ridiculed (15, 10).
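To make that feedback loop concrete, here is a deliberately simplified sketch (in Python) of how engagement-driven ranking can tip a feed into an echo chamber. Everything in it - the `User` and `Post` classes, the scoring formula, the +0.1 interest bump - is invented for illustration; real platform algorithms are proprietary and vastly more complex.

```python
# Toy model of an engagement-driven feed. All names and numbers here are
# illustrative assumptions, not any real platform's algorithm.
from dataclasses import dataclass, field


@dataclass
class User:
    # Inferred interest weight per topic, e.g. {"cats": 0.9}
    interests: dict = field(default_factory=dict)


@dataclass
class Post:
    topic: str
    base_popularity: float  # overall likes/shares


def score(user: User, post: Post) -> float:
    # Rank posts by how well they match the user's inferred interests;
    # unseen topics get only a small default exposure.
    affinity = user.interests.get(post.topic, 0.05)
    return affinity * post.base_popularity


def record_engagement(user: User, post: Post) -> None:
    # The feedback loop: every interaction strengthens the matching
    # interest, so similar content ranks even higher next time.
    current = user.interests.get(post.topic, 0.05)
    user.interests[post.topic] = min(1.0, current + 0.1)


user = User()
feed = [Post("vaccine-skeptic", 100), Post("public-health", 100)]

# Simulate five sessions where the user taps the top-ranked post.
for _ in range(5):
    ranked = sorted(feed, key=lambda p: score(user, p), reverse=True)
    record_engagement(user, ranked[0])

print(user.interests)
```

After just five simulated taps, the vaccine-skeptic topic dominates the user’s inferred interests while the public-health post never registers at all - a toy version of the spiral described above, where one early engagement quietly decides everything you see next.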
Let’s explore this through the example of anti-vaccination, and imagine you are a person who is vaccine-hesitant. You aren’t necessarily anti-vaxx, but you aren’t sure if vaccination is right for you. You come across an influencer talking about vaccine injury, and to learn more you engage with more of their content. Your feed slowly shows you more and more similar content, and all of a sudden you are seeing hundreds of posts demonizing vaccines and promoting anti-vaxx narratives. What you won’t see on your reels is that in Canada, the rate of reported adverse events for all vaccines is less than 0.013% - and of these, 91% are non-serious (11).
With the sheer volume of health-related posts on social media, you may think it is just as likely that you will come across evidence-based health information as health misinformation, but this isn’t the case. The problem is that misinformation is much more popular and more frequently shared on social media than evidence-based information (12). Misinformation narratives are dominated by personal stories, negativity, and opinionated tones which evoke anxiety and mistrust in readers (13). This emotional response renders people vulnerable to misinformation, and can create a sense of urgency to react to and share emotionally charged information with others - allowing it to spread quickly online (13). What makes this even more concerning is that it has been shown that after only one exposure to a piece of fake information, you are more likely to believe it is accurate regardless of the source (14).
Social media narratives allow misinformation to thrive
Social media influencers play an important role here, as they can speed up the spread of information and create more opportunities for false information to flourish (15). Compared to traditional media sources, information can be shared on social media quickly, without peer review, and at no cost. Content shared online can spread like wildfire through large followings, shares, and boosts from social media algorithms. Another difference from traditional news is that we are more likely to trust influencers (16). This comes, in part, from the way we perceive them. For example, influencers often use communication strategies that create perceived similarity between themselves and their followers, along with feelings of familiarity and sympathy (17). With this perceived similarity, personal stories become extra powerful, because it is easier for you to put yourself in their shoes (16).
With great power comes great responsibility, but unfortunately this power is often abused. A personal narrative can be shared to encourage education, resource-sharing, awareness, and other positive outcomes within a community. However, the same narrative can be twisted to push ‘life-changing’ products, unscientific alternative treatments, and misleading information to gain money and a bigger following. Considering that social media platforms incentivize people to share content that has high engagement, you can see how the second option may be more attractive. Social media users carry the burden of trying to distinguish between the two, but it isn’t always simple to do. It has been shown that listening to personal anecdotes worsens a person’s ability to engage in scientific reasoning - even when researchers controlled for education levels and thinking patterns (18). Whether the narrative shared is factual doesn’t matter - fiction can be just as impactful (19). So misinformation is not only more likely to appear on social media; the way it is communicated may also discourage scientific reasoning about the subject at hand.
We can see how this plays out in real life through a study that looked at information sharing on Facebook, Reddit, Twitter, and Pinterest - four of the most popular social media sites worldwide (20). Researchers examined popular posts about breast cancer across these platforms, specifically the post content and the sharing behaviors that followed. They found that the volume of verified, evidence-based content was slightly greater than the amount of misinformation. However, misinformation was shared over three times more on average than evidence-based content (20). They also found that most of the articles sharing misinformation weren’t completely fabricated - instead, they were misleading or took information out of context. Context is key when it comes to understanding scientific information, and when valid information is taken out of context it can be very damaging.
What if I told you I had an incredible essential oil that could cure cancer? What if I told you studies have shown that it can kill cancer cells and reduce the size of tumors? Sounds pretty incredible, right? Distributors for Doterra - an essential oil multi-level marketing (MLM) company - have marketed the oil Copaiba as a powerful anti-cancer treatment (21). While it is true that Copaiba has been shown to kill cancer cells and reduce tumors, this has only been shown in laboratory studies (22). No clinical trials exist for Copaiba, which means we have no understanding of how it works in the human body. Doterra reps’ praise of Copaiba was based entirely on its ability to kill cells in a dish, which tells us nothing about how it interacts with actual human physiology. So while their claims had some basis in fact, the anti-cancer properties of Copaiba were taken out of context, and it was touted as a ‘miracle oil’ for cancer despite there being no evidence-based research to support it as a cancer treatment. This, along with other outrageous health claims, led the FDA to issue Doterra a warning against making misleading claims in 2014 (23). A general piece of advice - never trust health advice from an MLM distributor.
Social media companies know - and don’t care - that they’re toxic
Let’s shift gears to talk about the other problem with following health information online: social media companies don’t care about you, they care about profit. Facebook Inc. has recently been under fire due to a series of articles published by The Wall Street Journal based on leaked internal Facebook documents (24). The documents, dubbed ‘The Facebook Papers’, showcase research indicating that Facebook knew about - and ignored - the harms of its services. The research indicated negative mental health effects on teens - including worsening body image in teen girls. Yet despite this internal research, Facebook was developing a version of Instagram specifically for children under 13 (25).
Before the internal documents were even released, a group of Democratic US lawmakers urged Facebook to ‘abandon its plans to launch a version of Instagram for kids’ because ‘Facebook has a clear record of failing to protect children on its platforms’ and ‘when it comes to putting people before profits, Facebook has forfeited the benefit of the doubt’ (26). It took the leak of the ‘Facebook Papers’ and the resulting backlash before Facebook paused the development of ‘Instagram Kids’, despite ‘[believing it] is the right thing to do’ (25). At the end of the day, social media companies benefit by keeping as many people (apparently even children) on the site as long as possible, so the platform can predict your behaviors for marketing purposes (remember inferential analytics?) and have more influence over your behaviors to drive profits (9).
Influencers, particularly those trying to sell something, can also get caught up in this profit-making game. Instagram is the most important social media outlet for influencers because the platform allows for the most efficient interaction and offers the most opportunities for paid collaboration (17). Influencers can make money through sponsored content, selling merchandise, and affiliate marketing, among other avenues (30). And while they are marketing this content, influencers are also marketing themselves to you.
One study from Germany showed that the way health and fitness influencers communicate information builds a connection between external beauty and perceived well-being (17). Their posts show imagery of an ‘ideal’ body type with a strong implicit message that this body cannot be achieved without their advice - and the products they push. The study found that the majority of posts by fitness influencers focus on some aspect of the influencer’s body, and 71% of these posts advertised at least one brand. Although this doesn’t seem outright harmful, it creates a mental connection between the products and the body type being shown, even if that body type is unrealistic or dangerous for the viewer. The pressure to maintain an ‘ideal, healthy’ body image has dominated social media for the past decade, with health and wellness posts mainly highlighting the importance of weight loss and management to improve wellbeing (27).
This narrative is packed with assumptions that are disproportionately harmful to people who live in larger bodies or have existing body image issues. Equating beauty and health with thinness has been identified as a risk factor for developing body image concerns, disordered eating, and excessive exercise behaviors. The hashtag #thinspiration was banned on Instagram in 2012 following research suggesting it was harmful to body image, but it has since been replaced with #fitspiration. #fitspiration disproportionately targets women, and normalizes compulsive exercise and spending excessive time preparing food. This messaging goes well beyond the realm of fitness, as influencers promote ‘detox teas’, ‘healing essential oils’, and other sham products that have no proven benefit and may be potentially harmful. (Here is a reputable source about how to identify misleading claims about nutrition online).
Natural health products exist in a legislative ‘gray area’
One of the biggest issues with buying health products online is that they’re largely unregulated (28). When you buy health products from reputable sellers in Canada, they are overseen by Health Canada under frameworks such as the Food and Drugs Act and the Natural Health Products Regulations. But the internet is international, and you can easily access products that are completely unregulated - and these can just as easily be marketed by influencers. Many of these products exist in a ‘gray’ area of legislation and haven’t been challenged by regulatory bodies (28).
Additionally, fraudulent health products are under-reported to anti-fraud agencies, which may be due to buyers’ embarrassment or fear of appearing gullible. The main problem with these types of health products is that they are often attached to outlandish claims. The greatest example of this in pop culture is Gwyneth Paltrow and her snake-oil-peddling company Goop (say it with me - f*ck Goop!).
Goop advertised two products with incredible claims - an essential oil blend that fights depression, and vaginal jade/quartz eggs that ‘boost sexual energy and health’ (29). Of course, she was required to backtrack in 2018, when Goop settled a false advertising lawsuit regarding these products.
If you take away anything from this article, it’s to think twice before giving in to health claims on social media. If you read about a magical serum that can cure depression - think for a second. If this were actually true, do you really think you would be hearing about it for the first time from a random influencer on social media (or Gwyneth Paltrow)? Probably not. The unfortunate reality is that quick fixes simply do not exist for the majority of health concerns, and if they did, you likely would not be able to purchase them from an affiliate link on your favorite influencer’s page.
Be cautious about what health advice you follow and the products you consume, and be critical of the sources you obtain information from. Most importantly, make sure to check in on yourself and reflect on your social media use. Reflect on how you feel after engaging with different types of content, and use the algorithm to your advantage by engaging with content that makes you feel good and ignoring everything that doesn’t!
Influencers WE trust!
The official missINFORMED-approved list of influencers:
Samantha Yammine @science.sam
Jen Gunter @drjengunter
Dr. Ruchi Murthy @drruchimurthy
Dr. Liz Marnik @sciencewhizliz
Dr. Onye Nnorom @dr.o.nnorom
Dr. Heather Irobunda @drheatherirobundamd
Dr. Risa Hoshino @dr.risahoshino
Dr. Nicole Sparks @nicolealiciamd
Dr. Katrine Wallace @epidemiologistkat
1. Díaz-Martín AM, Schmitz A, Yagüe Guillén MJ. Are Health e-Mavens the New Patient Influencers? Frontiers in Psychology [Internet]. 2020 [cited 2022 Jan 14];11. Available from: https://www.frontiersin.org/article/10.3389/fpsyg.2020.00779
2. Siegler M. The progression of medicine. From physician paternalism to patient autonomy to bureaucratic parsimony. Arch Intern Med. 1985 Apr;145(4):713–5.
3. Provider-Patient Relationship - MU School of Medicine [Internet]. [cited 2022 Jan 14]. Available from: https://medicine.missouri.edu/centers-institutes-labs/health-ethics/faq/provider-patient-relationship
4. Li Y-J, Cheung CMK, Shen X-L, Lee MKO. Health Misinformation on Social Media: A Literature Review. 2019;12.
5. Evrony A, Caplan A. The overlooked dangers of anti-vaccination groups’ social media presence. Hum Vaccin Immunother. 2017 Apr 13;13(6):1475–6.
6. Hooker BS. Measles-mumps-rubella vaccination timing and autism among young African American boys: a reanalysis of CDC data. Transl Neurodegener. 2014 Aug 27;3(1):16.
7. Mellerson JL. Vaccination Coverage for Selected Vaccines and Exemption Rates Among Children in Kindergarten — United States, 2017–18 School Year. MMWR Morb Mortal Wkly Rep [Internet]. 2018 [cited 2022 Jan 14];67. Available from: https://www.cdc.gov/mmwr/volumes/67/wr/mm6740a3.htm
8. Use of Measles-Mumps-Rubella (MMR) Vaccine for the Management of Mumps Outbreaks in Canada - Canada.ca [Internet]. [cited 2022 Jan 14]. Available from: https://www.canada.ca/en/public-health/services/publications/vaccines-immunization/use-measles-mumps-rubella-vaccine-management-outbreaks-canada.html
9. Use of Measles-Mumps-Rubella (MMR) Vaccine for the Management of Mumps Outbreaks in Canada - Canada.ca [Internet]. [cited 2022 Jan 14]. Available from: https://www.canada.ca/en/public-health/services/publications/vaccines-immunization/use-measles-mumps-rubella-vaccine-management-outbreaks-canada.html
10. Cinelli M, Morales GDF, Galeazzi A, Quattrociocchi W, Starnini M. The echo chamber effect on social media. PNAS [Internet]. 2021 Mar 2 [cited 2022 Jan 14];118(9). Available from: https://www.pnas.org/content/118/9/e2023301118
11. Public Health Agency of Canada. Vaccine safety in Canada, 2017 [Internet]. 2019 [cited 2022 Jan 14]. Available from: https://www.canada.ca/en/public-health/services/reports-publications/canada-communicable-disease-report-ccdr/monthly-issue/2018-44/issue-12-december-6-2018/article-4-vaccine-safety-in-canada-2017.html
12. Wang Y, McKee M, Torbica A, Stuckler D. Systematic Literature Review on the Spread of Health-related Misinformation on Social Media. Social Science & Medicine. 2019 Nov 1;240:112552.
13. Confronting Health Misinformation: The U.S. Surgeon General’s Advisory on Building a Healthy Information Environment. [Internet]. 2021 [cited 2022 Jan 14]. Available from: https://www.hhs.gov/sites/default/files/surgeon-general-misinformation-advisory.pdf
14. Pennycook G, Cannon TD, Rand DG. Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General. 2018;147(12):1865–80.
15. Lutkenhaus RO, Jansz J, Bouman MP. Tailoring in the digital era: Stimulating dialogues on health topics in collaboration with social media influencers. DIGITAL HEALTH. 2019 Jan 1;5:2055207618821521.
16. Lee JA, Eastin MS. I Like What She’s #Endorsing: The Impact of Female Social Media Influencers’ Perceived Sincerity, Consumer Envy, and Product Type. Journal of Interactive Advertising. 2020 Jan 2;20(1):76–91.
17. Pilgrim K, Bohnet-Joschko S. Selling health and happiness how influencers communicate on Instagram about dieting and exercise: mixed methods research. BMC Public Health. 2019 Aug 6;19(1):1054.
18. Rodriguez F, Rhodes RE, Miller KF, Shah P. Examining the influence of anecdotal stories and the interplay of individual differences on reasoning. Thinking & Reasoning. 2016 Jul 2;22(3):274–96.
19. Dahlstrom MF. The narrative truth about scientific misinformation. PNAS [Internet]. 2021 Apr 13 [cited 2022 Jan 14];118(15). Available from: https://www.pnas.org/content/118/15/e1914085117
20. Biancovilli P, Makszin L, Csongor A. Breast cancer on social media: a quali-quantitative study on the credibility and content type of the most shared news stories. BMC Women’s Health. 2021 May 15;21(1):202.
21. Essential oil sellers believe they have a cure for cancer | Stuff.co.nz [Internet]. [cited 2022 Jan 14]. Available from: https://www.stuff.co.nz/national/health/97569498/essential-oil-sellers-believe-they-have-a-cure-for-cancer
22. In vivo and in vitro studies on the anticancer activity of Copaifera multijuga hayne and its fractions - PubMed [Internet]. [cited 2022 Jan 14]. Available from: https://pubmed.ncbi.nlm.nih.gov/14595585/
23. DoTerra Distributors’ Drug Claims Violate the Law | Truth In Advertising [Internet]. [cited 2022 Jan 14]. Available from: https://www.truthinadvertising.org/doterra-distributors-drug-claims-violate-the-law/
24. Facebook Struggles to Quell Uproar Over Instagram’s Effect on Teens - The New York Times [Internet]. [cited 2022 Jan 14]. Available from: https://www.nytimes.com/2021/10/01/technology/facebook-instagram-teenagers.html?action=click&module=RelatedLinks&pgtype=Article
25. Pausing “Instagram Kids” and Building Parental Supervision Tools [Internet]. [cited 2022 Jan 14]. Available from: https://about.instagram.com/blog/announcements/pausing-instagram-kids
26. U.S. lawmakers urge Facebook to drop plans for Instagram for kids | Reuters [Internet]. [cited 2022 Jan 14]. Available from: https://www.reuters.com/article/legal-facebook-instagram-children-idUSKCN2CZ1YE
27. The pursuit of wellness: Social media, body image and eating disorders - ScienceDirect [Internet]. [cited 2022 Jan 14]. Available from: https://www.sciencedirect.com/science/article/pii/S019074092032082X
28. Garrett B, Murphy S, Jamal S, MacPhee M, Reardon J, Cheung W, et al. Internet health scams—Developing a taxonomy and risk-of-deception assessment tool. Health & Social Care in the Community. 2019;27(1):226–40.
29. Gwyneth Paltrow’s Goop Agrees To Pay $145,000 To Settle False Advertising Lawsuit : NPR [Internet]. [cited 2022 Jan 14]. Available from: https://www.npr.org/2018/09/07/645665387/gwyneth-paltrows-goop-agrees-to-pay-145-000-to-settle-false-advertising-lawsuit
30. Bradley S. How much money Instagram influencers make [Internet]. Business Insider. [cited 2022 Jan 14]. Available from: https://www.businessinsider.com/how-much-money-instagram-influencers-earn-examples-2021-6