
Health misinformation creates ‘whack-a-mole’ situation for tech platforms

World Health Organization analysis found misinformation in up to 51 per cent of social media posts
This March 18, 2010, file photo shows the YouTube website in Los Angeles. THE CANADIAN PRESS/AP/Richard Vogel

When Dr. Garth Graham thinks about health misinformation on social media platforms, he envisions a garden.

No matter how bountiful or verdant that garden is, even the head of YouTube’s global health division admits it’s often in need of tending.

“How do you weed and pull out the bad information?” he asked.

“But also … how do you plant the seeds and make sure people have access to good information as well as high quality information?”

For social media companies, these have become perennial questions that have only grown in importance as the number of platforms multiplied and people began spending increasing amounts of time online.

Now, it’s not uncommon to spot misinformation with almost every scroll.

A 2022 paper published in the Bulletin of the World Health Organization reviewed 31 studies examining how prevalent misinformation is. The analysis found misinformation in up to 51 per cent of social media posts associated with vaccines, up to 28.8 per cent of content associated with COVID-19, and up to 60 per cent of posts related to pandemics.

An estimated 20 to 30 per cent of YouTube videos about emerging infectious diseases were also found to contain inaccurate or misleading information.

The consequences can be harmful, if not deadly.

Research the Council of Canadian Academies released in 2023 said COVID-19 misinformation alone contributed to more than 2,800 Canadian deaths and at least $300 million in hospital and ICU costs.

Platforms take the risks seriously, Graham said in an interview. “We are always concerned about anything that may produce harm.”

That concern often leads platforms to remove anything violating their content policies.

YouTube, for example, has banned content denying the existence of some medical conditions or contradicting health authority guidance on prevention and treatment.

Examples embedded in its medical misinformation policy show the company removes posts promoting turpentine, gasoline and kerosene as a treatment for certain conditions because these substances cause death. Ivermectin, used to treat parasitic worms in animals and humans, and hydroxychloroquine, a malaria drug, are also barred from being promoted as COVID-19 cures.

When it comes to vaccines, YouTube bans videos alleging immunizations cause cancer or paralysis.

Facebook and Instagram parent company Meta Platforms Inc. declined to comment for this story and TikTok did not respond to a request for comment, but in broad strokes, these companies have policies similar to YouTube’s.

Yet Timothy Caulfield, a University of Alberta professor focused on health law and policy, still spots medical misinformation on platforms. He recently asked his students to search for stem cell content and several posts spreading unproven therapies came up easily.

Still, he sympathizes with some of the challenges tech companies face because he sees conquering health misinformation as a game of “whack-a-mole.”

He says there害羞草研究所檚 a nimbleness to spreaders of misinformation, who are often motivated to keep finding ways to circumvent removal policies because their posts can boost profits and brands or spread an ideology.

“They can work around the moderation strategies, but that just shows how we’re not going to fix this with one tool,” Caulfield said.

“This is going to be an ongoing battle.”

In its misinformation policy posted on its website, Meta acknowledges the difficulties, saying “what is true one minute may not be true the next minute.”

“People also have different levels of information about the world around them and may believe something is true when it is not,” the policy says.

In an attempt to keep up, Meta relies on independent experts to assess how true content is and whether it is likely to directly contribute to imminent harm before it is removed. Third-party fact-checking organizations are also contracted to review and rate the accuracy of its most viral content.

At YouTube, human reviewers, including an “intelligence desk” that monitors posts and news to detect trends that might need to be mitigated, work alongside machine learning programs, which the company says are well suited to detecting patterns in misinformation.

Some responsibility also falls on credible health-care practitioners and institutions, whose content platforms highlight and recommend to make it easier for users to find trustworthy information.

YouTube, for example, has partnered with organizations including the University Health Network and the Centre for Addiction and Mental Health in Toronto.

CAMH runs a YouTube channel where medical professionals explain everything from schizophrenia to eating disorders. Production funding came from YouTube, but the institution害羞草研究所檚 resources were used for script writing and clinical review, CAMH spokeswoman Hayley Clark said in an email.

Graham sees it as a good example of the health-care profession “meeting people where they are,” which he said is “how we battle misinformation.”

“(Credible information) has to be in the palm of people’s hands so that they can have dinner conversations, so when they’re sitting down in their couch that they’re empowered,” he said.

But when it comes to other organizations and doctors, “we can’t assume that all of them have the capacity to do this,” said Heidi Tworek, an associate professor at the University of British Columbia, whose research focuses on the effects of new media technologies.

These organizations want to get credible information out, but in a cash-strapped and time-pressed health-care industry, there’s always another patient to help.

“Some health-care institutions would say, ‘OK, we’ve got X amount of money, we’ve got to choose what we spend it on. Maybe we want to spend it on something other than communications,’” Tworek said.

In some instances, doctors are also “doing it off the side of their desk … because they think it is valuable,” but that subjects them to new risks like online attacks and sometimes even death threats.

“Some people don’t want to enter those spaces at all because they see what happens to others,” she said.

To better combat medical misinformation, she would like platforms to act more responsibly because she often notices their algorithms push problematic content to the top of social media timelines.

However, she and Caulfield agree health misinformation needs an all-hands-on-deck approach.

“The platforms bear a lot of responsibility. They’re becoming like utilities and we know the impact that they have on public discourse, on polarization,” Caulfield said.

“But we also need to teach critical thinking skills.”

That could begin at school, where students could learn how to identify credible sources and detect when something could be incorrect, lessons he’s heard Finland begins teaching in kindergarten.

No matter when or how that education takes place, he said the bottom line is “we need to give citizens the tools to discern what’s misinformation.”
