The internet is a source of many things, such as yummy recipes, tech deals and horrible misinformation. The latter often spreads through social media sites, which have to either combat it or, more often, choose to ignore it. Right now, YouTube is choosing to fight, announcing a new long-term policy plan to grapple with medical misinformation, especially about cancer.
YouTube’s new guidelines for health content will fall under three categories: prevention, treatment and denial misinformation. Under prevention, YouTube says it will review and remove videos that contradict guidance from trusted health authorities or that dispute vaccine safety and efficacy (the platform banned content with vaccine misinformation in 2021). Treatment centers on taking down misinformation about, unsurprisingly, treating medical conditions, including the promotion of unproven remedies. Denial, the platform says, focuses on removing content that denies established facts, such as claims that people didn’t die of COVID-19.
“To determine if a condition, treatment or substance is in scope of our medical misinformation policies, we’ll evaluate whether it’s associated with a high public health risk, publicly available guidance from health authorities around the world, and whether it’s generally prone to misinformation,” wrote Dr. Garth Graham, YouTube’s director and global head of healthcare and public health partnerships, and Matt Halprin, its VP and global head of trust and safety, in the joint release outlining the new policies.
Starting now, YouTube says it will remove videos specifically about cancer that violate any of these policies, an effort it says will ramp up in the coming weeks. For example, if a video claims that garlic cures cancer, it’s coming down. YouTube is also sharing a playlist of science-backed cancer-related videos and teaming up with the Mayo Clinic to create more informational videos about the disease.
These policies come less than two months after YouTube announced it would “stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US Presidential elections,” arguing that removing such content curtailed political speech. So misinformation is allowed when it threatens democracy, just not in every other category on the site. Cool. That said, YouTube does say it will allow some health videos containing falsehoods to remain when the context justifies it, such as public interest, and in some of those cases the content will stay up with an age restriction.