Wednesday 29 September 2021

YouTube bans vaccine misinformation videos

SAN FRANCISCO (KRON) -- Nearly a year after COVID-19 vaccines became available in the U.S., YouTube is expanding its crackdown on vaccine misinformation.

The video platform updated its guidelines on Wednesday, adding specifics on false claims about "currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO."

Here's what moderators are looking out for:

  • Content alleging that vaccines cause chronic side effects, outside of rare side effects that are recognized by health authorities
  • Content claiming that vaccines do not reduce transmission or contraction of disease 
  • Content misrepresenting the substances contained in vaccines

YouTube said it has already removed more than 130,000 videos since last year for violating its COVID-19 vaccine policies.

There are exceptions to the new guidelines. If a channel doesn't show a pattern of promoting vaccine hesitancy, YouTube will allow it to upload personal testimonials about the vaccine. The platform will also allow content about vaccine policies, new vaccine trials, and historical vaccine successes or failures.

COVID-19 vaccine misinformation has been enough of a concern to health officials that the U.S. Surgeon General issued an advisory in July to warn Americans about the threat.

"Health misinformation is an urgent threat to public health. It can cause confusion, sow mistrust, and undermine public health efforts, including our ongoing work to end the COVID-19 pandemic," said U.S. Surgeon General Dr. Vivek Murthy.

The surgeon general cited a study conducted in the UK and the U.S. that found "scientific-sounding misinformation is more strongly associated with declines in vaccination intent."

In the advisory, Murthy called for social media and tech companies to address misinformation on their platforms.

Facebook and Twitter have also taken steps to do so.

An analysis by the Center for Countering Digital Hate (CCDH) found that just 12 people accounted for up to 73% of anti-vaccine content on Facebook platforms, when looking at posts between Feb. 1, 2021 and March 16, 2021.

Facebook responded to the analysis last month, saying it had removed over three dozen Pages, groups, and Facebook or Instagram accounts linked to those 12 people. The company also said it has removed more than 20 million pieces of content for spreading COVID-19 and vaccine misinformation since the beginning of the pandemic.

The 'digital dozen,' as the CCDH called it, also accounted for up to 17% of harmful posts on Twitter.

Since December 2020, Twitter has shared its approach to removing COVID-19 vaccine misinformation, which includes removing tweets, placing warning labels on tweets that "advance unsubstantiated rumors, disputed claims, as well as incomplete or out-of-context information about vaccines," and allowing users to report specific tweets for misinformation.



from KRON4 https://ift.tt/2Y2sxgB

