YouTube will start adding fact-check labels and re-ordering its ranking system for videos about health, with help from a nonprofit group, just days after President Joe Biden tore into social media for “killing people” by spreading falsehoods about vaccines.
YouTube, part of Alphabet Inc.’s Google, announced the changes on Monday morning. Beneath certain videos about health, the company will add information panels, like those currently found at the bottom of clips about popular conspiracy theory topics such as the moon landing. YouTube will also start displaying select videos more prominently on the site when people search for health terms, similar to how it now treats certain news topics.
For both the material in these panels and its ranking system, YouTube said it will rely on a recent set of guidelines from the National Academy of Medicine, a non-governmental organization, for verifying online health information.
“These principles for health sources are the first of its kind,” Garth Graham, director of health care for YouTube, wrote in a blog post. “We hope that other companies will also review and consider how these principles might help inform their work with their own products and platforms.”
Graham, a former health insurance executive, was hired by YouTube at the start of the year to lead a new effort to highlight and produce these videos.
Like social networks Facebook Inc. and Twitter Inc., YouTube has scrambled to better moderate its flood of user-generated media to deal with misinformation about Covid-19 and vaccines. The platform has removed thousands of videos for violating its misinformation rules since the pandemic began, drawing criticism, particularly from the political right, that it is too censorious.
Yet the Biden administration has gone on the offensive against technology companies in its push to get more Americans vaccinated, and the surgeon general has released a report on health misinformation. On Friday, Biden criticized social networks for their role in letting anti-vaccination material spread.
Democratic Senator Amy Klobuchar said Sunday that vaccine misinformation on social media adds urgency to her call to change liability standards for content published on those platforms.
“YouTube removes content in accordance with our Covid-19 misinformation policies,” the company said. “We also demote borderline videos and prominently surface authoritative content for Covid-19-related search results, recommendations, and context panels.”
The company said it will continue working with health organizations and other medical experts to prevent the spread of misinformation.
Twitter also said it would do its part to “elevate authoritative health information,” while Facebook expanded on its defense over the weekend with a blog post arguing that it cannot be blamed for the U.S. missing its vaccination target.
“At a time when Covid-19 cases are rising in America, the Biden administration has chosen to blame a handful of American social media companies,” Facebook’s Vice President of Integrity Guy Rosen said in the post. “Facebook is not the reason this goal was missed.”
YouTube has also argued that it doesn’t operate the same way as social networks, since it doesn’t have the same sort of viral sharing. But its algorithm does feed people a majority of the videos watched on the site.
The surgeon general’s report didn’t mention YouTube by name, but it cited a recent academic study that examined vaccine information on YouTube. Since late 2019, pro-vaccine videos have shown up higher in YouTube’s search rankings, according to the research. But once viewers watch an anti-vaccine video, YouTube’s system ends up exposing them to more of the same, the study found.
A YouTube spokesperson said the company has been working on these changes since February. They are coming to viewers in the U.S. first, and the company plans to expand them to more countries.